Having finished The Revolt of the Public and the Crisis of Authority in the New Millennium by Martin Gurri at the end of last week, I started reading Life 3.0: Being Human in the Age of Artificial Intelligence by Max Tegmark. Both are recommended reads, and both question what our future society will look like, politically and digitally.
I travelled to Edinburgh for work on Monday. The trip down on an LNER service was so much more pleasant than last week's ScotRail journeys, let down only by flaky wifi.
Looking back through my tweets for the week, they seem to be split between the utterly negative (about Brexit, as usual) and the positive (e.g. Data Fest 2019).
On Tuesday, I hosted our eleventh Aberdeen Data Meetup. We had two excellent presentations, from Adam Sroka and Nick Radcliffe. Having such great speakers means the audience is growing to the point that we are now turning people away each month. We also need to think about varying the format occasionally, perhaps with some practical sessions.
I’ve been championing the Data Fest fringe to my various networks, and especially the opportunity to host Fringe events in Aberdeen. It looks like we will have 9 or 10, which is quite an improvement on last year’s two!
One of our events will be a workshop run as a collaboration between Code The City and Wikimedia UK. Speaking of which, this weekend I spotted something on Twitter that executes a complex SPARQL query on Wikidata. This is something I would like to be able to do at this level!
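For anyone curious what querying Wikidata involves, here is a minimal Python sketch of my own (the query and all names are illustrative assumptions, not the one from the tweet) that builds a request URL for the Wikidata Query Service SPARQL endpoint:

```python
import urllib.parse

WDQS = "https://query.wikidata.org/sparql"

# Example query: ten cities in the United Kingdom, with English labels.
# wdt:P31 = instance of, wd:Q515 = city, wdt:P17 = country, wd:Q145 = UK.
QUERY = """SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q515 .
  ?item wdt:P17 wd:Q145 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
} LIMIT 10"""

def build_request_url(query: str) -> str:
    """Return a GET URL for the Wikidata Query Service, asking for JSON results."""
    return WDQS + "?" + urllib.parse.urlencode({"query": query, "format": "json"})
```

The resulting URL can be fetched with `urllib.request.urlopen` and the JSON parsed with `json.load`; complex queries like the one in the tweet are built from the same blocks, just with more triple patterns.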
Our Air Quality Hackathon, which takes place on 16–17 February, looks like it will be very busy: almost all tickets are now gone!
I continued to work on the data gathering for the IoT in Schools side project. Having struggled with slow upload speeds to Google Sheets (GS), we experimented with Adafruit plus either Zapier or IFTTT to get the data there. Finally, I refactored my Python code to write directly to GS. I also created a new routine to check if we have a backup file of un-uploaded data and, if so, to flush that to GS. I then played with some rudimentary data viz using some test data.
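The backup-flush routine might look something like this minimal sketch (the file name, function names, and CSV format are my assumptions, not the actual project code; the uploader is stubbed rather than calling the Google Sheets API):

```python
import csv
import os
from typing import Callable, List

BACKUP_FILE = "unuploaded.csv"  # assumed name for the local backup of un-uploaded readings

def flush_backup(path: str, upload_rows: Callable[[List[List[str]]], None]) -> int:
    """If a backup file of un-uploaded rows exists, push its rows via
    upload_rows and then delete the file. Returns the number of rows flushed."""
    if not os.path.exists(path):
        return 0
    with open(path, newline="") as f:
        rows = [row for row in csv.reader(f) if row]
    if rows:
        upload_rows(rows)  # raises on failure, so the backup file is kept
    os.remove(path)        # only reached after a successful upload
    return len(rows)
```

In the real project, `upload_rows` would presumably wrap the GS append call (e.g. something like gspread's `worksheet.append_rows`), so a failed upload leaves the backup file in place for the next attempt.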