Last night, I attended my first Ottawa Python Authors Meetup. I had wanted to attend for a long time. (Mr. Releng also works with Python, so every time there's a meetup we discuss who gets to go and who stays home to take care of little Releng. It depends on whose work interests the talk is more relevant to.)
The venue was across the street from Confederation Park aka land of Pokemon.
I really enjoyed it. The people I chatted with were very friendly and welcoming. Of course, I ran into some people I used to work with, as seems to happen at any tech event in Ottawa. It was nice to catch up!
The venue had the Canada Council for the Arts as a tenant, thus the quintessentially Canadian art.
The speaker that night was Emily Daniels, a developer from Halogen Software, who spoke on artificial intelligence with Python (slides here, github repo here). She mentioned that she writes Java during the day but works on fun projects in Python at night. She started the talk by going through some examples of artificial intelligence on the web. Perhaps the most interesting one I found was a recurrent neural network called Benjamin, which generates movie script ideas and was trained on existing sci-fi movies and movie scripts. A short film called Sunspring was also made from one of the generated scripts. The dialogue is kind of stilted, but it is an interesting concept.
After the examples, Emily moved on to how it all works. Deep learning is a type of machine learning that derives meaning from data using a hierarchy of multiple layers that mimics the neural networks of our brain.
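That "hierarchy of multiple layers" idea can be sketched in a few lines of NumPy: each layer transforms its input and hands the result up to the next, so later layers can build on what earlier ones computed. The sizes and weights below are illustrative toys, not anything from the talk.

```python
import numpy as np

def layer(x, w, b):
    """One dense layer: a linear transform followed by a tanh non-linearity."""
    return np.tanh(x @ w + b)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 8))                      # a toy input vector
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)  # first-layer parameters
w2, b2 = rng.normal(size=(16, 4)), np.zeros(4)   # second-layer parameters

hidden = layer(x, w1, b1)    # the first layer of the hierarchy
output = layer(hidden, w2, b2)  # the second layer builds on the first
print(output.shape)          # (1, 4)
```

A real deep network stacks many such layers and learns the weights from data rather than drawing them at random, but the layered structure is the same.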
She then spoke about a project she wrote to create generative poetry from an RNN (recurrent neural network). It was based on an RNN tutorial that she heavily refactored to meet her needs. She went through the code she developed to generate artificial prose from the works of H.G. Wells and Jane Austen. She talked about how she cleaned up the text to remove end-of-line delimiters, page breaks, chapter numbers, and so on. It then took a week to train the network on the data.
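The cleanup step she described might look something like the sketch below. The exact rules are my assumptions, not her code: strip chapter headings, drop form-feed page breaks, and join single line breaks inside paragraphs so the training text flows naturally.

```python
import re

def clean_text(raw: str) -> str:
    """Roughly normalize a Project Gutenberg-style text for RNN training."""
    text = raw.replace("\f", " ")                                # page breaks
    text = re.sub(r"(?m)^CHAPTER [IVXLC\d]+\.?\s*$", "", text)   # chapter numbers
    text = re.sub(r"(?<!\n)\n(?!\n)", " ", text)                 # join single EOLs
    text = re.sub(r"[ \t]+", " ", text)                          # squeeze whitespace
    return text.strip()

sample = "CHAPTER I.\nIt is a truth universally\nacknowledged...\n\nNext paragraph."
print(clean_text(sample))
```

Blank lines (paragraph breaks) survive, while the hard wraps within a paragraph are collapsed into spaces.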
She then talked about another example which used data from Jack Kerouac and Virginia Woolf novels; she posts some of the results to Twitter. She also created a Twitter account which posts text generated by an RNN trained on the works of Walt Whitman and Emily Dickinson. (I should mention at this point that she chose these authors for her projects because the copyrights on their works have expired and the texts are available from Project Gutenberg.)
After the talk, she fielded a number of insightful audience questions. There were discussions about the inherent bias in the data, because it was written by humans who can be sexist and racist. She mentioned that she doesn't post the model's results to Twitter automatically because some of them are really inappropriate: the model learned from text written by humans, and humans are inherently biased.
One thing I found really interesting is that Emily mentioned she felt a need to ensure that the algorithms and data continue to exist and are faithfully backed up. I began to think about all the Amazon instances that Mozilla releng had automatically killed that day as our capacity peaked and declined, and of the great joy I feel ripping out code when we deprecate a platform. I personally feel no emotional attachment to bringing down machines or deleting unused code.
Perhaps the sense that these recurrent neural networks and the data they create need a caretaker comes from the fact that the algorithms output text that is a simulacrum of the work of an author we enjoy reading. And perhaps that is why we aren't as attached to an ephemeral pool of build machines as we are to our phones: the phone provides a sense of human connection to the larger world when we may be sitting alone.
Thank you to Emily for the very interesting talk, to the Ottawa Python Authors Group for organizing the meetup, and to Shopify for sponsoring the venue. Looking forward to the next one!