I've gathered here with a few hundred other crazy people to talk about software, and I've attached the notes I took from the first day. I also have more in-depth notes for a number of the talks in a less "curated" form.
Transducers
Rich Hickey does an absolutely amazing job taking what is otherwise a very complex concept and reducing it to a very simple one. Map, filter, and reduce are all really special cases of fold, but why would we restrict ourselves to operating only on sequences? An operation like "double this" really should be applicable to all sorts of data structures. Transducers are what you get when you actually separate out the implicit data structure from the work that you're doing, leaving incredibly clear functions. Now you can use a transducer with a channel, a lazy sequence, or nearly anything your heart desires. Highly Recommended Talk.
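Hickey's talk is in Clojure, but the core idea translates. Here's a minimal Python sketch (my own illustration, not code from the talk): a transducer transforms one reducing function into another, so "keep evens, then double" is defined once and reused against any reducing context.

```python
from functools import reduce

# A transducer transforms one step (reducing) function into another,
# independent of the data structure being reduced.
def mapping(f):
    def transducer(step):
        return lambda acc, x: step(acc, f(x))
    return transducer

def filtering(pred):
    def transducer(step):
        return lambda acc, x: step(acc, x) if pred(x) else acc
    return transducer

def append(acc, x):
    acc.append(x)
    return acc

# Compose "keep evens, then double" once...
xform = lambda step: filtering(lambda n: n % 2 == 0)(mapping(lambda n: n * 2)(step))

# ...then reuse the same transform against completely different contexts.
as_list = reduce(xform(append), range(10), [])            # build a list
as_sum = reduce(xform(lambda a, x: a + x), range(10), 0)  # sum instead
```

The transform itself never mentions lists, sums, or channels — that's the separation the talk is about.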
Jepsen II: Electric Boogaloo
Kyle Kingsbury is definitely my favorite presenter at Strangeloop.
Linearizability is the guarantee that after your write completes, everyone else will see it in their reads. A slightly weaker form is sequential consistency, which imposes an order that may not actually be temporal. Everyone knows that you can only choose two of consistency, availability, and partition tolerance; however, most people don't realize that you have to pick partition tolerance, and that availability doesn't mean "you can return errors to maintain availability," lest you be made fun of publicly at conferences.
Dynamic Programming at Ease
Stefanie Schirmer gives one of my favorite, but least accessible, talks at Strangeloop. She recasts dynamic programming as an algebra of a scoring function and a choice function. The questions you can then ask in this space are really all just products in that algebra.
Perhaps more impressively, this has been turned into a framework that lets you write only the product piece. This is one of the talks I am definitely going to rewatch, because I think it will have even more to offer on a second viewing.
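The framework she describes works in Haskell, but the separation is easy to sketch in Python (my own illustration): the DP recurrence stays fixed, while the scoring and choice functions — the "algebra" — get plugged in. Plugging in unit costs and `min` yields Levenshtein distance.

```python
def edit_dp(a, b, score_sub, score_gap, choose):
    """Generic edit-distance-style DP. The table-filling recurrence is
    fixed; the algebra (scoring functions + choice function) is a
    parameter."""
    m, n = len(a), len(b)
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        table[i][0] = table[i - 1][0] + score_gap(a[i - 1])
    for j in range(1, n + 1):
        table[0][j] = table[0][j - 1] + score_gap(b[j - 1])
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            table[i][j] = choose(
                table[i - 1][j - 1] + score_sub(a[i - 1], b[j - 1]),
                table[i - 1][j] + score_gap(a[i - 1]),
                table[i][j - 1] + score_gap(b[j - 1]),
            )
    return table[m][n]

# Levenshtein distance: unit substitution/gap costs, choose the minimum.
dist = edit_dp("kitten", "sitting",
               score_sub=lambda x, y: 0 if x == y else 1,
               score_gap=lambda _: 1,
               choose=min)
```

Swapping in a different algebra (say, similarity scores and `max`) answers a different question with the same recurrence.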
Elixir and the Internet of Things
Doug Roher explains why he chose Elixir, a language that runs on the Erlang VM, to build a prototype for a client that handles thermostats. A pretty good introduction to the rationale for Erlang and the pitfalls that come with it; I think it's worth a watch.
Testing Distributed Systems w/ Deterministic Simulation
Will Wilson, a FoundationDB engineer, talked about their approach to testing distributed systems. They've built a full-fledged simulator of their network, file interactions, and every other source of nondeterminism in their codebase. They've also built a small cluster of physical hardware with network-controlled power supplies for both the network switches and the machines, which they use to cause frequent, terrible failures.
They also wrote a C++ framework for actors that uses a preprocessor and code generation. They really, really wanted to use C++ apparently...
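The payoff of routing all nondeterminism through the simulator is reproducibility. A toy Python sketch of the idea (mine, not FoundationDB's): every "random" event comes from one seeded RNG, so any failing run can be replayed exactly from its seed.

```python
import random

def simulate(seed, n_messages=20, drop_rate=0.3):
    """Toy deterministic network simulation: all 'nondeterminism'
    (packet drops, latency) flows from a single seeded RNG, so a
    failing scenario is replayable from its seed alone."""
    rng = random.Random(seed)
    delivered = []
    for msg in range(n_messages):
        if rng.random() < drop_rate:
            continue  # simulated packet loss
        delay = rng.randint(1, 10)  # simulated latency ticks
        delivered.append((msg, delay))
    return delivered

# The same seed always reproduces the same trace of failures.
assert simulate(42) == simulate(42)
```

In a real system the hard part is pulling the clock, disk, and network behind this kind of seam — which is exactly what they built.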
These people are sort of crazy.
The Mess We're In
Joe Armstrong, creator of Erlang, goes on a vaguely philosophical rant about how it might be possible to remove the complexity of all this duplication of files, and laughs about how the billions of lines of legacy code we've written guarantee us nearly eternal employment. He ends by discussing an interesting idea for a universally addressable content system.
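The kernel of a content-addressable system is small enough to sketch (my illustration of the general idea, not Armstrong's design): a blob's address is the hash of its bytes, so identical content everywhere collapses to one entry.

```python
import hashlib

class ContentStore:
    """Toy content-addressable store: a blob's address is the SHA-256
    hash of its bytes, so identical content is stored exactly once."""
    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        address = hashlib.sha256(data).hexdigest()
        self._blobs[address] = data
        return address

    def get(self, address: str) -> bytes:
        return self._blobs[address]

store = ContentStore()
addr = store.put(b"hello world")
# Duplicate content maps to the same address -- no second copy.
assert store.put(b"hello world") == addr
```

This is the same trick git and IPFS are built on: the file duplication Armstrong laments disappears by construction.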
Onyx: Distributed Workflows for Dynamic Systems
Many jobs can be described as "take a bunch of data from some places, transform it, and shove it somewhere." Mine is, and there's a pretty solid chance yours is too. Michael Drogalis made a new framework for processing data that tries to make batching and streaming more interoperable, adds transactional semantics, and takes a more language-agnostic approach.
He represents data and transforms as a directed acyclic graph, which is a very clear way of representing that process. This seems like a very interesting project, and something to keep an eye on.
Shenandoah, a pause free GC (almost)
Christine Flood, a Red Hat researcher, talks through the recent work her group has done to make a nearly pause-free GC. This GC trades off total throughput in exchange for avoiding full stops by giving each object a "forwarding pointer" that redirects accesses from the object's old location in memory to its new one. The goal is to eventually move to a totally pause-free GC somewhat soon.
The tradeoff is that there are read and write barriers (perhaps someone else can explain this better), and most objects carry an extra word of memory for the forwarding pointer.
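A rough sketch of the forwarding-pointer mechanism (my simplification in Python — the real thing lives inside the JVM): each object carries one extra word pointing at its current copy, and every access goes through that pointer, so the collector can move objects while the program keeps running.

```python
class Obj:
    """Each object carries one extra word: a forwarding pointer that
    initially points back at the object itself."""
    def __init__(self, value):
        self.value = value
        self.forward = self  # the extra word per object

def read_barrier(obj):
    # Every access is indirected through the forwarding pointer, so the
    # mutator always sees the current copy, even mid-evacuation.
    return obj.forward

def evacuate(obj):
    # The concurrent collector copies the object to a new region and
    # redirects the old copy's forwarding pointer at the new one.
    new_copy = Obj(read_barrier(obj).value)
    obj.forward = new_copy
    return new_copy

a = Obj(42)
evacuate(a)
read_barrier(a).value = 99  # writes land on the new copy via the barrier
```

The cost of that indirection on every access is exactly the barrier overhead mentioned above.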
* No video posted yet *
Vulpes: A Functional Deep Belief Net
Rob Lyndon talks through some early neural network research and then delves into belief networks. Prior to the 2006 work, neural networks were more or less a dark art; belief networks pick out an optimal topology for backpropagation training. Rob then goes into how to do this kind of programming functionally on the GPU, and the benefits over using the C API. He then shows some actual examples of the code used to implement these against the GPUs.
It would have been great to see more GPU programming examples than we actually saw.
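For context on the backpropagation that belief networks feed into, here's the basic gradient step at its smallest (a pure-Python sketch of my own, nothing from the talk): a single sigmoid unit trained by gradient descent, one layer deep.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_neuron(samples, epochs=5000, lr=0.5, seed=0):
    """Train one sigmoid unit on squared error by gradient descent --
    the elementary step that backpropagation chains through layers."""
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    b = rng.uniform(-1, 1)
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = sigmoid(w[0] * x0 + w[1] * x1 + b)
            # d(error)/d(pre-activation), through the sigmoid
            grad = (out - target) * out * (1 - out)
            w[0] -= lr * grad * x0
            w[1] -= lr * grad * x1
            b -= lr * grad
    return lambda x0, x1: sigmoid(w[0] * x0 + w[1] * x1 + b)

# Learn the AND gate, a linearly separable toy problem.
and_gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
predict = train_neuron(and_gate)
```

Stacking these units and pushing the gradient back through each layer is backpropagation; the 2006 belief-network work gave it a good starting topology, and the GPU work in the talk makes the arithmetic fast.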
Fun talk Bonus
An interesting project building out dubstep music using the browser's HTML audio APIs. A hilarious talk that's actually pretty interactive; I rather enjoyed it.
As a side note, he also used React for his presentation, and it seemed to work quite nicely for him and some of the animations he built.