- Networks and history (bearman2002networks)
- Parallelizing wfst speech decoders (mendis2016parallelizing)
- BBDB on EIEIO – An Introduction to Object-Oriented Emacs Lisp. I had assumed EIEIO was a very low-powered CLOS, but I was wrong about its feature set.
- 1000x Faster Data Augmentation and Population Based Training of Neural Networks. Everything works, man.
- Knowledge is a stone-age concept, we're better off without it. This is really interesting, even though I feel the author favors true positives without really considering whether that matters. The interesting part is that this provides a lot of food for thought on how we think about the aims of machine intelligence in general.
- How TV Ratings Work. I came to know about the pervasiveness of Nielsen while reading The Cultural Logic of Computation, but here were a few more pieces that I didn't know about, especially sweeps weeks and the audio fingerprinting tech used for spying on shows.
- Why cultural heritage benefits the rich and powerful above all
- [bearman2002networks] Bearman, Moody & Faris. 2002. "Networks and history." Complexity, 8(1), 61-71. link. doi.
- [mendis2016parallelizing] Mendis, Droppo, Maleki, Musuvathi, Mytkowicz & Zweig. 2016. "Parallelizing wfst speech decoders." In: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 5325-5329.