1 Problems/Solutions

I tried to improve the 1D time maps so that I can use them for visualizing git commits in these reviews. Overall, I still don't have a general solution for binning, but here is what I am doing:

Sticking with the idea of keeping things mostly qualitative, I go with time diffs (here the difference in seconds between consecutive commits) and take natural logs of them. I then bin the values directly by flooring them (which is not the best way), which gives uniform bins. The useful part is the set of reference markers for time points like 10 SEC, which makes the plot work out okay for now.
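A minimal sketch of the scheme above, with hypothetical names (`log_bin`, `REFERENCE_MARKERS` are mine, not from any library): log the diff, floor it into an integer bin, and anchor the axis with markers at familiar durations.

```python
import math

def log_bin(diff_seconds):
    """Map a positive time diff (in seconds) between consecutive
    commits to an integer bin on the natural-log scale by flooring."""
    return math.floor(math.log(diff_seconds))

# Reference markers for time points like 10 SEC; these anchor the
# otherwise unitless log-scale bins at familiar durations.
REFERENCE_MARKERS = {
    "10 SEC": log_bin(10),       # ln(10) ~ 2.30 -> bin 2
    "1 MIN": log_bin(60),        # ln(60) ~ 4.09 -> bin 4
    "1 HR": log_bin(3600),       # ln(3600) ~ 8.19 -> bin 8
    "1 DAY": log_bin(86400),     # ln(86400) ~ 11.37 -> bin 11
}
```

Since flooring is uniform on the log scale, each bin spans an e-fold of elapsed time, which is why the bins feel coarse but qualitative.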

With the decision to keep area as the chart primitive, I can apply things like gradients to mark more intense periods (frequent commits) and also convey a general sense of volume.

2 Readings/Explorations

  • Machine Learning: The High-Interest Credit Card of Technical Debt (sculley2014machine).
  • Harnessing Non-Linearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication (jaeger2004harnessing).
  • Continuing on CCGs, read Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars (zettlemoyer2012learning).
  • Started using parenscript. First impression: it feels a little clumsy compared to plain JS, but I know nice things will come if I stick with it and start writing macros and other stuff.

3 Programming

Starting this week, I will have a plot showing my commit history for the last 5 weeks (latest at the bottom). The idea is to emphasize intensity, which covers both the commit counts and the frequency. Darker regions along the x-axis represent more frequent events (i.e. I am committing every few seconds or so), and the height represents the number of events. It is obviously not very readable without context (you can try reading this old post), but fixing that is low priority for me.
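The two ingredients of the plot can be sketched like so, assuming commit timestamps come in as ascending Unix epochs (e.g. from `git log --format=%ct --reverse`); `weekly_intensity` and the bin width are my own illustrative choices, not the actual plotting code.

```python
from collections import Counter

def weekly_intensity(timestamps, bin_seconds=3600):
    """Given ascending Unix-epoch commit timestamps, return
    (counts, gaps): counts per fixed-width time bin drive the area
    height, and short gaps between consecutive commits mark the
    darker, more frequent regions along the x-axis."""
    counts = Counter(ts // bin_seconds for ts in timestamps)
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return counts, gaps
```

For example, three commits within one hour and one two hours later give a tall bin with small gaps followed by a short, sparse one.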

Commits for week 42-2018 and 4 previous weeks.


Overall, I don't think I created anything major outside of work this week.

4 Writing

  • Shepherded a few old notes from a couple of books; these will go into the journal sometime soon.

5 Media

Bibliography

  • [sculley2014machine] Sculley, Phillips, Ebner, Chaudhary & Young. 2014. "Machine learning: The high-interest credit card of technical debt." link. doi.
  • [jaeger2004harnessing] Jaeger & Haas. 2004. "Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication." Science, 304(5667), 78-80. link. doi.
  • [zettlemoyer2012learning] Zettlemoyer & Collins. 2012. "Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars." arXiv preprint arXiv:1207.1420. link. doi.
  • [spector2012google] Spector, Norvig & Petrov. 2012. "Google's hybrid approach to research." Communications of the ACM, 55(7), 34-37. link. doi.