Four short links: 28 September 2017
Deep Learning, Knowledge Base, Algorithm Transparency, and Formal Methods
- New Theory Cracks Open the Black Box of Deep Learning (Quanta) — The talk (on YouTube) and the paper (on arXiv) are interesting, but the article itself has lots of layman-accessible morsels. For instance: Tishby and Shwartz-Ziv also made the intriguing discovery that deep learning proceeds in two phases: a short “fitting” phase, during which the network learns to label its training data, and a much longer “compression” phase, during which it becomes good at generalization, as measured by its performance at labeling new test data. (A rough sketch of the mutual-information measurement behind that two-phase picture follows the list.)
- YAGO — YAGO is a large semantic knowledge base, derived from Wikipedia, WordNet, Wikidata, GeoNames, and other data sources. (A sample query sketch follows the list.)
- ProPublica Seeks Source Code for New York City’s Disputed DNA Software — good to see more places legally testing opaque algorithms.
- New Ways of Coding (The Atlantic) — from testing to formal methods, a readable and accurate survey of discontent with modern software development. From the article: The real problem in getting people to use TLA+, he said, was convincing them it wouldn’t be a waste of their time. Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil. And yet, they’re useful. (A toy illustration of what a model checker buys you follows the list.)
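
Following up on the first item: the two-phase claim rests on tracking the mutual information between a hidden layer T, the inputs X, and the labels Y across training. Below is a minimal sketch, not the authors' code, of the binning-style estimate used for that kind of plot; the function names, bin count, and usage pattern are my own choices.

```python
# Rough sketch of a binning-based mutual information estimate for a hidden
# layer: discretize the layer's activations, then measure I(T;Y) per epoch.
import numpy as np

def discretize(activations, n_bins=30):
    """Map continuous activations (n_samples, n_units) to discrete symbols."""
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    binned = np.digitize(activations, edges[1:-1])          # per-unit bin index
    # Treat each row (one sample's binned activation vector) as one symbol.
    _, symbols = np.unique(binned, axis=0, return_inverse=True)
    return symbols

def entropy(symbols):
    """Shannon entropy (bits) of a discrete variable given as integer symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_information(t_symbols, labels):
    """I(T;Y) = H(T) - H(T|Y), with T and Y both discrete."""
    h_t = entropy(t_symbols)
    h_t_given_y = 0.0
    for y in np.unique(labels):
        mask = labels == y
        h_t_given_y += mask.mean() * entropy(t_symbols[mask])
    return h_t - h_t_given_y

# Usage idea: after each epoch, record a layer's activations on a fixed batch,
# then plot I(T;Y) (rises during "fitting") against I(X;T) (falls during
# "compression"; with every input unique, I(X;T) reduces to H(T)).
```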
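For the YAGO item: the knowledge base is published as RDF, so the natural way to poke at it is SPARQL. Here is a minimal sketch using the SPARQLWrapper library; the endpoint URL and the label-matching query are assumptions, so check the YAGO project page for the current endpoint and vocabulary.

```python
# Minimal sketch of querying a YAGO SPARQL endpoint with SPARQLWrapper.
# The endpoint URL below is a placeholder assumption, not a verified address.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://yago-knowledge.org/sparql/query"   # placeholder endpoint

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?s ?label WHERE {
        ?s rdfs:label ?label .
        FILTER (lang(?label) = "en" && CONTAINS(?label, "Ada Lovelace"))
    } LIMIT 10
""")

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["s"]["value"], "-", row["label"]["value"])
```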
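For the formal-methods item: model checkers like the one behind TLA+ work by exhaustively enumerating every reachable state of a specification and checking invariants against all of them. The toy below is plain Python, not TLA+, but it does the same brute-force exploration for a deliberately naive two-process lock and finds the mutual-exclusion violation that ordinary testing could easily miss.

```python
# Toy illustration of what a model checker does: enumerate every reachable
# state of a tiny protocol and check an invariant. The protocol is a naive
# two-process lock: each process checks the other's flag, then sets its own,
# which is a classic check-then-act race.

# State: (pc0, flag0, pc1, flag1); pc in {"idle", "checked", "critical"}
INITIAL = ("idle", False, "idle", False)

def steps(state):
    """Yield all states reachable in one step by either process."""
    pc = [state[0], state[2]]
    flag = [state[1], state[3]]
    for i in (0, 1):
        other = 1 - i
        p, f = list(pc), list(flag)
        if pc[i] == "idle" and not flag[other]:   # saw the lock free
            p[i] = "checked"
        elif pc[i] == "checked":                  # grab the lock
            f[i] = True
            p[i] = "critical"
        elif pc[i] == "critical":                 # release and go idle
            f[i] = False
            p[i] = "idle"
        else:
            continue                              # blocked: lock looks taken
        yield (p[0], f[0], p[1], f[1])

def check(invariant):
    """Breadth-first exploration of all reachable states; return a violation."""
    seen, frontier = {INITIAL}, [INITIAL]
    while frontier:
        nxt = []
        for s in frontier:
            if not invariant(s):
                return s
            for t in steps(s):
                if t not in seen:
                    seen.add(t)
                    nxt.append(t)
        frontier = nxt
    return None

mutual_exclusion = lambda s: not (s[0] == "critical" and s[2] == "critical")
print("violation:", check(mutual_exclusion))   # finds both processes critical
```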