Paper out today by Etay Hay (now PI @CAMHnews) examining how the geometric features of touched objects are encoded across the population of peripheral tactile neurons, and what biologically plausible synaptic mechanisms could read this information out in the CNS. https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1008303
Long story short: We previously showed that first-order tactile neurons have spatially complex receptive fields, which lets them do cool things like signal the geometric features of the things you touch -- PNS FTW. https://www.nature.com/articles/nn.3804
This peripheral computational power is great, but it makes things unruly: it upends intuitions about how the population of these neurons works together, how the brain might realistically read this information out, and what it's all good for.
This paper gives a theoretical account of some possibilities along these lines, focused on coincidence detection across neurons at various timescales. Of course, whether the brain actually does these things is another story we need/want/will tackle.
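For a rough sense of what "coincidence detection across neurons at various timescales" could mean, here's a minimal toy sketch (not the model from the paper; the spike times, window sizes, and threshold are made up for illustration): a downstream unit fires whenever enough afferents spike together within some coincidence window, and changing that window changes what population structure it picks out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spike trains (in seconds) from a handful of first-order afferents.
# Purely illustrative, not real data.
spike_trains = [np.sort(rng.uniform(0.0, 1.0, size=20)) for _ in range(5)]

def coincidence_readout(spike_trains, window=0.005, min_inputs=3,
                        dt=0.001, duration=1.0):
    """Toy downstream 'neuron': emits a spike at time t whenever at least
    min_inputs afferents fire within the same coincidence window [t, t+window)."""
    times = np.arange(0.0, duration, dt)
    output_spikes = []
    for t in times:
        active = sum(np.any((s >= t) & (s < t + window)) for s in spike_trains)
        if active >= min_inputs:
            output_spikes.append(t)
    return np.array(output_spikes)

# A narrow vs. a broad coincidence window reads out different structure
# from the same input population.
fast = coincidence_readout(spike_trains, window=0.002)
slow = coincidence_readout(spike_trains, window=0.020)
print(len(fast), len(slow))
```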