I'm really excited about our new model documentation solution at @paperswithcode! In this thread I'm going to show you all the cool, innovative things you can do with this tool 👇 https://twitter.com/paperswithcode/status/1361672297732591617?s=20
Pre-trained models in a library or repository can be indexed, allowing users to search and filter by factors like evaluation metrics and the year of paper publication. Much easier to see what's available in a library.
The majority of model pages come with visualizations of the architecture - so you can see *exactly* how it works from input to output, including tensor dimensions. Below we can visualize the Inception modules in @ChrSzegedy's Inception v3. https://paperswithcode.com/lib/timm/inception-v3
Because of the "model-is-code" paradigm of modern DL frameworks, code and weights are tightly coupled. So we link directly to the location in the library where the model is loaded. Just click the code button! (We also link to the paper and weights).
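To make the coupling concrete, here's a minimal sketch (not the Papers with Code implementation itself) of how a single model name in timm resolves to both the architecture code and its pretrained weights. It assumes torch and timm are installed.

```python
# Minimal sketch, assuming `pip install torch timm`: in timm, one call builds
# the architecture *and* fetches its pretrained weights - the code/weights
# coupling described above.
import timm

model = timm.create_model("inception_v3", pretrained=True)  # downloads weights on first use
model.eval()  # ready for inference
```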
We index headline information such as parameters, FLOPs, training data, resources, and training time. This gives you a sense of what you'd need if you want to train the model yourself.
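As a rough illustration (not how our index is built), you can check one of these headline numbers, the parameter count, for any timm/PyTorch model locally:

```python
# Hedged sketch: count parameters for a timm model on your own machine.
# FLOPs, training data and training time need extra tooling, so only the
# parameter count is shown here.
import timm

model = timm.create_model("inception_v3", pretrained=False)  # no weight download needed
n_params = sum(p.numel() for p in model.parameters())
print(f"inception_v3: {n_params / 1e6:.1f}M parameters")
```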
We also index the training techniques and architectural motifs. This really aids understanding. Looking at a ResNeSt model and don't know what split attention is? Click the link and you'll go straight to our methods encyclopedia explaining how the building block works!
Our results section allows you to compare hundreds of models in a single view on axes like performance metrics, efficiency metrics and hyperparameters. It lets you see cool things such as the effect of scaling an architecture; see EfficientNet below: https://paperswithcode.com/lib/timm/tf-efficientnet
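For a flavour of that scaling comparison (a local sketch, not the results view itself), you can enumerate timm's TF EfficientNet variants and compare their parameter counts. Exact model names depend on your timm version.

```python
# Hedged sketch: instantiate each tf_efficientnet_b* variant without weights
# and print its parameter count to see how the architecture scales up.
import timm

for name in sorted(timm.list_models("tf_efficientnet_b*")):
    model = timm.create_model(name, pretrained=False)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.1f}M params")
```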
But performance and efficiency are one thing; what about how many epochs the model was trained for? The weight decay? The initial learning rate? We index all of that too...
Structured model docs are still very much a beta project. Stay tuned for updates. Feedback is also very welcome!