    Rigging the Lottery: Making All Tickets Winners

    Rigging the Lottery: Making All Tickets Winners is a paper published at ICML 2020 with Utku Evci, Trevor Gale, Jacob Menick, and Erich Elsen. We introduce an algorithm for training sparse neural networks that keeps a fixed parameter count and a fixed computational cost throughout training, without sacrificing accuracy relative to existing dense-to-sparse training methods. You can read more about it in the paper and in our blog post.
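    To make the idea concrete, here is a minimal NumPy sketch of the kind of connectivity update the algorithm performs: periodically drop the smallest-magnitude active weights and regrow the same number of connections where the dense gradient magnitude is largest, so the number of active parameters stays constant. Function and variable names are illustrative, not the paper's reference implementation.

    ```python
    import numpy as np

    def rigl_update(weights, grad, mask, drop_frac=0.3):
        """One sketch of a drop/grow mask update with a fixed connection budget.

        weights, grad, mask are arrays of the same shape; mask is 1 where a
        connection is active. Returns updated (weights, mask).
        """
        active = mask.astype(bool)
        n = int(drop_frac * active.sum())
        if n == 0:
            return weights, mask
        # 1) Drop: zero out the n active weights with the smallest magnitude.
        w_mag = np.where(active, np.abs(weights), np.inf).ravel()
        drop_idx = np.argsort(w_mag)[:n]
        new_mask = mask.ravel().copy()
        new_mask[drop_idx] = 0
        # 2) Grow: re-enable the n inactive positions with the largest
        #    dense-gradient magnitude (just-dropped slots remain eligible
        #    in this simplified sketch).
        g_mag = np.where(new_mask.astype(bool), -np.inf, np.abs(grad).ravel())
        grow_idx = np.argsort(g_mag)[-n:]
        new_mask[grow_idx] = 1
        # Newly grown connections start from weight zero.
        new_weights = weights.ravel() * new_mask
        return new_weights.reshape(weights.shape), new_mask.reshape(mask.shape)
    ```

    Because exactly as many connections are grown as are dropped, the sparsity level (and hence the parameter count) is the same before and after every update, which is the property the post highlights.
    
    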

    September 16, 2020