
Yufan Feng @ NeurIPS

@joyce_xxz

MSc @UCalgaryML | Previously @ShanghaiTechUni

Joined March 2023
Similar Users

@JerryYin777
Senior @UMNComputerSci | Contributor of LLM Yi & SenseNova5o | ex-Intern @ https://t.co/18PtAfVKtu @ THUNLP @ SenseTime | MLSys & LLM | CUDA | Looking for MLSys 25fall PhD position

@Evens1sen
Software Developer | Photographer | Electronics Enthusiast | Classical & Cantopop Lover

@_MattJiang_
CSE Ph.D. student in systems @UMich | Life & Computer Systems Enthusiast

@murezsy_
PhD Student @OhioStateCSE @mvapich #MVAPICH #AI+HPC #HPC #MLsys #ComputerVision

@Albertc40248219
Graduate student at Polytechnic University. Research fields: autonomous driving, computer vision, deep learning 🤓🤩🥳 wanna get a PhD position

@xuefen19
One blessed with another chance to breathe ❤

Yufan Feng @ NeurIPS Reposted

1/3 Today, an anecdote shared by an invited speaker at #NeurIPS2024 left many Chinese scholars, myself included, feeling uncomfortable. As a community, I believe we should take a moment to reflect on why such remarks in public discourse can be offensive and harmful.

[Image: drjingjing2026's tweet]

Yufan Feng @ NeurIPS Reposted

I'm proud that the @UCalgaryML lab will have 6 different works being presented by 6 students across #NeurIPS2024, in workshops (@unireps, @WiMLworkshop, MusiML) and the main conference! 🎉 Hope to see you at our posters/talks 🤓, full schedule at calgaryml.com 🧵(1/4)

[Image: yanii's tweet]

Yufan Feng @ NeurIPS Reposted

✨Our new @unireps paper tries to answer why the Lottery Ticket Hypothesis (LTH) fails to work for different random inits through the lens of weight-space symmetry. We improve the transferability of LTH masks to new random inits leveraging weight symmetries. 🧵(1/6)

[Image: JainRohan16's tweet]
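The weight-space symmetry the thread refers to can be sketched in a few lines. This is an illustrative toy (not the paper's method, and all variable names are made up): permuting the hidden units of a two-layer MLP leaves its function unchanged, and a lottery-ticket mask found for one set of weights can be carried to the permuted weights by permuting the mask the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer MLP: f(x) = W2 @ relu(W1 @ x). Permuting the hidden units
# (rows of W1, matching columns of W2) leaves f unchanged -- this is
# the weight-space (permutation) symmetry.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

relu = lambda z: np.maximum(z, 0.0)
f = lambda A, B: B @ relu(A @ x)

perm = rng.permutation(4)
W1_p, W2_p = W1[perm, :], W2[:, perm]
assert np.allclose(f(W1, W2), f(W1_p, W2_p))  # same function

# A sparse (lottery-ticket) mask travels with the weights under the
# same permutation, so the masked networks also compute the same thing.
mask1 = (rng.random(W1.shape) > 0.5).astype(float)
mask2 = (rng.random(W2.shape) > 0.5).astype(float)
assert np.allclose(
    f(W1 * mask1, W2 * mask2),
    f(W1_p * mask1[perm, :], W2_p * mask2[:, perm]),
)
print("permutation preserves both the function and the mask")
```

The transfer question in the thread is harder than this toy: two independent random inits are not exact permutations of each other, so the mask must be aligned to an approximately matching permutation first.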

Yufan Feng @ NeurIPS Reposted

Knowledge #distillation is a widely used model compression method. We explore the nuanced impact of temperature on distilled models' #fairness ⚖️. Our findings show distilled students are less fair than their teachers at typical temps, but can be more fair in some instances…🧵👇

[Image: Aidamo27's tweet]
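For context on the temperature knob being studied: in standard knowledge distillation the student is trained on the teacher's softened probabilities, softmax(logits / T). A minimal sketch (illustrative values, not the paper's experiment) of how T reshapes the soft targets:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

teacher_logits = np.array([4.0, 1.0, 0.5])

for T in (1.0, 4.0):
    p = softmax(teacher_logits / T)
    print(f"T={T}: {np.round(p, 3)}")
# At T=1 nearly all probability mass sits on the top class; at higher T
# the soft targets expose the teacher's relative confidence in the
# non-target classes ("dark knowledge"), which is what the student learns
# from -- and, per the tweet, what can shift the student's fairness.
```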

Yufan Feng @ NeurIPS Reposted

Come chat with us about our work, Dynamic Sparse Training with Structured Sparsity, tomorrow at #ICLR2024 from 4:30-6:30PM in Hall B #47. Not in Vienna? No problem. Check out our poster and a short video describing the work here: iclr.cc/virtual/2024/p… More info in 🧵

"Dynamic Sparse Training with Structured Sparsity" (openreview.net/forum?id=kOBkx…) was accepted at ICLR 2024! DST methods learn state-of-the-art sparse masks, but accelerating DNNs with unstructured masks is difficult. SRigL learns structured masks, improving real-world CPU/GPU timings

[Image: mikelasby's tweet]
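As a hedged illustration of what "structured" buys here (this is not the SRigL training algorithm, just a toy mask construction): a constant fan-in mask keeps the same number of nonzero weights in every output row, so each neuron reads the same number of inputs. That regular layout is what real CPU/GPU sparse kernels can exploit, unlike fully unstructured masks.

```python
import numpy as np

def constant_fan_in_mask(W, k):
    """Binary mask keeping the k largest-magnitude weights per output row."""
    mask = np.zeros_like(W)
    idx = np.argsort(-np.abs(W), axis=1)[:, :k]  # top-k columns per row
    np.put_along_axis(mask, idx, 1.0, axis=1)
    return mask

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
mask = constant_fan_in_mask(W, k=2)
print(mask.sum(axis=1))  # every row keeps exactly 2 weights
```

Because every row has exactly k nonzeros, the sparse weights can be packed into a dense (rows × k) array plus an index array, which is how structured sparsity turns into real-world speedups.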



Yufan Feng @ NeurIPS Reposted

Text to video is here. And it is at the demonic phase.

