Show HN: Control your X/Twitter feed using a small on-device LLM
We built a Chrome extension and iOS app that filters Twitter's feed using Qwen3.5-4B for contextual matching. You describe what you don't want in plain language, and it removes posts that match semantically rather than by keyword.
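The core idea can be sketched in a few lines: put the user's plain-language rule and each post into a classification prompt, and hide posts the model flags. This is a minimal illustration, not bouncer's actual code; the prompt wording and function names are hypothetical, and `classify` stands in for any small-LLM completion call.

```python
# Illustrative sketch of LLM-based semantic feed filtering.
# PROMPT_TEMPLATE and all function names are hypothetical.

PROMPT_TEMPLATE = (
    "You are a content filter. The user does not want to see posts about: {rule}\n"
    "Post: {post}\n"
    "Does this post match the user's filter? Answer YES or NO."
)

def should_hide(post: str, rule: str, classify) -> bool:
    """classify(prompt) -> str is any completion call into a small LLM."""
    answer = classify(PROMPT_TEMPLATE.format(rule=rule, post=post))
    return answer.strip().upper().startswith("YES")

def filter_feed(posts, rule, classify):
    """Keep only the posts the model does not flag as matching the rule."""
    return [p for p in posts if not should_hide(p, rule, classify)]
```

Because the model sees the rule and the post together, it can catch paraphrases that a keyword mute list would miss.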
What surprised us is that, because Twitter's ranking algorithm adapts to what you engage with, consistent filtering starts reshaping the recommendations over time: posts you never see are posts you never engage with, so you're implicitly signaling your preferences to the algorithm. For some of us it "healed" our feed.
Inference currently runs on our own servers, with an experimental on-device option, and we're working toward fully on-device execution to remove that dependency. Latency is acceptable on most hardware but not great on older machines. We collect no data; everything except the model call runs locally.
It doesn't work perfectly (figurative language trips it up), but it's meaningfully better than muting keywords, and we use it ourselves every day.
It's also promising that local/open models, as capability density improves, can now start giving us more control over the algorithmic agents in our lives.
I love the idea of this. Twitter used to be the go-to place for real-time community knowledge, but the algorithm has started pushing content that I don't want, and I would love to be able to tailor it more to my needs. How are you approaching the on-device option? I'd definitely be most interested in using this in a way that doesn't send information to external servers. Thank you!
On the browser extension side, we're forking WebLLM, adding support for more modern multimodal models, and doing some optimization so that an M4 chip can keep up with scrolling. You can actually use it in bouncer today by going into settings and turning on the experimental local models.
On the mobile side, we're working to get 4B models running on the Apple Neural Engine. The main bottleneck on mobile is actually battery life. Neither is quite optimized enough to formally brag about, but we're almost there!
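To keep up with scrolling, classification has to happen off the scroll path. One common pattern (sketched here in Python for illustration; the actual extension is JavaScript on top of a WebLLM fork, and these names are hypothetical) is to fan posts out to the model with a bounded number of in-flight requests so the UI never blocks:

```python
import asyncio

# Illustrative sketch: classify posts concurrently with a cap on in-flight
# model calls, returning (post, verdict) pairs in feed order.

async def classify_stream(posts, classify, max_in_flight=4):
    sem = asyncio.Semaphore(max_in_flight)  # bound concurrent model calls

    async def one(post):
        async with sem:
            return post, await classify(post)

    # gather preserves input order, so results line up with the feed
    return await asyncio.gather(*(one(p) for p in posts))
```

Posts render immediately and get hidden (or kept) as verdicts arrive, which is what makes a 4B model tolerable on scroll-heavy pages.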
I wish you luck! This is a clever and creative approach. I feel like we are inching towards on device solutions and I love seeing people work the problem like this.