we really all are going to make it, aren’t we? 2×3090 setup.


By AI Maestro May 14, 2026 1 min read


I was blown away by a Reddit post about a ‘club-3090’ setup: in effect, a club of hobbyists running local LLMs (large language models) on RTX 3090 cards without relying on cloud services. This has significant implications for the future of AI development and deployment.

  • The post reports that even relatively low-budget setups, including one running under WSL2, can reach prompt-processing (PP) throughput of roughly 4,000 tokens/s and generation rates around 113 tokens/s, numbers the poster found competitive with typical cloud-based endpoints.
  • One user found that moving from WSL2 to a native dual-boot Ubuntu install nearly doubled throughput, even with no NVLink in use, showing that significant gains are available from simple configuration changes alone.
  • This local-AI future is not just about speed; it also offers more control over data handling and privacy. The poster uses their own model for monkey patches and code reviews, highlighting utility well beyond plain text generation.
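The two throughput figures quoted above are both simple rates: tokens processed divided by elapsed seconds, measured separately for the prompt-processing phase and the generation phase. A minimal sketch (the timing numbers below are illustrative assumptions, not measurements from the post):

```python
def throughput(n_tokens: int, seconds: float) -> float:
    """Tokens processed per second over a timed phase."""
    return n_tokens / seconds

# Hypothetical timings for a 2x3090 run (made up for illustration):
prompt_tokens, prompt_time = 8192, 2.05   # prompt processing (PP) phase
gen_tokens, gen_time = 512, 4.53          # token generation (TG) phase

pp_rate = throughput(prompt_tokens, prompt_time)  # roughly 4000 tok/s
tg_rate = throughput(gen_tokens, gen_time)        # roughly 113 tok/s
print(f"PP: {pp_rate:.0f} tok/s, TG: {tg_rate:.0f} tok/s")
```

Keeping the two phases separate matters: prompt processing is compute-bound and batched, while generation is memory-bandwidth-bound and sequential, so a single blended tokens/s number hides where a setup is actually fast or slow.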

These developments suggest that we may be on the cusp of a new era where AI models can operate locally with high performance and efficiency, potentially democratizing access to advanced AI capabilities. The question remains whether smaller models could achieve frontier-level intelligence within the next year or two.


### Takeaways:
– Local LLMs are now feasible even on modest hardware configurations.
– Performance improvements have been substantial, making these setups comparable to cloud-based solutions in terms of efficiency and speed.
– The shift towards local AI offers enhanced security and control over data handling.


Originally published at reddit.com. Curated by AI Maestro.
