Make VS scalable by switching to 64-bit
No matter how fast and efficient VS becomes, we will eventually hit some limit.
I've been hitting the memory limit since VS 2003: at around 1.3 GB of memory usage, VS started throwing out-of-memory exceptions. That was with 20-30 projects.
Nowadays we have more than 70 projects in the solution, and VS 2010 doesn't yet hit the x86 memory limit (on 64-bit Windows). Eventually, though, we'll reach the number of projects that becomes the tipping point for VS.
A 64-bit build of VS would let us simply buy more memory and keep working with that solution. 16 GB machines are not that expensive today.
With VS 2015 Update 2, VS stopped crashing when it exceeded the memory limit at a little over 2 GB; instead it started displaying a "Low memory detected" message... I have 16 GB of RAM, but VS can access only a little over 2 GB because it is a 32-bit app. Please make it 64-bit so those of us with bigger projects can use all the goodies that VS has! Thx
When running on a 64-bit machine, it can be extremely annoying that Visual Studio can only access ~4 GB of RAM, which can cause issues with larger projects. Please create a 64-bit version of Visual Studio!
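For context on the numbers quoted in these comments: the ceiling comes from the 32-bit pointer width. A 32-bit process can address at most 2^32 bytes (4 GiB) in total, and on Windows user code gets 2 GiB of that by default, or up to 4 GiB when the executable is marked LARGEADDRESSAWARE and runs on 64-bit Windows. A minimal sketch of the arithmetic (illustrative only, not related to VS internals):

```python
import struct

# Pointer width of the current process, in bits (32 or 64).
pointer_bits = struct.calcsize("P") * 8

# Theoretical address-space ceiling for that pointer width.
address_space_bytes = 2 ** pointer_bits

gib = 1024 ** 3
print(f"{pointer_bits}-bit process: {address_space_bytes / gib:.0f} GiB addressable")

# A 32-bit process tops out at 4 GiB no matter how much physical
# RAM the machine has; the remaining RAM is simply unreachable.
assert 2 ** 32 == 4 * gib
```

This is why a 16 GB or 64 GB workstation doesn't help a 32-bit IDE: the limit is address space, not installed memory.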
This could prevent crashes and hangs in larger Visual Studio projects.
Get rid of the thunking issues, the dual DLLs, the required remote debugging, and the 32-bit default project nonsense. Upgrade the package to native 64-bit.
In the modern 64-bit desktop computing world, 32-bit apps are woefully inadequate and archaic. Show a little innovation and keep up with the times. Raw GUI performance is not everything these days; that is a really lame excuse not to rebuild for 64-bit.
Most Microsoft server products that we develop against now run as 64-bit. However, the VS IDE is still only available as a 32-bit application.
Integrating the IDE with these server products, e.g. automating development tasks when targeting SharePoint, BizTalk, etc., is now often unfeasible because the object models or PowerShell commands that you need to invoke won't run in a 32-bit process.
By eliminating the 32/64-bit boundary (WOW64) between the IDE and Microsoft server products, integration on 64-bit systems would be much easier.
I have to load some huge projects that would benefit from access to the full available RAM.
Many thanks to those of you who provided the initial suggestion and all the related comments. We wanted to take a moment to provide further background on this issue and the context that informed our decision to close it.
Firstly, we recognize the spirit that is largely driving the request to move the Visual Studio IDE to be a 64-bit application: in particular, the ability to work with the largest solutions without performance constraints. We've made it a goal for the division to drive performance improvements across the product, starting with Visual Studio 2015 Update 3, where we will be shipping many Roslyn performance improvements that particularly reduce memory consumption with large projects.
As we’ve observed telemetry from Visual Studio usage by developers who have opted in to share feedback with us, we’ve identified a number of other areas where we can improve performance. Some examples of areas we’re working on in response to this data include:
• Performance improvements to the C# background code analysis engine that collects errors and warnings,
• Performance improvements to the C# GoTo Implementation and Find All References,
• Enhanced code analyzer diagnostic v2 engine,
• Better discoverability of UI to disable full solution analysis,
• Better messaging to users about the circuit breaker the system uses to automatically turn off compute-intensive operations, like full solution analysis, when Visual Studio is under stress.
Another feature we’re adding to Visual Studio “15” that will enable it to support the very largest codebases is Open Folder (https://blogs.msdn.microsoft.com/visualstudio/2016/04/12/open-any-folder-with-visual-studio-15-preview), which enables any folder to be opened without first creating a solution or project. We’re using this feature internally as part of the development of Visual Studio with great results, and you can now check it out in the latest preview builds (https://www.visualstudio.com/downloads/visual-studio-next-downloads-vs).
Lastly, the work we are doing to create a lightweight installer will help reduce the number of components installed by default, which will also improve performance: https://blogs.msdn.microsoft.com/visualstudio/2016/04/01/faster-leaner-visual-studio-installer.
So why not just move Visual Studio to be a 64-bit application? While we’ve seriously considered this porting effort, at this time we don’t believe the returns merit the investment and resultant complexity. We’d still need to ship a 32-bit version of the product for various use cases, so adding a 64-bit version of the product would double the size of our test matrix. In addition, there is an ecosystem of thousands of extensions for Visual Studio (https://visualstudiogallery.msdn.microsoft.com) which would need to also port to 64-bit. Lastly, moving to 64-bit isn’t a panacea – as others have noted (https://blogs.msdn.microsoft.com/ricom/2016/01/11/a-little-64-bit-follow-up/), unless the work doesn’t fit into a 32-bit address space, moving to 64-bit can actually degrade performance.
So instead of a large porting effort to move the entirety of the IDE to 64-bit, we are instead working to target components that are resource-constrained today and move them out-of-process, which also offers other engineering benefits including better code sharing and more agile development; and we will continue to monitor overall system impact, not just the IDE impact.
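The out-of-process approach described above can be sketched in miniature: the host hands work to a separate worker process over a pipe, so the worker's memory never counts against the host's own address space (and the worker could itself be a 64-bit binary). A toy illustration under assumed names — the `analyze_out_of_process` helper and its token-counting "analysis" are hypothetical, not how Visual Studio actually implements this:

```python
import json
import subprocess
import sys

# Worker: runs as a separate process with its own address space.
# It reads one JSON request from stdin and writes a JSON reply.
WORKER_SOURCE = r"""
import json, sys
request = json.load(sys.stdin)
# Hypothetical "analysis": count whitespace-separated tokens.
tokens = request["source"].split()
json.dump({"token_count": len(tokens)}, sys.stdout)
"""

def analyze_out_of_process(source_text: str) -> dict:
    """Send work to a worker process instead of doing it in-process."""
    result = subprocess.run(
        [sys.executable, "-c", WORKER_SOURCE],
        input=json.dumps({"source": source_text}),
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

print(analyze_out_of_process("int main ( ) { return 0 ; }"))
# -> {'token_count': 9}
```

The trade-off the response alludes to is visible even here: serializing requests and replies across a process boundary adds overhead, in exchange for isolating the worker's memory use from the host.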
In light of the above, we are closing this item for now since we are not moving the IDE to 64-bit in the next product release. Naturally, we will continue to consider this for future versions, but right now we don’t believe the scales tip in favor of this work. Of course, we remain heavily invested in 64-bit for runtimes, compilation, and profiling.
One last word about performance: when you have specific issues, it’s now really easy to use the feedback tools right inside Visual Studio to send us performance traces that can help us identify any potential issues. See instructions on how to do this here: https://msdn.microsoft.com/en-us/library/mt280277.aspx.
As ever, thank you for your feedback – keep it coming. We care about building the best and most productive development environment regardless of your target platform or development system.
Best wishes, Visual Studio Team
James Hood commented
64th comment! Woohoo!
Look guys, we're fine with you dropping 32-bit devenv support. Just think of it as dropping VB6. Sure, you'll get UserVoice suggestions to "bring back 32-bit devenv" for eternity, but the majority of your developers will be happy with the change.
I agree that 64-bit isn't a panacea for performance, in fact I just perused an article in the April 1993 issue of Microsoft Systems Journal that was discussing the introduction of 32-bit Win32s (on Windows 3.1) and it made many of the same points. Obviously a memory leak or a poorly-selected algorithm or architecture is going to be a problem whether it is in-proc or not. Moving items out of proc is really just a stop-gap measure for 32-bit memory space exhaustion and adds some performance overhead and data duplication.
VS itself seems to make do with the 32-bit memory budget just fine. But ReSharper is a memory hog.
Resharper is so valuable that all serious developers have it installed.
Resharper *should* move its data out of process but apparently this is very hard to do (as they have not done so for many years).
I believe that most developers do not depend on native-code add-ins, so add-ins breaking is not a concern. Most managed extensions should be portable to 64-bit very easily. Since a 64-bit VS version would see *great* and immediate adoption, all relevant plugins would be ported very quickly.
Also, I would sacrifice any plugin and any feature at all for a 64 bit version.
As someone who often runs up against memory limits in VS, despite having a 64GB workstation, this flippant and extremely belated response to a very reasonable, much needed, and long-overdue feature infuriates me.
MS's arguments against moving to 64-bit are straight from 2009.
Reduced performance from a move to 64-bit? Ok, just offset that with these other improvements you say you are making, done and done.
Doubling the test matrix? Just make 2015 the last 32-bit version. Anyone who has a use case for 32-bit can just stick with 2015 or earlier. Just like anyone with a use case for 16-bit can stick with VB4, lol.
Many active extension authors would jump at the chance to port their extensions to 64-bit, because they are feeling the same pain as the rest of us. How many times in the past has MS rebased the extension ecosystem? More than once by my recollection.
I think y'all are so used to the rancid, 32-bit flavor of your dog food that you don't notice like the rest of us how disgusting it tastes.
Oh, and closing this legitimate request and killing voting on it was inappropriate.
Kasper Østergaard commented
Nice way to ignore Rico Mariani's own semi-rebuttal to his own article: https://blogs.msdn.microsoft.com/ricom/2016/01/04/64-bit-visual-studio-the-pro-64-argument/
This is a ******** decision and argument.
One of the biggest .NET projects in Hungary can't use some of Visual Studio's solution-wide code analysis and diagnostic tools, because not only the whole solution with its 15 projects, but even some individual projects, are bigger than the tools can handle; the Studio just says "low memory detected" and the tool is disabled for the solution.
The problem is that while the development machine has 16 GB of RAM and 64-bit Windows 10, the Studio only eats ~3.5-4 GB, so the system still has more than 6-8 GB of RAM free while the IDE reports low memory.
Somewhere under a million lines of code, the IDE just dies :S
The fact that it took you 5 ******* years to respond, shows how little respect you have for your customers.
Apple isn't abusive, and I welcome every last Microshit user to switch. you'll be happy you did.
and this reluctance to fix your ******* bugs is why OS X and linux are gaining ground quarter after quarter, for a decade at this point.
I can't wait for this terribly managed company to bite the dust, and I can't ******* wait to laugh in Gates' face at how he thought he was brilliant for hiring lazy people, and they're his ******* undoing.
Good ******* riddance.
I know it's cool to hate options in Microsoft-land these days, but what if you provided two binaries, one 32, one 64 bit? Users then could choose between increased performance or no crashes.
I have migrated my big projects to Bjam, without VS, and it's very fast. VS has become slower and slower since 6.0; it's time to select better tools.
Jon Miller commented
Steve B commented
This is a joke. Why does EVERY OTHER PROGRAM IN EXISTENCE see performance benefits from moving to 64-bit, but Visual Studio would not? You guys are really drinking the Kool-Aid if you honestly believe the bull$hit you're shoveling.
I.e. "we screwed up the code so badly we don't want to spend time fixing it to be 64-bit ready but don't want to admit it"
Daniel Pamich commented
Please listen to your customers. Visual Studio just doesn't work as 32-bit software; I have it crash 3-4 times daily due to it running out of memory. Small projects work fine, large ones crash regularly.
If I have to unload projects via VS Funnel, I might as well go back to using a text editor, since all the features I now need in an IDE don't work. In fact, the mere existence of VS Funnel proves there is a major issue with 32-bit VS!! If VS worked as it should, there wouldn't need to be these desperate workarounds!!!!
SQL Server has gotten massively faster since it moved to 64-bit; maybe their team can give your team some tips on high-performance data structures and 64-bit code.
I don't care if VS is slightly slower when it moves to 64bit, I just want an IDE that is stable and works with large solutions.
Please reconsider this decision, so your customers can have a working IDE.
Andreas Erben commented
Is this a joke? This is 2016.
Let me add: I do not think that Microsoft should be in the business of telling people to use hacks like swapping projects out of memory.
Pretty much every integrated environment that works with large amounts of assets has a 64-bit version. An increasing amount of software in the industry is produced only in a 64-bit version; for Microsoft server products this is pretty much universally the case. And frankly, today the memory footprints of both desktop and server systems are beyond the 32-bit threshold anyway; even a standard notebook configuration nowadays has 8 GB. Only niche systems run on less than 64-bit.
If there are benefits to having parts of it run in 32-bit, or to offering a 32-bit option, then encapsulate those parts instead of going the other way and staying restricted for good by a 32-bit limit.
The response feels, sorry to say, like an elaborate cop-out.
On one hand you talk about performance issues (which I am sure there are ways to address, so give people a 32-bit mode or subsystem if they want it); on the other hand you have absolute limitations that cannot be overcome in the 32-bit world.
I am aware that VS is a behemoth, and there are many aspects to consider. Still, back to the original point: in 2016, to look at an application and even remotely imply that it is *better* as a 32-bit application, when even smartphones are switching to 64-bit, feels wrong.
James Johnston commented
"we suggest you to leverage the VSFunnel extension ... by unloading projects not being used in a smart way"
Why does this remind me of MS-DOS programming in the 1980s? Maybe some of them still work for VS Team? https://en.wikipedia.org/wiki/Overlay_(programming) --- "An overlay manager ... loads the required overlay from external memory into its destination region when it is needed. Often linkers provide support for overlays."
I thought we got past this in the PC world with Windows NT & virtual memory in 1993. Operating systems have this nifty feature called "virtual memory" and "swap space" so that users don't need to be concerned with manually loading & unloading data to fit into limited "memory." You just need to increase the address space to 64 bits to continue to grow.
Please don't force the bad old days of MS-DOS-style overlay programming on us!
Bruce Shankle commented
Could someone at Microsoft forward this to Don Box? I'm curious if he agrees with this. Sounds like an absurd cop-out to me. There are many benefits to a 64-bit address space. Why would you cite performance degradation 'possibilities' as a reason not to do something that prevents the product from being completely unusable for large-scale things?
At the very least fire up razzle and do a 64-bit build yourself and share some 'real' numbers with us. We're not stupid.
Absurd! Heresy! Blasphemy!
I had to count to 10 before commenting on this post, I was so angry, especially after reading ricom’s technically and realistically weak argumentation on why a studio is better than a mansion.
Every dev in my company has 64 GB of RAM in their desktop; that costs nothing compared to their paychecks. And you make them wait on an hourglass while you swap? That's not even to mention the context switching and frustration. I want to keep everything in memory just because it is there, it is empty, it is cheap, and it makes me more productive. Stop constraining me! Let me fly!
You are not listening to your customers. They are trying to tell you something. You are then spinning it around back in their faces as if they are doing something wrong. It is good for you to come up with a suggested work around. However, that will not fly for long. They will want a solution to the problems they are TELLING you. Listen to them or they will just move on to another platform that does support them.
This may be difficult to hear, but your customers are telling you what to prioritize. So you have a few choices here. You can optimize VS to work much better within 2 GB, make ReSharper stop blowing away all the memory, and figure out how to keep the profiler from consuming so much. Your customers *CAN'T* do that; they do not have access to the source code like you do. They are relying on *you* to fix the perf and memory issues, not to give them flimsy excuses. That is why they buy those nice $3k MSDN licenses. Or you can tell them 'oh well, not that big of a deal', and they will just move on to other platforms that can do what they need.
There's a scenario right now that VS can't handle: profiling applications that run for over an hour. The profiler in VS crashes because it runs out of memory.
Bill Hoag commented
I agree with the other commenters that a 32-bit architecture is too constraining for Visual Studio. As an everyday user of VS (on a large C++/C# solution using ReSharper C#; VS usually tips over with ReSharper C++), I have to endure its sluggishness and restarts due to the memory straitjacket. The features that make VS great (IntelliSense and all of the nifty stuff when right-clicking on code) need to keep LOTS of reference data in RAM, too much for a 32-bit architecture. Please reconsider your decision.