When we hit the debug button in a Visual Studio SSIS project - does it run in 32 or 64 bit mode - and is a 32-bit provider compatible with 64-bit run mode? - visual-studio

I have a new VM on which I have installed Visual Studio.
I created a new SSIS project and am trying to use the OLE DB data source task to access an .accdb MS Access file.
However, I could not see the provider, so I installed the 32-bit Access runtime. Now I can see the provider. I have read that, since Visual Studio is a 32-bit tool, we have to install the 32-bit Access runtime; if we install the 64-bit runtime instead, we will not be able to see the provider in the list, because Visual Studio is 32-bit and only shows 32-bit providers.
When I hit the debug button in the Visual Studio SSIS project, it can access the MS Access file. I am now confused and want to ask: when the project is running, does it run in 32 or 64 bit mode? If the answer is 64-bit mode, then how can it access the MS Access file using the 32-bit provider? Does that mean a project running in 64-bit mode can use a 32-bit runtime/provider?
Assuming no settings are changed, when the debug button is pressed, does the program execute in 32 or 64 bit mode? Does the OS bitness have any impact on this?

You are quite right and quite close here.
Since Visual Studio itself is a 32-bit application, here is how it plays out.
If your project is set to x86 (32-bit), or set to Any CPU:
Then hitting F5 to run + debug (or Ctrl-F5 to run the .exe without debugging) will give you a 32-bit running program. Some caution here: even with Any CPU, launching from VS with F5 still gets you a 32-bit process.
But, if you go to the Windows command line?
Well, if you launch the 64-bit Windows command prompt (the default on most computers), then an Any CPU project will run as 64-bit - and any unmanaged code such as Access, or any other Windows code used for COM automation, will break.
If you launch the 32-bit Windows command prompt, then that same Any CPU build runs as 32-bit.
So, from VS, F5 is always 32-bit.
From the Windows command line, it depends.
So, what does the above suggest? Well, if your intention is to have and use a 32-bit process, then force/set the project to that intention. That way no "by chance" outcome will fool you, and your application will always run according to the project settings.
So, in such cases, I avoid Any CPU.
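If you want to see what you actually got at runtime, a quick check like the following (a minimal sketch using the standard .NET Environment class, nothing project-specific assumed) prints the bit size the process ended up with:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // True if the current process is running as 64-bit.
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);

        // True if the OS itself is 64-bit (a 32-bit process can still run on it under WOW64).
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);

        // Pointer size: 4 bytes = 32-bit process, 8 bytes = 64-bit process.
        Console.WriteLine("IntPtr.Size:    " + IntPtr.Size);
    }
}
```

You can use this to confirm what the debugger (or a given command prompt) actually launched for your particular project settings.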
Now, what happens if you force the project to x64?
Well, if you pick x64 (and not Any CPU, and not x86)?
Then hitting F5 (or Ctrl-F5) WILL run the application as a 64-bit process. I am not sure exactly how VS does this, but it has some kind of "bridge" that lets the 32-bit VS talk to the 64-bit process being debugged.
So, if you force the project to x64, it will always run as 64-bit - including from VS.
So, this means:
If you set the project to x64 and use a connection builder in settings?
You can use the connection builder, but since (we assume) you are using the x64 version of Access, the test connection button in VS will NEVER work.
But if you run the code as x64, and have the x64 Access engine installed, then it will work. So ONLY the test connection button fails - and that is due to VS being 32-bit.
And if you have the wrong version of Access for your project (say the 32-bit engine) while your project is set to x64? Then the connection builders will work and EVEN the test connection will work (because the test connection from VS is ALWAYS 32-bit). The effect is that you build a connection, hit test connection, and it says good - but when you run the project (F5, Ctrl-F5), the project runs as x64 and it fails (this example assumes only the 32-bit Access engine is installed).
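To make that failure mode concrete, here is a minimal sketch (the file path is made up, and it assumes you are going through the ACE/Access database engine): the open call only succeeds when the installed engine's bit size matches the running process, regardless of what the VS test connection button reported.

```csharp
using System;
using System.Data.OleDb;

class AccessOpenTest
{
    static void Main()
    {
        // Hypothetical path to an .accdb file - replace with your own.
        string connect =
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Test.accdb;";

        Console.WriteLine("Running as 64-bit: " + Environment.Is64BitProcess);

        using (var conn = new OleDbConnection(connect))
        {
            // Throws "provider is not registered" when the installed ACE engine's
            // bit size does not match the bit size of this process.
            conn.Open();
            Console.WriteLine("Opened OK - provider bit size matches the process.");
        }
    }
}
```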
Now, this is about building and writing code from VS. I don't know if an SSIS package built with Visual Studio works differently. But we do NOT want to confuse Visual Studio (VS) with, say, SQL Server Management Studio or other systems - they don't have the project "CPU" or "bit" setting options that you have for a VS project.
So I very much suggest that if your intention is 32-bit operation, you ALWAYS force the VS project to x86; then, come time to launch that .exe from Windows, it will never run with a wrong or unexpected bit size.
I have not tested an SSIS integration project, but if you are building it from VS, then once again, force/set the project bit size.
In effect, to remove the "chance" of the wrong bit size, force the project to the bit size you as the developer intended - that way it is not left to chance.
There ARE times when you want to use Any CPU. A really good example is when you build class library code to share among projects. In that case, Any CPU is a good choice, since then EVEN projects forced to x86 or x64 can reference those external assemblies and library code.
However, if you force those external assemblies to a given bit size, then only projects running at that bit size can consume such libraries.
.NET code (managed) is different from unmanaged code. .NET code has the ability to run as either 32-bit or 64-bit if you choose Any CPU. But unmanaged code (external non-.NET code such as Access) can't change on the fly like .NET code can. And I use the term "on the fly" loosely here, since ONCE a .NET program starts running as 32-bit or 64-bit, it remains that bit size until terminated.
So, Any CPU is fine for external class library code you write and want to include in any project. But if the main project .exe has to use SOME external non-.NET (unmanaged) code, then you would do very well to force the project bit size settings.
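If you are not sure how a given .NET class library was compiled, a small check like this (a sketch for .NET Framework; the DLL path is made up) reports whether it is MSIL/Any CPU, x86, or x64:

```csharp
using System;
using System.Reflection;

class AssemblyBitness
{
    static void Main()
    {
        // Hypothetical library path - point this at the assembly you want to inspect.
        var asm = Assembly.ReflectionOnlyLoadFrom(@"C:\Libs\SharedLibrary.dll");

        PortableExecutableKinds peKind;
        ImageFileMachine machine;
        asm.ManifestModule.GetPEKind(out peKind, out machine);

        // ILOnly alone = Any CPU; ILOnly + Required32Bit = x86; ILOnly + PE32Plus = x64.
        Console.WriteLine("PE kind: " + peKind);
        Console.WriteLine("Machine: " + machine);
    }
}
```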
And to answer your last question:
If the provider is a .NET one, such as the SQL Server provider, then it is managed code and the CPU setting doesn't matter. It ONLY matters when you start using external code systems that are NOT managed and NOT .NET. MS Access is one common example; so is any Windows C++ code, and even a lot of commercial programs.
For example, Sage/Simply Accounting and QuickBooks? They offer .NET SDKs to interface to those accounting packages, but they only have 32-bit versions of those desktop programs, and they are not .NET programs. So once again, you do well to force the project to x86.
So, no: a 64-bit process cannot consume 32-bit code in-process, nor can you consume external libraries whose bit size doesn't match.
However, if that library code is .NET (managed code) and was compiled as Any CPU, then you have no restrictions - you can use Any CPU, or consume those libraries even when you force the .NET project's CPU setting.
So this whole system ONLY breaks down when you introduce or start using external code libraries.
For example, if you use a .NET Ghostscript library? They ship two versions - x86 and x64 - and thus, just like MS Access, you have to match up the bit size of those external libraries.
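To see what the mismatch looks like from .NET code, here is a hedged sketch - the native DLL name is hypothetical, standing in for any vendor SDK shipped only as 32-bit:

```csharp
using System;
using System.Runtime.InteropServices;

class NativeBitnessDemo
{
    // Hypothetical 32-bit-only native DLL - e.g. a vendor SDK shipped only as x86.
    [DllImport("VendorSdk32.dll")]
    static extern int VendorInit();

    static void Main()
    {
        try
        {
            VendorInit();
        }
        catch (BadImageFormatException ex)
        {
            // What you hit when a 64-bit process tries to load a 32-bit native DLL
            // (or vice versa): "An attempt was made to load a program with an incorrect format."
            Console.WriteLine("Bitness mismatch: " + ex.Message);
        }
        catch (DllNotFoundException ex)
        {
            Console.WriteLine("DLL not found: " + ex.Message);
        }
    }
}
```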
This is actually a REALLY nasty problem for, say, Adobe PDF. Their PDF viewers were 32-bit only for years; only very recently did they start offering a 64-bit version of Adobe Reader. No doubt they started doing so because Office is now moving towards being a 64-bit, rather than a 32-bit, program.

Related

Running 32 bit in Visual Studio and interacting with Oracle

I am using Visual Studio, Test Complete and C#. I am basing it off another project which also uses the same. This other project uses Oracle to query and works fine.
I based all my settings on the other project (Any CPU, all the same references, etc.). However, when I run it and attempt to connect to Oracle, I get an exception about a bad image (this happens when you run 64-bit and try to load a 32-bit client). I copied all the references and tried every build combination. I do think I am running 64-bit, because when I am debugging and try to insert a line it says that is disallowed in 64-bit mode. In the other application it lets me insert lines during debugging.
Here is the exception:
e {"Attempt to load Oracle client libraries threw BadImageFormatException. This problem will occur when running in 64 bit mode with the 32 bit Oracle client components installed."} System.Exception {System.InvalidOperationException}
I tried replacing the Oracle library, but it always seems to revert. It would be better to run in 32-bit mode, but for some reason the "Prefer 32-bit" option is disabled (it is also disabled in the application that works).
Any idea what to do? This will also, once it runs on my machine, be transferred to another.
This is Visual Studio 2012 (11.061219.00)
Use connection.GetType().Assembly.Location to check which DLL is actually loaded.
Typically (and simplified - see How the Runtime Locates Assemblies), assemblies are loaded either from the GAC (Global Assembly Cache) or from the directory where your .exe is stored. Note that the GAC takes precedence!
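A small sketch of that check - pass in whatever connection object your code already creates (for example the ODP.NET OracleConnection), and it prints where the provider assembly was actually loaded from:

```csharp
using System;
using System.Data.Common;

static class ProviderDiagnostics
{
    // Pass in the connection your code already builds; nothing Oracle-specific is assumed here.
    public static void DumpProviderInfo(DbConnection connection)
    {
        var asm = connection.GetType().Assembly;

        // Shows whether the DLL came from the GAC or from your bin folder.
        Console.WriteLine("Provider type:    " + connection.GetType().FullName);
        Console.WriteLine("Loaded from:      " + asm.Location);
        Console.WriteLine("Assembly version: " + asm.GetName().Version);
        Console.WriteLine("64-bit process:   " + Environment.Is64BitProcess);
    }
}
```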
You can check whether a .exe or .dll file is x86 or x64 with the sigcheck tool.
In Oracle Client 12.1 setup and later, ODP.NET is no longer added to the GAC; you need to add it manually.
You may use my Oracle Connection Tester, which could show you the problem with your installation.

How to connect .net API to MS Access database

Dear experts: I have an existing MS Access database, and I have created a web API using Visual Studio 2019. I am trying to create an ADO.NET Entity Data Model, or even go code-first. The problem is that when I create a new data source, I cannot find the OLE DB option in the data source list.
Any idea how to connect to Access via the API?
Thanks in advance.
Do you have the Access database engine installed?
I would install the Access database engine from here:
https://www.microsoft.com/en-us/download/details.aspx?id=54920
Make sure you choose the correct bit size. If you are running the web server as 32-bit (the default), then install the 32-bit (x86) version of the Access engine. If you are running your .NET application as 64-bit, then install the 64-bit version. Keep in mind that Visual Studio is a 32-bit application, so if you choose Any CPU it will launch your application as 32-bit.
If you force your project to x64, then you need to have the 64-bit version of the Access database engine installed. Keep in mind that running as 64-bit will work for debugging, running, and testing. However, while you can use the connection builders in VS, the final test connection will ALWAYS fail if you are using the 64-bit version of Access, because VS is a 32-bit application. So the test connection button will not work, and if you are going to use the dataset designer (or the newer option, Entity Framework), then you are best off developing with the 32-bit version of Access.
If you are just developing locally, that is not a problem, but most hosted web sites (in fact, nearly all) are 64-bit. If you are not using local IIS Express, then you have to force your project to x64, and testing connections to the database will be a challenge (your code, or debug code, will work, but actual test connections from VS will fail if your project is forced to x64).
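Since the Entity Data Model wizard is not showing an OLE DB option here, one workable approach (a sketch - the file path, table, and column names are made up) is to skip the designer and use plain ADO.NET OLE DB from the web API code:

```csharp
using System.Collections.Generic;
using System.Data.OleDb;

public class CustomerRepository
{
    // Hypothetical path and schema - adjust to your own .accdb file.
    private const string ConnectString =
        @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Northwind.accdb;";

    public List<string> GetCustomerNames()
    {
        var names = new List<string>();

        using (var conn = new OleDbConnection(ConnectString))
        using (var cmd = new OleDbCommand("SELECT CompanyName FROM Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    names.Add(reader.GetString(0));
                }
            }
        }

        return names;
    }
}
```

A Web API controller action can simply return that list; the only thing that changes between 32-bit and 64-bit hosting is which version of the Access database engine has to be installed on the server.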

Why is there no 64-bit version of VS2013?

I downloaded Visual Studio 2013 from DreamSpark but it's the 32-bit version and I couldn't find any 64-bit version. Is there none, and if so why is there no 64-bit version of Visual Studio?
Update (May 2021)
Visual Studio 2022 will ship as a 64-bit build: https://visualstudiomagazine.com/articles/2021/04/19/vs-2022.aspx
Original answer (Dec 2013)
First, there is a 64-bit C++ compiler that comes with Visual Studio tool set. So you can always change your project settings to make 64-bit builds of your app as needed.
Now, to answer the original question.
Think of it from a cost and ROI perspective. From years of shipping software at Microsoft, here's how I've seen the consideration for 64-bit builds get made.
When the 32-bit app works just fine on 64-bit, it's almost a non-starter to consider 64-bit.
Most of the projects at Microsoft aren't simple little Visual Studio projects in which the developer can just flip the Project settings from 32-bit to 64-bit. (I actually don't know if the Visual Studio team compiles Visual Studio with a VS project.) They are often well over a million lines of code that build with the VS compiler set, but from a command line and Makefile environment. Switching to 64-bit means updating a lot of this build infrastructure.
There is a cost of porting from 32-bit to 64-bit. The first cost is just fixing the bugs, getting the code to compile, restructuring the build environment, and all the upfront work just to get the initial build going.
There is an ongoing cost you pay for having separate 32-bit and 64-bit builds of an application. You have to build it twice every day. You have to run the test collateral on it twice every day. It's not a 2x cost, but it's not free either.
With more SKUs from the same code base, it increases the chances that a developer will break something when he checks in. Of course there can be automated tests to prevent this, but it will slow the developer down, since he will have to go back and fix the other SKU that he doesn't have installed locally on his test machine.
Now here are some of the motivations for moving to 64-bit:
You really need to take advantage of 64-bit performance and memory architectures. Large database servers that use as much memory as possible will benefit from accessing more than 2GB limit imposed on a 32-bit Windows process.
You need to integrate with something already compiled with 64-bit. For example, if you want to write a shell extension for Windows, you will need a 64-bit build to run on 64-bit Windows. That doesn't mean the entire app has to be ported, but it does mean this component will need a separate 64-bit build.
You have a platform or API story for external developers to consider. Usually, they have their own needs for 64-bit builds. Hence, they may need a 64-bit ready API from you even if your native app can get away with 32-bit support.
Your team has just been re-organized into the Windows division and your team's code has been deemed necessary to be included into the next Windows release. There's no decision to be made anymore - your code will be compiling for 32-bit, 64-bit, and ARM (Surface RT).
Source code files should not be multiple gigabytes -- there's no reason for a text editor / development environment to use 64-bit pointers, which consume twice as much RAM for no benefit. Larger pointers make data structures containing pointers larger, requiring more memory bandwidth to move them around, and fitting fewer inside the CPU's data cache, so that the number of cache misses may increase as well.
The 32-bit editor is perfectly capable of launching and interacting with the 64-bit compilers, linkers, and debuggers when needed. Having only a 32-bit editor also simplifies the plugin model greatly.
The reason is the same as it has always been. It would require a significant effort to port a code base as large as Visual Studio to 64-bit and according to Microsoft, the benefits would be few and far in between.
In fact, MS claims that such a port could slow down Visual Studio due to the consumption of more memory. There would be poorer cache locality due to 64-bit pointers being stored in various places in the code. There is much code in VS that uses custom arena based allocators, although MS is trying to get rid of them. These could also possibly result in poorer performance, since pointer management within the arena would deal with 64-bit pointers which would occupy twice the space of their current 32-bit counterparts.
Given the tens of millions of lines of code that are Visual Studio, the effort to convert, test and tune a 64-bit version seems fraught with delays while having a seemingly small chance of having a positive outcome. If anything, MS seems more intent on porting Visual Studio to managed code in order to reap the benefits present there - a decision that is hard for us C++ developers to swallow.
For the present term, Microsoft recommends running Visual Studio in a 64-bit version of Windows, thus doubling the available address space (2 GB to 4 GB) without paying a 2x penalty for pointer storage within the VS process.

App created in Visual Studio on XP 32 crashes 64 on access violation. Recompile in 64 bit environment to find bug, works fine. What?

So I've been developing on Windows XP with Visual Studio 2008. I guess it is building my C++ app in 32-bit mode. When I run the program on my new Windows 7 64-bit box, it gets halfway through loading and then throws an access violation error. So I loaded all my development tools and recompiled the project on Windows 7 to find the crash site, but it works perfectly! What? How do I make my app work on x64? Do I have to release two separate versions? I know I can target 64-bit, but I don't like having two separate executables. I've searched but keep getting the two-version solution or everything .NET. This is native C++. Is there an x86 flag somewhere?
"Do I have to release two seperate versions?"
It depends on what your app does. The majority of 32bit apps work just fine in WoW mode.
"is there an x86 flag somewhere?"
Yup. Open up the configuration manager (Alt-B, O) and you will likely see a win32 in the platform selection.
Why your app crashes is going to take some debugging. You should be able to attach the debugger to the 32bit version on the 64bit OS.
Have you considered the possibility that you may be trashing memory? From the way that the application happens to be loaded into memory, it may be that the 32-bit app on 32-bit Windows and the 64-bit app on 64-bit Windows "get lucky", i.e. no important memory locations are overwritten, whereas the 32-bit app on 64-bit Windows is less lucky and crashes.
To find out if you are indeed trashing memory, you can use tools such as Purify and Valgrind.
As the other answers already said, it is very likely that the application has bugs that are being hidden in one environment. A good tool with a low cost of entry in terms of learning curve is Microsoft's Application Verifier. That, along with the debugging tools, can provide a very good start. Also take a look at the gflags utility.

Missing Visual Studio features when running in 64-bit mode

Can someone tell me why I don't have all of the Visual Studio windows available to me when I develop on a 64-bit platform? I upgraded my dev desktop box to Server 2003 x64 to match our deployment platform. Since then (I'm using VS2005) I've noticed that several windows aren't available. I can't view Processes (which is the most annoying), so I don't know which processes I'm attached to. I can attach to a process fine, but it won't show me what is already running under the debugger. There are others, but that's the one that sticks out in my mind at the moment.
My question is where are these limitations of developing under 64 bit documented (assuming they are)? (Of course, I also get the "Edit/Continue" warning dialog all the time telling me that doesn't work in 64-bit)
Also, is VS2008 any better under 64 bit?
Follow-up: Apparently my question is a little bit vague. I'm developing a 64-bit app on a 64-bit development environment. "Recompile it in x86" doesn't solve my problems.
Follow-up #2: I'm giving it one more shot. I WANT TO DEBUG A 64 BIT PROGRAM ON A 64 BIT ENVIRONMENT AND I DON'T HAVE ALL OF THE VISUAL STUDIO FEATURES SHOWING UP. HOW DO I GET THEM?
Follow-up #3: I just installed XP 64 (previously I was using Server 2003 64-bit) and those features all showed up again (Process window, etc). Apparently the server version of windows doesn't provide all of the dev features.
Can anyone tell me why?
"Edit/Continue" can work if you change the build setting to X86 :)
Here was the suggestion from StackOverflow about it.
I had a problem with NUnit when debugging code. The solution was to use the special program in the \bin\ folder, nunit-x86.exe, for old code built as x86, and nunit.exe for x64 builds.