SSIS doesn't save any changes - etl

I have an SSIS project in VS2017, and in it I have a connection manager with connection strings pointing to files on the server. I am in the middle of a migration, so I have to replace the paths to all of the files. But when I replace the old path with the updated path, SSIS doesn't save anything, and I can still see the related errors, even when I execute the package...
What can I do to make SSIS save the updated path?
I added an image here showing which string I replace; after saving, of course, nothing changes. I also tried to update the path through an XML configuration file and through another window, but after executing the package or exiting, the old path was kept.
Please help me.

Related

VS2015 Setup Project not updating Access Database included in package when reinstalling

I'm trying to build a Visual Studio Installer setup project that deploys multiple C# projects and some other files. Among these other files are Access databases with forms that need updates.
To illustrate the problem, I simplified it:
1- Create a new Access database file, add a simple form to it with a button and a label, and save it.
2- Add the file to the setup project.
3- Set DetectNewerInstalledVersion and RemovePreviousVersion to true.
4- Build the project.
5- Run the setup executable.
Up to that point, everything has worked fine.
6- Reopen the Access database file, add a button or a label to the form, and save it.
7- Change the Version number of the setup project, and at the same time the ProductCode, as suggested by VS2015.
8- Rebuild the setup project.
9- Reinstall the software.
Expected: the Access database should have been updated with the new button/label.
What is happening: the file hasn't been updated.
Why is that? I've seen people talking about the Assembly version number of projects included in a setup project, but that's not my case, since I'm not deploying the output of a project. I'm simply deploying a file that should have been removed during the uninstall process.
If I do the exact same steps as described before but with a text file to which I add text, it works fine, but for some reason it does not work with an Access database.
What's wrong ?
If you install a data file and then run a program that updates it, you've added user data to that file or database. The file overwrite rules don't allow a modified data file to be replaced in the kind of update that VS setups do:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa370531(v=vs.85).aspx
Basically, it would be a bad idea to ship a product that installs a database that the user fills with potentially large amounts of data, only to have the new version of the product delete the entire database. It's not clear to me how your app deals with updates (do you want that added button/label to be completely lost, or do you save them in some way?), so recommending a solution is difficult. Maybe you need an uninstall custom action to delete the database, or, on an upgrade, you add the updates to the existing DB instead of removing it and starting again.
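If the uninstall custom action route is taken, a minimal sketch of what that could look like in a Visual Studio setup project is below. This is only an assumption-laden illustration: the Installer-class custom action, the Frontend.accdb file name, and the /targetdir parameter are placeholders I chose, not details from the question.

    using System.Collections;
    using System.ComponentModel;
    using System.Configuration.Install;
    using System.IO;

    // Hypothetical sketch: an Installer-based custom action that removes the
    // Access database on uninstall, so the next install lays down a fresh copy.
    // File name and target directory are placeholders, not from the question.
    [RunInstaller(true)]
    public class RemoveDatabaseOnUninstall : Installer
    {
        public override void Uninstall(IDictionary savedState)
        {
            base.Uninstall(savedState);

            // TARGETDIR must be handed to the custom action, e.g. by setting
            // CustomActionData to /targetdir="[TARGETDIR]\" on the Uninstall node.
            string targetDir = Context.Parameters["targetdir"];
            string dbPath = Path.Combine(targetDir ?? string.Empty, "Frontend.accdb");

            if (File.Exists(dbPath))
                File.Delete(dbPath);
        }
    }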
I found a solution. The file wasn't updating because the modified date had changed.
From the Microsoft documentation:
Nonversioned Files are User Data—If the Modified date is later than the Create date for the file on the computer, do not install the file because user customizations would be deleted. If the Modified and Create dates are the same, install the file. If the Create date is later than the Modified date, the file is considered unmodified; install the file.
Since I had two Access databases (front-end with the forms, backend with the data tables) and needed only one to be updated (the frontend where the forms are), here's the workaround:
1) Change the REINSTALLMODE property to amus instead of the default omus. This forces the reinstallation of all files. To do that, I used a PostBuildEvent as explained here (a sketch of one way to do this follows the list).
2) Set the backend file property Permanent to true
3) Add a Launch Condition: Search Target Machine to check if the backend file exists on the computer. Name it something like BACKENDEXISTS.
4) Add a Condition value to the backend file in the File System view to install the file only if it hasn't been found by the Launch Condition. In this case, the condition is NOT BACKENDEXISTS. If this is a first install, the file will be installed because it hasn't been found. If this is an update, the search will find the file (it is still there thanks to the Permanent property) and will not replace it.
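The link behind "as explained here" isn't reproduced above, but one way step 1 can be wired up is a small post-build tool that patches the built MSI's Property table. The sketch below is a guess at such a tool; it assumes the WiX Deployment Tools Foundation assembly (Microsoft.Deployment.WindowsInstaller) is available, and the MSI path comes in as a command-line argument.

    using Microsoft.Deployment.WindowsInstaller; // WiX DTF, assumed to be installed

    // Hypothetical post-build helper: insert REINSTALLMODE=amus into the MSI
    // Property table so that every file is reinstalled, as described in step 1.
    class SetReinstallMode
    {
        static void Main(string[] args)
        {
            // args[0] is the path to the freshly built .msi (placeholder).
            using (var db = new Database(args[0], DatabaseOpenMode.Direct))
            {
                db.Execute(
                    "INSERT INTO `Property` (`Property`, `Value`) VALUES ('REINSTALLMODE', 'amus')");
                db.Commit();
            }
        }
    }

It would be invoked from the setup project's PostBuildEvent, passing the path of the freshly built .msi.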

Visual Studio 2010 file could not be found in your workspace issue

I'm getting this message when connected to a TFS repository and trying to edit a local .sql file that's not yet in the repository.
The item C:\bla\blabla\blablabla\USP_BLA.sql could not be found in your workspace, or you do not have permission to access it. No items were checked out
If I disconnect from the TFS repository, everything is fine. If I re-connect to the TFS repository, I start getting the message again, even when just trying to type in or save the file.
The file is actually there in my local folder and I have the corresponding permissions, for those who are thinking otherwise. It is just a text file that I should be able to edit irrespective of whether I'm connected to the repository or not.
Windows 7 SP1 was installed on my machine this morning; I don't know if that could be the cause of the issue. We are using the .NET Framework 4.
Any idea of what the issue is or how to resolve it?
It turns out the issue was with one of the "facts" in my question, "file that's not yet in the repository".
A file with the same name as the one I had locally was already in the TFS repository; it had been checked in a month ago by someone else.
Only when I tried to check in the new local copy did I become aware of that; the file was not supposed to be there.
Anyway, after checking out the file from TFS, everything went back to normal; I'm able to edit/save the file without issues, even while connected to TFS.
Try to get the latest version of your source code; it should resolve the issue. Or use Get Specific Version and allow it to overwrite existing files. If that still does not resolve it, make sure the file is present in your local directory.
Do not forget to rebuild after getting the latest code.

Seriously, overriding the DefaultDataPath in the sqlcmdvars for a SQL Database project deployment

I have a SQL 2008 database project in Visual Studio 2010 that is synced on a regular basis from a schema comparison during the development phase. This same project is also under TFS source control. I have two environments, Debug and Production. Each environment is a single machine that runs both IIS and SQL Server. The production environment, however, has different data and log paths for the database, D:\Data\ and E:\Logs\, versus my development server at the standard c:\program files\sql....\data.
What I'm trying to do is set up the way I carry out my deployments from the Debug to the Production environment. I've got WebDeploy 2.1 set up, and I build my deployment packages in Visual Studio via the right-click context menu on the website project. I want to manually copy deployment packages to the production server via RDP, so there are no over-the-wire concerns here. The deployment package settings are set to include all databases configured in the Package/Publish SQL tab. In the Package/Publish SQL tab I don't pull data/schema from an existing database, because I want to deploy from the SQL database project instead. So I just point to the pre-generated .sql script file located in my database project's /sql/release folder. To top it off, I generate the .sql script in the post-build events in the SQL project via VSDBCMD.exe /dd:- /a:Deploy /manifest:... so that a simple solution Rebuild All followed by a website project Deploy ensures I always have the latest .sql script in the deployment package.
This is great and all, but I have a major problem here I can't seem to overcome. It has to do with the database data and log file paths being different between the Debug and Production environments. I actually receive an exception during the WebDeploy in IIS on the production server saying it can't find the c:\program files...\MyDatabase.mdf file. And what's scary is that after this exception, the entire database is deleted, i.e. the empty database I create right before doing the deployment. This happened both times I tried messing around with it. I'm not sure how I feel about that, but I'm hoping I can find a reliable solution to this.
I have been feverishly looking for a way to change the paths during a deployment, and I have found many places that mention changing the paths in the *.sqlfiles.sql files under Schema Objects\Database level objects\Storage\Files, because the path it tries to deploy to is the one specified in those files, written there by the schema comparison against the Debug SQL Server database. Changing the paths here works temporarily, until I do my next schema comparison and write; then the .sqlfiles.sql files get overwritten with the info from the Debug database again. And I don't want to have to remember to never update these files during a schema comparison, because any mistake has the potential to delete the production database.
I think my salvation lies in my Release.sqlcmdvars file. It's a tease, actually: I can see a place where I "could" type the default database path, but it appears to be a read-only field, as it mentions "Location where database files are created by default (set when you deploy)." It would be grand if I could specify the paths here. Is there any way at all to specify the path in a variable here that would override the paths from the *.sqlfiles.sql files?
In the solution used where I work, there are two custom variables in the sqlcmdvars called Path1 and Path2 that I thought were reserved names that do exactly that. However, this doesn't work in my solution, and the difference between the two solutions is that the other one gets deployed via a TFS build controller. Going the TFS build controller route isn't really an option, because I opted out of it to save money while using a third-party source control service.
Any help with this would be great. I have even gone so far as to create separate *.sqlfiles.sql files for debug and release and configured the dbproj file to use one or the other depending on the configuration, but this doesn't seem to be working either. Also, using the custom PATH1 variable in the sqlfiles.sql file, like FILENAME = '$(PATH1)\Cameleon_log.ldf', doesn't work either. I seriously think it shouldn't be this difficult. Am I missing something simple here?
Thanks!
Okay, this was an exercise in futility. Apparently, without syncing with the target database during script generation, the script is exactly what is needed to build the database from scratch. Even if I could override the file paths, the deployment would complain about database objects already existing. I needed to specify the connection string of the target database in the deploy settings so that a comparison is done during script generation and only the relevant differences are added to the script. I really wanted to avoid exposing my production SQL Server to the outside world, but it is what it is. No need to override the paths anymore, because it looks like the database file paths are conveniently ignored during this comparison!!
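For reference, a hedged sketch of what that deploy-time invocation might look like with the target connection string supplied; the /cs value, server name, and database name below are assumptions, not details taken from the question:

    VSDBCMD.exe /a:Deploy /dd:- /manifest:MyDatabase.deploymanifest /cs:"Data Source=PRODSERVER;Integrated Security=True" /p:TargetDatabase=MyDatabase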

Visual Studio setup creation: Conditional installation of a file based on file search fails

I created setup 7.7.0 using Visual Studio 2010. The setup installed all the files correctly. Now I am creating setup 7.8.0. The Upgrade Code for both 7.7.0 and 7.8.0 is the same, and the Product Code is different.
In the Launch Conditions Editor, I added a search condition FILEEXISTS1 that searches for a file, d.xml, in a particular location on the system. In the File System Editor, I added the condition "NOT FILEEXISTS1" on the d.xml file, so that if the file is already present, d.xml is not installed.
My problem is that having this condition removes the d.xml file altogether when 7.8.0 is installed. However, if I just have the search condition FILEEXISTS1 that searches for the file, but does not evaluate it (meaning I do not have the condition property NOT FILEEXISTS1 evaluated on d.xml), then the file is not overwritten.
I am confused by this behavior. Am I missing something here?
Does anyone know why this happens? Any help would be greatly appreciated.
A major upgrade automatically uninstalls the old version before installing your new one. This means that the old file is removed and a new one is installed.
So conditioning your new file is not a good approach for preserving the original one.
To determine if the new file is installed, Windows Installer uses the file versioning rules for the component key path. Here is an article with more details: http://setupanddeployment.com/windows-installer-bugs/missing-files-upgrade
A solution for preserving your old file is to create a backup before the upgrade starts and restore that backup after the upgrade finishes. This can be done through custom actions. Perhaps this will help: http://setupanddeployment.com/installer-concepts/preserve-data-install
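As a rough illustration of that backup/restore idea in a Visual Studio setup project, an Installer-class custom action could copy the file aside before the new files go down and copy it back afterwards. This is only a sketch under assumptions: the paths are placeholders, and whether OnBeforeInstall/OnAfterInstall fire at the right points relative to RemoveExistingProducts in your upgrade sequence still needs to be verified.

    using System.Collections;
    using System.ComponentModel;
    using System.Configuration.Install;
    using System.IO;

    // Hypothetical sketch of the backup/restore approach from the linked article:
    // preserve d.xml across a major upgrade by copying it aside before install and
    // putting it back afterwards. Paths are placeholders.
    [RunInstaller(true)]
    public class PreserveDataFile : Installer
    {
        private const string DataFile = @"C:\MyApp\d.xml";       // placeholder
        private const string BackupFile = @"C:\MyApp\d.xml.bak"; // placeholder

        protected override void OnBeforeInstall(IDictionary savedState)
        {
            base.OnBeforeInstall(savedState);
            if (File.Exists(DataFile))
                File.Copy(DataFile, BackupFile, overwrite: true);
        }

        protected override void OnAfterInstall(IDictionary savedState)
        {
            base.OnAfterInstall(savedState);
            if (File.Exists(BackupFile))
            {
                File.Copy(BackupFile, DataFile, overwrite: true);
                File.Delete(BackupFile);
            }
        }
    }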

Problem with an MSI distribution

So I am continuing testing and releasing changes to my app and I have come across a pain point that I am unsure how to deal with.
First off, my app uses a SQL Server CE database to store information, and I need to be able to make changes to this db, so I've created an internal update process that runs whenever the application starts to make sure the db is up to date.
The crux of this internal update process is another SDF file named DBUpdates.sdf that contains all of the db schema changes that need to be applied.
The problem I am having is that the MSI distribution I created will not overwrite this file. It appears that when SQL Server CE opens this file, it changes the Modified date/time of the file. This is a flag to the MSI process that the file has changed, and that it shouldn't overwrite the file. Well now I am seeing that my db changes aren't being applied, because the MSI process thinks the user has changed this file.
At this point I am kind of stumped. I was planning on using an MSI distribution but maybe I can't. What do you think?
What about storing your .sdf as an embedded resource in your executable, and then extracting it to a temporary location on disk (as necessary) and performing the updates from there?
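A minimal sketch of that idea, assuming DBUpdates.sdf is added to the project with its Build Action set to Embedded Resource; the resource name (which depends on the project's default namespace) and the temp location are placeholders:

    using System;
    using System.IO;
    using System.Reflection;

    // Hypothetical sketch: extract the embedded DBUpdates.sdf to a temp path on
    // each run, so the update database no longer ships as a separate MSI file
    // whose Modified date can trip the file overwrite rules.
    static class UpdateDatabaseExtractor
    {
        public static string ExtractToTemp()
        {
            string target = Path.Combine(Path.GetTempPath(), "DBUpdates.sdf");

            Assembly asm = Assembly.GetExecutingAssembly();
            // Resource name is "<DefaultNamespace>.<FileName>" by default.
            using (Stream resource = asm.GetManifestResourceStream("MyApp.DBUpdates.sdf"))
            {
                if (resource == null)
                    throw new InvalidOperationException("Embedded DBUpdates.sdf not found.");

                using (FileStream file = File.Create(target))
                    resource.CopyTo(file);
            }
            return target; // open this copy with SQL Server CE and apply the updates
        }
    }

The application would then open the extracted copy with SQL Server CE and apply its schema updates from there.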
Unversioned files can be a bit difficult to handle with MSI if you need to force the installation of the file. See this previous question for some ideas: How to add a version number to an Access file in a .msi.
The question contains a link to this blog post, http://blogs.msdn.com/astebner/archive/2005/08/30/458295.aspx, which suggests the way I prefer to deal with this problem: make the .sdf file part of your executable's component. The downside is that if someone deletes the .sdf file but not your executable, I don't think a repair of the application will catch it. If you're using Visual Studio to create your MSI files, this may prove a difficult solution to implement; in that case I strongly suggest you check out WiX. It is a better MSI build system.
