In Visual Studio 2012 there is a new project type called SQL Server Database Project which, when compiled, produces a .dacpac output file.
Googling the term links it to Data-tier Application, but I cannot map that name onto the letters D, A, C, P, A and C.
What does dacpac stand for?
In the article 'It’s Data Tier Application and Data Application Component' from December 23, 2009, Microsoft's Buck Woody states that DAC stands for:
Data Application Component
He also talks about the result as a package, so it seems reasonable to surmise:
The PAC part probably refers to package.
This is the relevant quote:
OK – In SQL Server 2008 R2 we did “re-use” an acronym or two (DAC and
DTA), but it’s important to remember there are actually two parts to
this new feature. One is the Data Application Component (DAC) and the
other is the Data Tier Application (DTA). The DAC is the file created
for a DTA.
In SQL Server 2008R2 and Visual Studio you’ll find there is a new way
to write and transfer database code. I’ll blog about it more as I
finish my testing, but the process works kind of like this…
You can “birth” a Data Tier Application in two places. You can create
a new project type in Visual Studio where the developer can create the
database structure, put all of the policies that they want to enforce
and so on there. The DBA can also right-click a database and make a
Data Tier Application out of a current structure.
In both cases, something called a DAC – or Data Application Component
– is created. It’s a file with the payload of the major parts of
the structure of the database and so on. That’s the “package” you use
to transfer the DTA around.
From there, you right-click in the “Data Tier Application” node in SQL
Server Management Studio on another Instance and “Deploy” the Data
Tier Application. It will build the database for you and keep it
together as a DTA. You can make changes in the “originating” system or
code, and then “upgrade” the Data Tier Application.
So there you have it. It’s DTA and DAC – but I think you’ll know the
difference when the time comes to use one…
This InfoWorld article suggests a slight variation on the above, stating that DACPAC stands for:
Data-tier Application Component PACkages.
DAC stands for Data-tier Application
PAC stands for package
So to my mind, DACPAC stands for Data-tier Application package.
First off: read this.
A dacpac is a data-tier application package. More concretely, it is the file that gets created when you build a SQL Server Database Project. It contains all of the information necessary to create the database objects defined in your SQL database project.
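For a concrete picture of what the file is for, here is a minimal sketch of deploying a .dacpac programmatically with the DacFx client library (Microsoft.SqlServer.Dac). The file path, database name, and connection string are placeholder assumptions; SSMS and SqlPackage.exe do the same job through their own front ends.

using Microsoft.SqlServer.Dac;

class DeployDacpac
{
    static void Main()
    {
        // "MyDatabase.dacpac" and the connection string below are assumed values.
        using (var package = DacPackage.Load(@"bin\Debug\MyDatabase.dacpac"))
        {
            var services = new DacServices(@"Data Source=.;Integrated Security=True");

            // upgradeExisting: true upgrades a previously deployed data-tier
            // application instead of failing when the database already exists.
            services.Deploy(package, "MyDatabase", upgradeExisting: true);
        }
    }
}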
Related
I currently have a very large .NET solution in Visual Studio 2017 containing 12 different projects. There is a single data access layer project that is utilized by most of the other 11 projects which consist of websites, class libraries, windows services, unit tests, etc.
One of the databases used in my DAL project changes its primary host nearly every week (completely outside of my control). Accommodating this change requires me to redeploy code at irregular intervals, outside our standard deployment lifecycle.
Is it possible to have a single external file that is read at application start and used as the hostname in the Entity Framework connection string? I've tried splitting all of the EF connection strings out of the app/web.config files into their own ConnectionStrings.config, but Entity Framework throws a metadata error each time, or says that it cannot find the CSDL/SSDL/MSL files.
What I envision is simply updating a single text/config file with the adjusted database server name, then restarting each service/website/console app/etc., without having to truly deploy new code.
Any help would be greatly appreciated!
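One approach that fits what the question describes is building the entity connection string at startup instead of reading it from web.config. This is only a minimal sketch under assumed names: the model name MyModel, the catalog MyDatabase, and the file dbhost.txt holding the server name are all placeholders.

using System.Data.Entity.Core.EntityClient; // System.Data.EntityClient in older EF versions
using System.Data.SqlClient;
using System.IO;

static class EfConnectionFactory
{
    public static string Build()
    {
        // The external file contains nothing but the current server name.
        string host = File.ReadAllText("dbhost.txt").Trim();

        var sql = new SqlConnectionStringBuilder
        {
            DataSource = host,
            InitialCatalog = "MyDatabase",
            IntegratedSecurity = true
        };

        var entity = new EntityConnectionStringBuilder
        {
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = sql.ToString(),
            // These resource paths must match the EDMX resource names baked into
            // your assembly; this is what the CSDL/SSDL/MSL error is about.
            Metadata = "res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl"
        };

        return entity.ToString();
    }
}

The resulting string can be passed to a context constructor that accepts a connection string (or wrapped in an EntityConnection), so a restart after editing dbhost.txt picks up the new host without a code deployment.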
I am writing a "framework" of SQL Server (targeting 2008 R2 and 2012) stored procedures and common/reference tables. Each SSDT/Data-tier project represents a different component (which can live within the same database) within this framework (e.g., the MasterDBExtensions project has stored procedure extensions/add-ons within the master database, SQLServerAgentExtensions has them in msdb, etc.).
I developed a fairly strict security model based heavily on schemas, database roles, and certificates.
My issue is how to "share"/copy these certificates between the active project and the referenced projects so that when I publish the active project to SQL Server the certificates are copied properly, etc. (I need to share/copy the certificates for cross-database object access, so that I do not need to turn on the Trustworthy flag, for Service Broker, and also for linked server access.)
Just for clarification, here is some T-SQL code that represents what I mean:
use [DatabaseA]
-- create the certificate in the source database and export its public key
Create Certificate [MyCertFromA] ...
Backup Certificate [MyCertFromA] to File = 'MyCertFromA.cert'
use [DatabaseB]
-- import the exported certificate into the other database
Create Certificate [MyCertFromA] from File = 'MyCertFromA.cert'
SSDT/Data-tier will NOT allow me to place the Backup and Create ... From statements in the project (I get a "This statement is not recognized in this context" error). If I move the Backup Certificate into a pre/post-deployment script I run into file permission issues and other problems (e.g., certificate NOT found). Besides, the pre/post scripts do not run if I use the project as a database reference in other projects.
So... what am I doing wrong, or does anyone have any suggestions for working around these issues?
Thanks!
Environment: SQL Server 2012 (I also target SQL Server 2008 R2), MSVS 2010, SQL Server Data Tools (December 2012), SQL Server Data-Tier Application Framework (May 2013), C# 4.0
Looks like this has been somewhat answered by CTP4 Certificate issue - sometimes server, sometimes database scoped
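As one possible workaround (only a sketch, not something SSDT does for you): since pre/post-deployment scripts of referenced projects do not run, the certificate copy can be scripted in a small helper that runs after publishing. The database names, certificate name, and file path below are just the placeholders from the T-SQL above, and the file path must be writable by the SQL Server service account because BACKUP CERTIFICATE writes the file server-side.

using System.Data.SqlClient;

class ShareCertificate
{
    static void Main()
    {
        using (var conn = new SqlConnection(@"Data Source=.;Integrated Security=True"))
        {
            conn.Open();

            // Export the certificate's public key from the source database...
            Execute(conn, @"USE [DatabaseA];
                            BACKUP CERTIFICATE [MyCertFromA] TO FILE = 'C:\Temp\MyCertFromA.cert';");

            // ...and import it into the referencing database for cross-database access.
            Execute(conn, @"USE [DatabaseB];
                            CREATE CERTIFICATE [MyCertFromA] FROM FILE = 'C:\Temp\MyCertFromA.cert';");
        }
    }

    static void Execute(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn)) cmd.ExecuteNonQuery();
    }
}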
I've always been intrigued by Visual Studio Database Projects, and while they seem to be quite capable, I've never used them to any great degree outside of simplistic proof-of-concept work. I want to try this for a new project, and I'm also interested in using an EF layer on top of it, but in past test projects this has involved some decent effort.
I'm curious: has Visual Studio matured its product integration to support a single workflow that builds the database project, builds the EF layer on top of it, and finally builds the code, without intermediate steps involved?
We are a small team and we don't have dedicated SQL developers. Our primary goal is to bring the database into Visual Studio, get it nicely under source control (TFS), and achieve strong integration from end to end. We're interested in growing into EF, and will probably start by treating it as a simple ORM tool if possible.
Has anyone actually done this that can provide insight into the process?
We have used VS2014; the tooling seems much the same as in earlier versions.
I don't think there have been many changes over the years.
We have an EDMX model and a DB project in the solution.
It does mean that you need to keep the DB project up to date.
But this is easy to do: you just publish your EDMX to a local box/target,
then import the changes with a schema compare of the local database against the project.
That way you can still have model-driven DB design
and use the DB project to deploy changes to the dev/stage/live boxes,
and you can publish with automated deployments as well.
The DB project has a post-build scripts option
which you can use for seed data,
and also a pre-build script where you can do DB manipulation if you need to change the structure or types of fields when the data is on a live DB.
The schema compare tools in Visual Studio are rather good;
they can compare DB to DB, DB to project, or a schema file to either.
I have been asked to create an MVC web application in VS 2010, and was instructed to use a SQL express database for my data. I am using EF Code-First for creating and managing my data. The database was created in VS2010, and is attached via "AttachDBFilename" in the web.config.
I have used SQL CE before with MVC with no problems, however the attached SQL Express DB is causing weird issues.
For one thing, when I try to deploy the app, it fails and tells me that it cannot copy the database.mdf because it is in use by another process. I have NOT opened the database in VS2010 nor SSMS. Of course the program code accesses it - is there some reason that connection would remain open? I am using boilerplate code from the scaffolding.
I should mention that I use a ProjectInitializer.cs to create the sample data. It runs at every launch for the moment, since I am testing quite a bit.
The other problem I have is that if I delete the database, it fails to recreate it. It says that my Windows account does not have access to the (now non-existent) database that it is trying to create. I literally have to create a new database with a new name, as anything that was created previously (with that DB name) fails.
I assume there is some sort of residual info being left somewhere that is out of sync, but I don't know what it is. I've closed all connections to the file in VS 2010 and deleted the files, both any found via VS2010 and any physical files I see in the App_Data directory.
Any help or suggestions would be appreciated.
Shut down the web server (Cassini, IIS, IIS Express) and try again. The file can remain locked if the web process is still referencing it. In addition, the loaded EF context will retain the DB name. Ensure the Visual Studio development server isn't still running in the system tray either.
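A minimal sketch of the start-up side of this, under assumed names (BlogContext stands in for the question's context, and DropCreateDatabaseAlways stands in for whatever ProjectInitializer does). The point is that pooled connections keep the attached .mdf locked even after the context is disposed, so clearing the pools releases the file without hunting for the process that holds it.

using System.Data.Entity;
using System.Data.SqlClient;

public class BlogContext : DbContext { }

public static class DatabaseBootstrap
{
    public static void Run()
    {
        // Recreate and reseed the database on every launch, roughly what the
        // question's ProjectInitializer appears to do while testing.
        Database.SetInitializer(new DropCreateDatabaseAlways<BlogContext>());

        using (var db = new BlogContext())
        {
            db.Database.Initialize(force: false);
        }

        // Release pooled connections that would otherwise keep database.mdf open.
        SqlConnection.ClearAllPools();
    }
}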
I usually create a solution folder in Visual Studio and put my DB scripts in them. I always use at least this set of scripts:
Drop model
Create model script
User functions
Stored procedures
Static data (lookup tables)
Test data (not deployed)
Then I simply combine these scripts into a single one and run it against SQL Server, so I can recreate the whole DB in a single step.
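For illustration only, here is a minimal sketch of that combine-and-run step under assumed file names and connection string; it splits each file on GO lines because GO is a client-side batch separator that SqlCommand does not understand.

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

class RebuildDatabase
{
    static void Main()
    {
        string[] scripts =
        {
            @"Scripts\01_drop_model.sql",
            @"Scripts\02_create_model.sql",
            @"Scripts\03_user_functions.sql",
            @"Scripts\04_stored_procedures.sql",
            @"Scripts\05_static_data.sql"
        };

        using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
        {
            conn.Open();
            foreach (string path in scripts)
            {
                // Split each script on lines containing only GO (case-insensitive).
                foreach (string batch in Regex.Split(File.ReadAllText(path),
                                                     @"^\s*GO\s*$",
                                                     RegexOptions.Multiline | RegexOptions.IgnoreCase))
                {
                    if (string.IsNullOrWhiteSpace(batch)) continue;
                    using (var cmd = new SqlCommand(batch, conn))
                    {
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
        Console.WriteLine("Database rebuilt.");
    }
}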
Anyway. I've never used projects in either:
Visual Studio or
SQL Management Studio
I've tried creating SQL Server 2008 Database Project in Visual Studio 2010, but I'm somehow overwhelmed by all the possible server settings (which I prefer to stay default as set on the server anyway). So I'm a bit confused: Should I use this project template or should I just do the same thing I always did?
What do you use and why? What are advantages I may benefit from by using either?
If I were you I would continue to do it the way you are doing it. In fact, I do! The advantages of having the actual .sql files right there in a folder for you to use/edit/look at are, in my opinion, far better than the advantages you get by using a DB project. A DB project would be used if you were doing something like storage reports, where you have to communicate with, say, 8 databases, compare them to 8 different databases, save result sets, etc. Now don't get me wrong, there are advantages to Database Projects; I just don't think they add much when you have such a simple setup that already works.
Advantages of the SQL Server 2008 Database Project in VS10:
Not having to switch back and forth from the client you currently use to communicate with your SQL Server.
Decent data and schema compare tools.
Gives you a one-click way to reverse engineer a database into source control, and keep it up to date.
You can compare projects to physical databases and vice versa. (This makes it pretty easy to keep your database up to date, no matter where you make changes: in the file-system database project or in the physical database itself.)
If the current tool you're using is not specifically tailored to SQL Server, this one is.
Extremely helpful if you need to do unit tests directly on the database without using abstractions.
If you're looking for something a little less complicated, you might want to try SQL Source Control. This won't even require you to maintain scripts, as it does this for you behind the scenes. It will, however, only work for you if you use either TFS or SVN. And it costs $295...
It has a 28-day trial period, so if you're happy to try it out, I'd be interested in your feedback.