sql server - Can a DACPAC be used for managing databases with a large volume of data?


Our current database is about 200 MB. Once the application goes live, we expect it to grow to a large volume, perhaps 20-30 GB of data.

We are planning to use a DACPAC (generated from a database project in SSDT) to deploy to the production server. The database contains a number of tables and lots of initial data in lookup tables.

However, our concern is about future deployments, when the DACPAC (generated from the SSDT database project) will be used to upgrade the database on the production server.

As we have no past experience with DACPAC deployments, can you please advise on the following:

  1. Does deployment time depend on the volume of data, or only on the schema changes? For example, if the target database is 20-30 GB, approximately how long could an upgrade take?
  2. How can we version the database schema?
  3. Can the upgrade process be rolled back if something goes wrong?

And lastly, is this better than the traditional way of manually writing SQL scripts to upgrade the database?

  1. From experience, the volume of data does have an impact when deploying a DACPAC. How much the time increases depends on the changes being applied by the DACPAC across the database. My advice here is to try and test against larger volumes of data to gauge the increase in time; it may be minimal. (The script-generation sketch further below is one way to measure this against a restored copy of the data.)

  2. All of our objects are stored within a SQL Server Data Tools (SSDT) Visual Studio project, which is version controlled within TFS. When we need a build based on additional check-ins, we create a new version.

  3. This can depend on the type of updates being applied, and on whether you wish to invest the time in understanding what it would take to roll back each schema update (see the sketch after this list for publish options that help here).
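As a concrete illustration of points 2 and 3, the publish step can be run with options that back up the target before making changes, refuse potentially destructive changes, and register the DACPAC version on the server. This is a minimal sketch of a SqlPackage.exe call from PowerShell; the server, database, and file names are placeholders, and the option set should be checked against your SqlPackage version.

    # Publish the DACPAC to production with some common safeguards.
    # Placeholders: MyDatabase.dacpac, PRODSERVER, MyDatabase.
    #   BackupDatabaseBeforeChanges  - take a backup first, giving a rollback point
    #   BlockOnPossibleDataLoss      - abort if a change would drop or truncate data
    #   RegisterDataTierApplication  - record the DACPAC version on the server
    SqlPackage.exe /Action:Publish `
        /SourceFile:"MyDatabase.dacpac" `
        /TargetServerName:"PRODSERVER" `
        /TargetDatabaseName:"MyDatabase" `
        /p:BackupDatabaseBeforeChanges=True `
        /p:BlockOnPossibleDataLoss=True `
        /p:RegisterDataTierApplication=True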

I am using DACPACs and find it useful to have all SQL objects hosted within one Visual Studio project. Going the manual way increases the chance of forgetting to include one or more patches, given the number of changes required. If you still want a reviewable script, as in the manual approach, SqlPackage can generate one from the DACPAC before deploying (see the sketch below).
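For gauging how long an upgrade of a 20-30 GB target might take (point 1) without touching production, SqlPackage can also produce the upgrade script or a deployment report from the DACPAC against a restored copy of the database. A minimal sketch, again with placeholder server, database, and file names:

    # Generate the T-SQL upgrade script that a publish would run, without executing it.
    SqlPackage.exe /Action:Script `
        /SourceFile:"MyDatabase.dacpac" `
        /TargetServerName:"TESTSERVER" `
        /TargetDatabaseName:"MyDatabase" `
        /OutputPath:"upgrade.sql"

    # Or produce an XML report summarising the operations the deployment would perform.
    SqlPackage.exe /Action:DeployReport `
        /SourceFile:"MyDatabase.dacpac" `
        /TargetServerName:"TESTSERVER" `
        /TargetDatabaseName:"MyDatabase" `
        /OutputPath:"deploy_report.xml"

Running the generated script or report against a test restore of the production database gives a realistic idea of both the duration and the nature of the changes before the real deployment.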

