SQL Server - can a dacpac be used for managing databases with a large volume of data?
Our current database is about 200 MB. Once the application goes live, we expect it to grow to a large volume, perhaps 20-30 GB of data.
We are planning to use a dacpac (generated from an SSDT database project) to deploy to the production server. The database contains a number of tables and a lot of initial data in lookup tables.
However, our concern is about future deployments, when the dacpac will be used to upgrade the database on the production server.
As we have no past experience with dacpac deployments, could you please advise on the following:
- Does the deployment time depend on the volume of data, or only on the schema changes? For example, if the target database is 20-30 GB, approximately how long could an upgrade take?
- How can we version the database schema?
- Can the upgrade process be rolled back if something goes wrong?
And lastly, is this better than the traditional approach of manually writing SQL scripts to upgrade the database?
From experience, the volume of data does have an impact when deploying a dacpac. The increase in time depends on the changes being applied by the dacpac across the database. My advice here is to try and test against larger volumes of data to gauge the increase in time; it may be minimal.
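As a rough sketch of how to gauge this (the server name, database name, and file paths below are placeholders), you can have SqlPackage generate the upgrade script against a restored copy of production, then review and time it to see which operations touch the large tables, for example table rebuilds that move data:

    REM Generate the upgrade script only (nothing is applied to the target),
    REM so it can be reviewed and timed against a restored copy of production.
    SqlPackage.exe /Action:Script ^
      /SourceFile:"MyDatabase.dacpac" ^
      /TargetServerName:"TEST-SQL01" ^
      /TargetDatabaseName:"MyDatabase" ^
      /OutputPath:"upgrade_preview.sql" ^
      /p:BlockOnPossibleDataLoss=True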
All our objects are stored within a SQL Server Data Tools (SSDT) Visual Studio project, which is version controlled within TFS. When we need a build based on additional check-ins, it creates a new version for us.
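As a sketch (server and database names are placeholders), the dacpac carries the version number set on the SSDT project ("Dac Version" in the project properties), and the publish step can record that version on the server by registering the database as a data-tier application:

    REM Publish the dacpac and register it as a data-tier application, so the
    REM deployed schema version is recorded and visible in msdb.dbo.sysdac_instances.
    SqlPackage.exe /Action:Publish ^
      /SourceFile:"MyDatabase.dacpac" ^
      /TargetServerName:"PROD-SQL01" ^
      /TargetDatabaseName:"MyDatabase" ^
      /p:RegisterDataTierApplication=True ^
      /p:BlockWhenDriftDetected=True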
This can depend on the type of updates you are applying, and on whether you wish to invest the time in understanding what it would take to roll back each schema update.
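A minimal sketch of one common approach (names and paths are placeholders): take a copy-only backup for the data and extract the current schema as a dacpac before publishing, so there is something to restore or re-deploy if the upgrade goes wrong:

    REM Capture a rollback point before the deployment.
    sqlcmd -S PROD-SQL01 -Q "BACKUP DATABASE [MyDatabase] TO DISK = N'D:\Backups\MyDatabase_predeploy.bak' WITH COPY_ONLY, COMPRESSION"
    REM Optionally keep the current schema as a dacpac so the previous schema
    REM version can be re-deployed if only the schema needs reverting.
    SqlPackage.exe /Action:Extract ^
      /SourceServerName:"PROD-SQL01" ^
      /SourceDatabaseName:"MyDatabase" ^
      /TargetFile:"MyDatabase_previous.dacpac"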
I am using dacpacs and find it useful to host all the SQL objects within one Visual Studio project. Going the manual way increases the chance that you forget to include one or more patches, given the number of changes required.
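For example (placeholders again), SqlPackage can produce a report of every change it would apply, which is effectively the full diff between the project and production and is harder to get wrong than a hand-maintained list of patch scripts:

    REM Produce an XML report of all changes the dacpac would apply to the
    REM target, for review before the production deployment.
    SqlPackage.exe /Action:DeployReport ^
      /SourceFile:"MyDatabase.dacpac" ^
      /TargetServerName:"PROD-SQL01" ^
      /TargetDatabaseName:"MyDatabase" ^
      /OutputPath:"deploy_report.xml"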