Monthly Archives: October 2011
Up to SQL Server 2008 R2 (10.50), there was no easy way to change the default path for backup files; it could only be done by editing the registry value at:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer\BackupDirectory
Note: For SQL Server 2005 the instance folder is MSSQL.1; for SQL Server 2008 it is MSSQL10, and for SQL Server 2008 R2 it is MSSQL10_50.
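For reference, the instance-aware extended stored procedures xp_instance_regread and xp_instance_regwrite can read and change this value without opening the registry editor. Note that these procedures are undocumented, so use them with care; the folder D:\SQLBackups below is only an example and must already exist:

```sql
-- Read the current default backup directory for this instance
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'BackupDirectory';

-- Change it (example path; the folder must already exist)
EXEC master.dbo.xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'BackupDirectory',
    N'REG_SZ',
    N'D:\SQLBackups';
```

The "instance" variants resolve the per-instance registry folder (MSSQL.1, MSSQL10, MSSQL10_50, etc.) automatically, so the same script works across versions.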
Thanks to SQL Server 2011 (code name DENALI), we can now change this default path for backup files to the one we need by simply opening the Server Properties page and using the "Database Settings" tab.
Try taking a backup from SSMS and you will find that it already points to your given path.
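You can verify the change from T-SQL as well: a backup taken with only a file name (no full path) is written to the default backup directory. AdventureWorks here is just a placeholder database name:

```sql
-- With no path in the file name, the backup lands in the default backup directory
BACKUP DATABASE AdventureWorks TO DISK = N'AdventureWorks.bak';
```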
To extract data from Pervasive data files, the Pervasive database engine is required. The latest trial version, 11, can be downloaded from http://www.pervasivedb.com/psqlv11/pages/default.aspx
Once you have installed the Pervasive engine, follow these steps to view the data and import it into SQL Server.
1. Open the Control Center through Programs > Pervasive > PSQL v11 > Control Center and Documentation.
2. Make sure the services Pervasive PSQL Relational Engine and Pervasive PSQL Transaction Engine are running (expand Services in the left panel).
3. To open a database, expand Engines > [server name] > Databases.
4. Right-click on Databases and select New > Database.
5. In the Database Name box, give a name to your database.
6. In the Location box, locate the path of the database files (.mkd, .ddf, .dat).
7. You can uncheck "Relational integrity enforced".
8. Click Finish to complete the process.
9. Expand Databases and you can see the added database.
10. Double-click on a table to see its records.
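As an alternative to the Import and Export Wizard, you can also pull Pervasive data straight into SSMS through the OLE DB provider for ODBC (MSDASQL) and the DSN the engine creates. This is only a sketch; the table name Class is an assumption about what the DEMODATA sample database contains:

```sql
-- Allow ad hoc distributed queries (needed once per server)
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- Query a Pervasive table through the DEMODATA ODBC DSN
SELECT *
FROM OPENROWSET('MSDASQL', 'DSN=DEMODATA', 'SELECT * FROM Class');
```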
Moving data into SQL Server:
1. Right-click on the target database and select Tasks > Import Data to open the Import and Export Wizard.
2. Hit Next on the welcome screen.
3. To choose a data source, click on the drop-down list and select Pervasive Provider, Release 3.2.
4. Under Standard Connection, specify the Pervasive database name, host, and server DSN as shown in the figure below. The DSN DEMODATA is created automatically by the Pervasive engine.
5. Click Next.
6. Choose SQL Server Native Client as the destination. Hit Next.
7. Click Next.
8. Provide a query to extract data from a given table. Click Parse to verify, then click Next.
9. Change the destination table name and click the Edit Mappings button. You can also preview the source data.
10. In Column Mappings you can specify field names, data types, and sizes for the destination table.
11. Click OK to close Column Mappings and hit Finish to execute the import process.
12. On successful completion, refresh the database in SQL Server to see the imported data.
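After the wizard finishes, a quick sanity check is to count the rows that arrived and compare them against the source; dbo.Class is a hypothetical destination table name, so substitute your own:

```sql
-- Hypothetical destination table; replace with the table you imported
SELECT COUNT(*) AS ImportedRows FROM dbo.Class;
```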
SQL Profiler is a high-quality tool for diagnosing different database issues, such as "Which are the costliest queries running?", "Which queries acquired exclusive locks?", "Which indexes are missing?", and so on. But in a development environment, and on production when troubleshooting, developers like to use SQL Profiler to capture the exact procedure call being generated by the front-end application.
The worst practice is that developers use the existing built-in templates for this purpose, normally the default one, i.e. STANDARD. If you are using SQL Profiler to capture procedure calls, selecting the STANDARD trace template is not a good choice: on a production server it hurts performance, and even on a development server it returns much more information than you need.
The good practice is: if you have not created your own template, always select TUNING.
It also captures some extra information, so when you only need to catch the procedure calls generated by your application, click on the "Events Selection" tab and keep only the "RPC:Completed" event. You need not select "SP:StmtCompleted", as you just need to capture the execute-procedure calls and not every statement inside the procedure. You can also omit "SQL:BatchCompleted", as we need only the calls generated from the application; if you also need to capture calls from SSMS, keep it.
To avoid extra pressure on the server and to get only the results you need, you should also apply filters on "DatabaseName" and "TextData".
Use the % sign in filter values, just as you would with the LIKE operator.
Now run your trace and you will find your required results quickly and clearly, without putting extra workload on the database server.
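The same lean trace can also be created server-side with the documented sp_trace_* procedures, which avoids running the Profiler UI against the server at all. The file path, database name, and procedure-name pattern below are placeholders:

```sql
DECLARE @TraceID int, @maxsize bigint = 50, @on bit = 1;

-- Create the trace file (example path; SQL Server appends .trc automatically)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\proc_calls', @maxsize;

-- Capture only RPC:Completed (event 10) with TextData, Duration, DatabaseName
EXEC sp_trace_setevent @TraceID, 10, 1,  @on;  -- TextData
EXEC sp_trace_setevent @TraceID, 10, 13, @on;  -- Duration
EXEC sp_trace_setevent @TraceID, 10, 35, @on;  -- DatabaseName

-- Filters: column 35 = DatabaseName, column 1 = TextData, operator 6 = LIKE
EXEC sp_trace_setfilter @TraceID, 35, 0, 6, N'MyAppDB';
EXEC sp_trace_setfilter @TraceID, 1,  0, 6, N'%usp_%';

-- Start the trace
EXEC sp_trace_setstatus @TraceID, 1;
```

Stopping and closing the trace later is done with sp_trace_setstatus (status 0 to stop, 2 to close and delete its definition).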