Microsoft SQL Server 2008

Presentation Transcript


  1. Microsoft SQL Server 2008 Advanced Features - Manu Kapoor

  2. Contents
  • Working with Performance Counters and SQL Server Profiler (Merge Approach)
  • Securing Data with Transparent Data Encryption
  • Managing Resource Pools with Resource Governor
  • Working with SQL Server Plan Guides
  • Working with Change Data Capture
  • Working with Change Tracking
  • Multi-Server Management using Central Management Server
  • Using Policy-Based Management
  • Using PowerShell in MS-SQL Server 2008

  3. 1. Working with Performance Counters and SQL Server Profiler
  • We all understand that SQL Profiler is a handy tool for workload analysis, and that Perfmon can capture OS and SQL Server performance counters so they can be analyzed in real time or at a later stage.
  • Combining performance counter metrics with a SQL Profiler workload trace gives us a great opportunity to pinpoint the exact statement, stored procedure, or batch that is causing high resource utilization. Until now there was practically no good way to know which statement in which user session was causing the excessive resource consumption, especially when numerous concurrent user sessions are hitting the instance.
  • With the SQL Server 2008 Profiler and performance logs together, this can be achieved easily, as we shall see in this session.
  • Continued on next slide

  4. Working with Performance Counters and SQL Server Profiler (Continued)
  • Configuration steps:
  • 1. Setting up Perfmon
  • Open Perfmon: Start -> Run -> type perfmon and press Enter
  • Expand Performance Logs and Alerts
  • Right-click Counter Logs and select New Log Settings
  • Add counters in the General tab; for our session we will capture:
  • Processor: % Processor Time
  • SQL Server Buffer Manager: Page Life Expectancy
  • SQL Server Buffer Manager: Buffer Cache Hit Ratio
  • Physical Disk: Avg. Disk sec/Read
  • Physical Disk: Avg. Disk sec/Write
  • Continued on next slide

  5. Working with Performance Counters and SQL Server Profiler (Continued)
  • 2. Setting up the SQL Profiler
  • Create a new SQL trace and add the following events:
  • TSQL: SQL:BatchCompleted
  • TSQL: SQL:BatchStarted
  • Optionally you can also select other events, as well as additional columns to display
  • In the next slide, we will see how to set up the transaction workload.
  • Continued on next slide

  6. Working with Performance Counters and SQL Server Profiler (Continued)
  • 3. Setting up the Transaction Workload
  • We will execute some scripts to simulate load on the server. The script creates a table and pushes some data into it; while the insertion is going on, we will also fire a SELECT using the NOLOCK hint. This will cause CPU spikes, which will be captured by Perfmon.

    -- First Query Window
    CREATE TABLE T1
    (
        EmpID INT,
        EmpName VARCHAR(50)
    )
    GO
    DECLARE @i INT
    SET @i = 1
    WHILE @i <= 100000
    BEGIN
        INSERT INTO T1 SELECT @i, 'Employee ' + CAST(@i AS VARCHAR)
        SET @i = @i + 1
    END

    -- Second Query Window (Load Simulation)
    SELECT COUNT(*) FROM T1 WITH (NOLOCK)

  • Continued on next slide

  7. Working with Performance Counters and SQL Server Profiler (Continued)
  • 4. Merging the Perfmon and SQL Profiler Traces
  • At this point both Perfmon and SQL Profiler should have captured enough data for us to view and analyze. To see them in action, perform the following steps:
  • Stop the SQL Profiler trace and save it to some location
  • Stop the Perfmon log in the Perfmon tool
  • In the SQL Profiler window, open the File menu and open the recently saved trace file
  • Open the File menu again and click Import Performance Data
  • Select the file you configured for capturing the Perfmon log
  • Select the counters to view (for our example, we will just pick Processor: % Processor Time)
  • You will now see both the T-SQL statements and the Perfmon graph appearing in cascaded windows
  • As you navigate between the T-SQL statements, you will also see the graph change, indicating the CPU spikes

  8. 2. Securing Data with Transparent Data Encryption
  • MS-SQL 2008 offers an out-of-the-box feature for securing your data files and database backups, called Transparent Data Encryption (TDE).
  • TDE works at the storage level and has the following key properties:
  • It is only available in Enterprise Edition, Enterprise Evaluation Edition, and Developer Edition
  • TDE is transparent to applications because it does not require any code changes at the application level; the data is encrypted/decrypted in memory on its way to and from the I/O device
  • TDE has some additional overhead, generally between 3% and 5%. Per Microsoft, under TPC-C workload testing the overhead never went beyond 28%
  • Why use TDE?
  • The data stored in the physical database files is in clear text and can be viewed using a common text editor. Hard to believe?
  • The data stored in the database BACKUP files is also in clear text and can be viewed the same way.
  • Continued on next slide

  9. Securing Data with Transparent Data Encryption (Continued)
  • Setting up Transparent Data Encryption:
  • We will create a database called "ResearchDB" that has one table. We wish to encrypt this database and test TDE.
  • Configuring TDE requires the following steps:
  • 1. CREATE MASTER KEY (there is always only one per instance)

    USE master
    GO
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'p@$$w0rd'

  • 2. CREATE SERVER CERTIFICATE

    USE master
    GO
    CREATE CERTIFICATE SECURE_CERT WITH SUBJECT = 'USED FOR TRAINING SESSION DEMO'

  • Continued on next slide

  10. Securing Data with Transparent Data Encryption (Continued)
  • 3. CREATE DATABASE ENCRYPTION KEY

    USE ResearchDB
    GO
    CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_128
    ENCRYPTION BY SERVER CERTIFICATE SECURE_CERT

  • (Pay attention to the warning raised here about backing up the certificate)
  • 4. ENABLE ENCRYPTION ON THE DATABASE

    ALTER DATABASE ResearchDB SET ENCRYPTION ON

  • Before we get to the real action, let us check a few more things:

    SELECT * FROM sys.databases

    SELECT DB_NAME(database_id) AS DatabaseName,
           encryption_state     AS EncryptionState,
           key_algorithm        AS Algorithm,
           key_length           AS KeyLength
    FROM sys.dm_database_encryption_keys
    GO

    SELECT * FROM sys.symmetric_keys

  • Continued on next slide

  11. Securing Data with Transparent Data Encryption (Continued)
  • Before we get to the real action, let us check a few more things:

    SELECT * FROM sys.certificates

    SELECT * FROM sys.databases

    SELECT DB_NAME(database_id) AS DatabaseName,
           encryption_state     AS EncryptionState,
           key_algorithm        AS Algorithm,
           key_length           AS KeyLength
    FROM sys.dm_database_encryption_keys
    GO

    -- encryption_state values (a query that decodes these is sketched below):
    -- 1 = Unencrypted
    -- 2 = Encryption in progress
    -- 3 = Encrypted
    -- 4 = Key change in progress
    -- 5 = Decryption in progress (after ALTER DATABASE ... SET ENCRYPTION OFF)

  • Continued on next slide
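  • For convenience, the numeric encryption_state can also be decoded directly in the query. This is only a sketch based on the documented state values:

    SELECT DB_NAME(database_id) AS DatabaseName,
           CASE encryption_state
                WHEN 0 THEN 'No database encryption key present'
                WHEN 1 THEN 'Unencrypted'
                WHEN 2 THEN 'Encryption in progress'
                WHEN 3 THEN 'Encrypted'
                WHEN 4 THEN 'Key change in progress'
                WHEN 5 THEN 'Decryption in progress'
           END AS EncryptionStateDescription
    FROM sys.dm_database_encryption_keys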

  12. Securing Data with Transparent Data Encryption (Continued)
  • At this point our database (ResearchDB) is encrypted. We can now perform the following tasks (the real action):
  • Take the database offline and try to read the data file contents in Notepad
  • Take a database backup and try to read its contents in Notepad
  • A few key questions:
  • Can you restore the encrypted database backup on another Enterprise Edition instance?
  • Yes you can, provided you have the valid certificate (a restore sketch follows below)
  • Can you attach the encrypted database file to another Enterprise Edition instance?
  • Yes you can, provided you have the valid certificate
  • Can you IMPORT/EXPORT the tables of an encrypted database to another instance?
  • Yes, and this does NOT need the certificate; Import/Export works outside of TDE
  • Can you restore the encrypted database backup on a non-Enterprise Edition instance?
  • NO, YOU CANNOT
  • Why is it important to BACK UP the certificate?
  • To allow restoration on the same or another instance (of course, Enterprise/Developer Edition only)
  • WARNING: Your database will be useless if you do not have a backup of the certificate and you need to restore the database, or even worse, if the certificate is deleted. Look at the example on the next slide.
  • What other encryption algorithms are available?
  • AES_128 | AES_192 | AES_256 | TRIPLE_DES_3KEY
  • Continued on next slide
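  • To make the restore-on-another-instance answer concrete, the sequence on the target instance looks roughly like the sketch below. The file paths, passwords, backup file name and logical file names (ResearchDB / ResearchDB_log) are assumptions taken from this demo, not fixed values; the certificate files are the ones backed up on the next slide.

    -- On the target Enterprise/Developer Edition instance
    USE master
    GO
    -- The target instance needs its own database master key in master
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'p@$$w0rd'
    GO
    -- Re-create the TDE certificate from the backed-up certificate and private key files
    CREATE CERTIFICATE SECURE_CERT
    FROM FILE = 'C:\Secure_Cert.cer'
    WITH PRIVATE KEY (FILE = 'C:\Secure_Cert.pvk', DECRYPTION BY PASSWORD = 'test@123')
    GO
    -- The encrypted backup now restores like any other backup
    RESTORE DATABASE ResearchDB
    FROM DISK = 'C:\ResearchDB.bak'
    WITH MOVE 'ResearchDB'     TO 'C:\Data\ResearchDB.mdf',
         MOVE 'ResearchDB_log' TO 'C:\Data\ResearchDB_log.ldf'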

  13. Securing Data with Transparent Data Encryption (Continued)
  • (Handling a worst-case scenario)
  • What happens if the certificate is removed?
  • Simply put, your database becomes inaccessible and you will not be able to recover it.
  • Issue simulation:
  • First ensure you have taken a backup of the certificate you used:

    BACKUP CERTIFICATE SECURE_CERT
    TO FILE = 'C:\Secure_Cert.cer'
    WITH PRIVATE KEY (FILE = 'C:\Secure_Cert.pvk', ENCRYPTION BY PASSWORD = 'test@123')

  • Issue the command DROP CERTIFICATE <CERTIFICATE_NAME>
  • Restart the SQL Server database engine, then check the error log file and the database state
  • This must look ugly to you; the database cannot be opened
  • Solution:
  • Restore the certificate again:

    CREATE CERTIFICATE SECURE_CERT
    FROM FILE = 'C:\Secure_Cert.cer'
    WITH PRIVATE KEY (FILE = 'C:\Secure_Cert.pvk', DECRYPTION BY PASSWORD = 'test@123')

  • Restart the SQL Server database engine service
  • Check your database state
  • It should be fine now!

  14. 3. Managing Resources with Resource Governor
  • Resource Governor, as the name implies, is used to manage the resources available on the server machine; we normally associate these resources with CPU and memory. In SQL Server 2008 this new feature goes beyond what had traditionally been offered: it allows database administrators to manage workloads according to the situation. The Resource Governor has the following components:
  • Resource Pool: A resource pool represents the physical resources of the server. Two resource pools, internal and default, are created when SQL Server 2008 is installed; SQL Server also supports the creation of user-defined resource pools. In a resource pool a DBA can specify MIN and MAX values, as percentages, for CPU and memory utilization. The internal pool represents the resources consumed by SQL Server itself and cannot be altered by a user in any way. The default pool is a predefined user pool that contains the default group. The important thing to note is that the default pool cannot be dropped or created, but it can be altered as required.
  • Workload Group: A workload group acts as a container that accepts sessions from SQL Server users which are similar in nature, based on the classification criteria applied to each request. As with resource pools, there are two predefined workload groups, internal and default. Incoming requests are classified into the default workload group when no criteria are defined to classify them, when an attempt is made to classify a request into a nonexistent workload group, or when the classification fails.
  • Classification: Classifications are internal rules that classify incoming requests and route them to a workload group. The classification is based on a set of user-written criteria contained in a scalar function created in the master database. Once a DBA enables Resource Governor, every single session to the server is evaluated by the user-defined classifier function.
  • Continued on next slide

  15. Managing Resources with Resource Governor (Continued)
  • Problem:
  • We want to limit CPU for sessions initiated by the user "dbadmin" to 40%
  • We want to limit CPU for sessions initiated by the user "dbuser" to 10%
  • For all other users, 50% CPU is allowed
  • Setting up Resource Governor:
  • For our session we will adopt this approach (see the T-SQL sketch below):
  • Create a login with the name "dbadmin"
  • Create a login with the name "dbuser"
  • Resource pools and workload groups:
  • ADMINS: Min CPU 0%, Max CPU 40%, workload group GROUPADMINS
  • USERS: Min CPU 0%, Max CPU 10%, workload group GROUPUSERS
  • Continued on next slide
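  • The transcript does not include the pool and workload group creation statements, so here is a minimal T-SQL sketch for the configuration above. The login passwords are placeholders, and implementing the 50% limit for "all other users" by capping the default pool is one reasonable interpretation.

    -- Logins used by the demo (passwords are placeholders)
    CREATE LOGIN dbadmin WITH PASSWORD = 'Adm1n_P@ss'
    CREATE LOGIN dbuser  WITH PASSWORD = 'Us3r_P@ss'

    -- Resource pools
    CREATE RESOURCE POOL ADMINS WITH (MIN_CPU_PERCENT = 0, MAX_CPU_PERCENT = 40)
    CREATE RESOURCE POOL USERS  WITH (MIN_CPU_PERCENT = 0, MAX_CPU_PERCENT = 10)

    -- Workload groups mapped to the pools
    CREATE WORKLOAD GROUP GROUPADMINS USING ADMINS
    CREATE WORKLOAD GROUP GROUPUSERS  USING USERS

    -- Everyone else stays in the default pool; cap it at 50%
    ALTER RESOURCE POOL [default] WITH (MAX_CPU_PERCENT = 50)

    -- Apply the configuration
    ALTER RESOURCE GOVERNOR RECONFIGURE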

  16. Managing Resources with Resource Governor (Continued)
  • Create a classifier function (in the master database) using the following T-SQL:

    USE master
    GO
    CREATE FUNCTION dbo.ClassifierResources()
    RETURNS SYSNAME
    WITH SCHEMABINDING
    AS
    BEGIN
        DECLARE @VAL VARCHAR(32)
        SET @VAL = 'default';
        IF 'dbadmin' = SUSER_SNAME()
            SET @VAL = 'GROUPADMINS';
        IF 'dbuser' = SUSER_SNAME()
            SET @VAL = 'GROUPUSERS';
        RETURN @VAL;
    END
    GO

  • Attach this classifier function to the Resource Governor:

    ALTER RESOURCE GOVERNOR
    WITH (CLASSIFIER_FUNCTION = dbo.ClassifierResources)

  • Continued on next slide

  18. Managing Resources with Resource Governor (Continued)
  • Activate the changes made to the Resource Governor:

    ALTER RESOURCE GOVERNOR RECONFIGURE

  • Resource Governor in action
  • At this point we have created the resource pools and they are ready to be used. To see Resource Governor in action, follow these steps:
  • Open Perfmon: Start -> Run -> type perfmon and press Enter
  • Select the SQL Server: Resource Pool Stats counter class, add the CPU usage % counter, and ensure you have selected the default, internal, ADMINS, and USERS resource pools
  • Make three NEW connections to the instance:
  • One as Administrator
  • One as dbadmin
  • One as dbuser
  • Open a new query window for each session
  • Execute the T-SQL for load simulation (see the sketch after this list)
  • Observe the CPU usage % for each resource pool
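  • The load-simulation script itself is not included in the transcript; a minimal sketch that simply keeps the CPU busy in a loop (the iteration count and the math are arbitrary) is shown below. Run it from each of the three sessions and watch the per-pool CPU usage % counters.

    -- Arbitrary CPU-burning loop for load simulation
    DECLARE @i BIGINT = 0
    DECLARE @x FLOAT
    WHILE @i < 5000000
    BEGIN
        SET @x = SQRT(@i) * PI()   -- meaningless math, just to consume CPU
        SET @i = @i + 1
    END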

  19. 4. Working with SQL Server Plan Guides
  • Plan guides are used to optimize the performance of queries when you cannot, or do not want to, change the text of the query directly. This can be useful when a small subset of queries in a database application deployed by a third-party vendor is not performing as expected. Plan guides influence the optimization of queries by attaching query hints or a fixed query plan to them.
  • In the plan guide, we specify the Transact-SQL statement that we want optimized, along with either an OPTION clause containing the query hints we want to use or a specific query plan to use to optimize the query. When the query executes, SQL Server matches the Transact-SQL statement to the plan guide and attaches the OPTION clause to the query at run time, or uses the specified query plan.
  • As an example, take this query coming from a vendor-deployed application:
  • SELECT * FROM <SOME_LARGE_TABLE_NAME>
  • This query does not make use of any indexes, even if they are defined, and does not limit the rows at all. In such a situation SQL Server, or for that matter any other RDBMS, will probably use parallel execution to derive the results. This can hinder performance, especially when concurrent user sessions are also sending their own workloads. Plan guides can help us take care of this situation by attaching OPTION (MAXDOP 1), as sketched below.
  • Continued on next slide
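  • A sketch of what such a plan guide could look like for the vendor query above; the plan guide name is just an illustration and <SOME_LARGE_TABLE_NAME> remains a placeholder. Keep in mind that @stmt must match the submitted statement text exactly.

    EXEC sp_create_plan_guide
        @name            = N'LimitDOPForVendorQuery',
        @stmt            = N'SELECT * FROM <SOME_LARGE_TABLE_NAME>',
        @type            = N'SQL',
        @module_or_batch = NULL,
        @params          = NULL,
        @hints           = N'OPTION (MAXDOP 1)';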

  20. Working with SQL Server Plan Guides (Continued)
  • For our session, we will use the following method:

    CREATE DATABASE ResearchDB
    GO
    USE ResearchDB
    GO
    -- Create the table
    CREATE TABLE T1
    (
        EmpID INT,
        EmpName VARCHAR(50),
        Salary INT
    )

    -- Push some data into this table
    DECLARE @i INT
    SET @i = 1
    WHILE @i <= 50000
    BEGIN
        INSERT INTO T1 SELECT @i, 'Employee ' + CAST(@i AS VARCHAR), @i * 2
        SET @i = @i + 1
    END

  • Continued on next slide

  21. Working with SQL Server Plan Guides (Continued)

    -- Create some indexes
    CREATE CLUSTERED INDEX IDX_EmpID ON T1 (EmpID)
    CREATE NONCLUSTERED INDEX IDX_EmpName ON T1 (EmpName)

    -- Check the execution plan of this query; it performs a SCAN instead of a SEEK
    SELECT * FROM T1 WHERE EmpName LIKE 'Employee 1%';

    -- Try using FORCESEEK (just a test)
    SELECT * FROM T1 WITH (FORCESEEK) WHERE EmpName LIKE 'Employee 1%';

    -- Creating the plan guide
    EXEC sp_create_plan_guide
        @name            = N'TEST PLAN FORCESEEK',
        @stmt            = N'SELECT * FROM T1 WHERE EmpName LIKE ''Employee 1%'';',
        @type            = N'SQL',
        @module_or_batch = NULL,
        @params          = NULL,
        @hints           = N'OPTION (TABLE HINT (T1, FORCESEEK))';
    GO

  • Continued on next slide

  22. Working with SQL Server Plan Guides (Continued)

    -- Check the execution plan of this query AGAIN
    SELECT * FROM T1 WHERE EmpName LIKE 'Employee 1%';

  • We can clearly see the change in the execution plan now.
  • Can we check it in Profiler as well?
  • Yes, you can; include the Plan Guide Successful event under the Performance event class.
  • Controlling plan guides:
  • Viewing which plan guides are created in the database:

    SELECT * FROM sys.plan_guides

  • Enabling a plan guide:

    EXEC sp_control_plan_guide 'ENABLE', @name = N'TEST PLAN FORCESEEK'

  • Disabling a plan guide:

    EXEC sp_control_plan_guide 'DISABLE', @name = N'TEST PLAN FORCESEEK'

  • Dropping a plan guide:

    EXEC sp_control_plan_guide 'DROP', @name = N'TEST PLAN FORCESEEK'

  23. 5. Change Data Capture
  • Change Data Capture (CDC) is yet another Enterprise Edition feature; it allows database administrators to capture the changes happening in database tables, including DDL changes. Change Data Capture has the following key properties:
  • It uses the transaction log file to capture the changes
  • It uses a SQL Server Agent job to traverse the log file and push the data into relational tables
  • It is ASYNCHRONOUS in nature (unlike Change Tracking, which works within the transaction and is therefore SYNCHRONOUS)
  • It can only be configured using T-SQL
  • Unlike Change Tracking, it also keeps the values that were changed
  • Internally, CDC uses the sp_replcmds system stored procedure (the same procedure built into the SQL Server database engine for transactional replication) to traverse the transaction log file and push the data into the capture instance
  • Continued on next slide

  24. Change Data Capture (Continued)
  • Configuring Change Data Capture
  • As stated earlier, Change Data Capture is enabled using system stored procedures. Before enabling the database for CDC, please ensure that the SQL Server Agent service is started. To enable Change Data Capture, execute the following:

    CREATE DATABASE ResearchDB
    GO
    USE ResearchDB
    GO
    -- Enable Change Data Capture on the database first
    EXEC sys.sp_cdc_enable_db

    -- Check which databases are enabled for CDC (see the is_cdc_enabled column)
    SELECT * FROM sys.databases

    -- Create the table on which we want to enable CDC
    CREATE TABLE T1
    (
        EmpID INT PRIMARY KEY,
        EmpName VARCHAR(50),
        Salary INT
    )

  • Continued on next slide

  25. Change Data Capture (Continued)

    -- Enable Change Data Capture for the table now
    EXEC sys.sp_cdc_enable_table
        @source_schema        = 'dbo',
        @source_name          = 'T1',
        @role_name            = 'CDCRole',
        @supports_net_changes = 1

    -- Perform some DML now
    INSERT INTO T1 SELECT 1, 'Employee 1', 5000
    INSERT INTO T1 SELECT 2, 'Employee 2', 6000
    INSERT INTO T1 SELECT 3, 'Employee 3', 7000
    DELETE FROM T1 WHERE EmpID = 2
    UPDATE T1 SET Salary = Salary + 2 WHERE EmpID = 3

  • Since the database and the table have been enabled for CDC, we can now query the built-in functions to see the captured changes. Look at the sample on the next slide.
  • Continued on next slide

  26. Change Data Capture (Continued)

    -- Viewing changed data by passing specific LSNs
    -- (These LSN values are specific to this demo; yours will differ.
    --  See the sketch below for how to look them up.)
    DECLARE @begin_lsn BINARY(10)
    DECLARE @end_lsn BINARY(10)
    SET @begin_lsn = 0x0000001C0000010F0018
    SET @end_lsn   = 0x0000001C000001140003
    SELECT * FROM cdc.fn_cdc_get_net_changes_dbo_T1(@begin_lsn, @end_lsn, 'all')
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_T1(@begin_lsn, @end_lsn, 'all')

    -- Tracking DDL changes
    ALTER TABLE T1 ADD NewCol VARCHAR(50)
    GO
    EXEC sys.sp_cdc_get_ddl_history 'dbo_T1'

  • As can be seen above, CDC also captures the DDL changes.
  • Continued on next slide
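  • Rather than hard-coding LSN values, the valid range for the capture instance can be looked up with the built-in LSN functions; a small sketch:

    -- Fetch the full valid LSN range for the dbo_T1 capture instance
    DECLARE @from_lsn BINARY(10), @to_lsn BINARY(10)
    SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_T1')
    SET @to_lsn   = sys.fn_cdc_get_max_lsn()
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_T1(@from_lsn, @to_lsn, 'all')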

  27. Change Data Capture (Continued)

    -- Useful system tables
    SELECT * FROM cdc.captured_columns
    SELECT * FROM cdc.change_tables
    SELECT * FROM cdc.ddl_history
    SELECT * FROM cdc.index_columns
    SELECT * FROM cdc.dbo_T1_CT

    -- Useful stored procedures
    EXEC sys.sp_cdc_get_ddl_history 'dbo_T1'
    EXEC sys.sp_cdc_get_captured_columns 'dbo_T1'
    EXEC sys.sp_cdc_help_change_data_capture 'dbo', 'T1'

  • Continued on next slide

  28. 6. Change Tracking
  • Change Tracking in SQL Server 2008 enables applications to obtain only the changes that have been made to user tables, along with information about those changes. With change tracking integrated into SQL Server, complicated custom change-tracking solutions no longer have to be developed.
  • Change Tracking has the following properties:
  • Unlike CDC, Change Tracking is configured using ALTER commands, and it is not limited to Enterprise Edition
  • Change Tracking is SYNCHRONOUS and happens within the transaction
  • It uses a VERSION number to track the changes; this version has database scope
  • Change Tracking cannot capture DDL statements (unlike CDC, which does)
  • Unlike Change Data Capture, it does not retain the old values
  • It just tracks the changes using the primary key and does not show the other columns; it does not use any SQL Server Agent jobs to track changes
  • To configure Change Tracking, follow these steps (retention options are sketched below):

    -- Enable Change Tracking on the database
    ALTER DATABASE ResearchDB SET CHANGE_TRACKING = ON

    -- Check which databases are enabled for Change Tracking
    SELECT * FROM sys.change_tracking_databases

    -- Enable Change Tracking on the table
    ALTER TABLE T1 ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON)

    -- Check which tables are enabled for Change Tracking
    SELECT * FROM sys.change_tracking_tables

    -- Check internal tables; observe the name used for the internal tracking table
    SELECT * FROM sys.internal_tables

  • Continued on next slide
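  • Retention and automatic cleanup can also be specified when Change Tracking is enabled on the database; the two-day retention below is only an example value, not a recommendation:

    ALTER DATABASE ResearchDB
    SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)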

  29. Change Tracking (Continued)
  • Making use of Change Tracking (a version-validity check is sketched after this slide):

    -- Perform some DML operations
    INSERT INTO T1 SELECT 1, 'Employee 1', 8000
    INSERT INTO T1 SELECT 2, 'Employee 2', 8000
    INSERT INTO T1 SELECT 3, 'Employee 3', 8000
    INSERT INTO T1 SELECT 4, 'Employee 4', 8000

    -- See the tracked changes
    SELECT * FROM CHANGETABLE(CHANGES T1, 0) CT

    -- Getting the records as well as the changes
    SELECT CT.*, T1.*
    FROM CHANGETABLE(CHANGES T1, 0) AS CT
    FULL OUTER JOIN T1 ON CT.EmpID = T1.EmpID

  • Continued on next slide
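  • Before requesting changes with a previously stored synchronization version, a client is normally expected to check that the version is still valid for the table; a small sketch of that check:

    -- Compare the client's last-synchronized version against the minimum valid version for T1
    SELECT CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('dbo.T1')) AS MinValidVersion,
           CHANGE_TRACKING_CURRENT_VERSION()                      AS CurrentVersion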

  30. Change Tracking (Continued)

    -- Perform a delete
    DELETE FROM T1 WHERE EmpID = 3

    -- See the tracked changes; notice we will only see the NET changes
    SELECT * FROM CHANGETABLE(CHANGES T1, 0) CT
    SELECT * FROM CHANGETABLE(CHANGES T1, 2) CT

    -- What is the current change tracking version?
    SELECT CHANGE_TRACKING_CURRENT_VERSION()

    -- Understanding updates
    UPDATE T1
    SET Salary = 7000
    WHERE EmpID = 1

    -- Now see the tracked changes
    SELECT * FROM CHANGETABLE(CHANGES T1, 1) CT ORDER BY SYS_CHANGE_VERSION
    SELECT * FROM CHANGETABLE(CHANGES T1, 0) CT ORDER BY SYS_CHANGE_VERSION

  • Continued on next slide

  31. Change Tracking (Continued)

    -- Making use of change context
    -- Change tracking context helps us determine who (or which application) made the changes
    DECLARE @ApplicationName VARBINARY(128) = CAST('SSMS ADHOC Batch' AS VARBINARY(128));

    WITH CHANGE_TRACKING_CONTEXT (@ApplicationName)
    UPDATE T1
    SET Salary = Salary + 10
    WHERE EmpID = 2

    SELECT *, CAST(SYS_CHANGE_CONTEXT AS VARCHAR(128))
    FROM CHANGETABLE(CHANGES T1, 2) CT

  • Continued on next slide

  32. 7. Multi-Server Management using Central Management Server
  • Central management servers store a list of SQL Server instances organized into one or more central management server groups. Actions taken by using a central management server group act on all servers in the server group. This includes connecting to servers by using Object Explorer and executing Transact-SQL statements and Policy-Based Management policies on multiple servers at the same time. Versions of SQL Server earlier than SQL Server 2008 cannot be designated as a central management server.
  • To configure a Central Management Server, follow these steps:
  • Click Registered Servers in the View menu
  • Right-click the Central Management Servers node and click New Server Registration
  • In the New Server Registration dialog box, provide a value for Server Name
  • Click Test, then Save
  • Right-click the newly created central management server and click New Server Group
  • Give the group a name and click OK
  • Right-click the newly created server group and select New Server Registration
  • Add the servers (instances) of your choice that you want to manage centrally, then click OK
  • Testing:
  • Select the newly created central management server
  • Click the New Query button and execute the following query:

    SELECT * FROM sys.databases

  • You will notice that:
  • The query is fired against all the instances contained in the group
  • The result set contains the server name of each instance as an additional column

  33. 8. Policy Based Management
  • Policy Based Management (PBM) is yet another great new feature in MS-SQL 2008. It is used to evaluate policies across one or more instances, and with its help we can evaluate existing policies or create new ones to enforce rules. Primarily, policies can be used for:
  • On-demand or scheduled evaluation
  • Enforcement
  • Auto-correction
  • Note: this feature was originally called the Declarative Management Framework but has since been renamed.
  • There are a number of terms we need to define in order to understand Policy-Based Management:
  • Target: an entity that is managed by Policy-Based Management, e.g. a database, a table, an index, etc.
  • Facet: a predefined set of properties that can be managed
  • Condition: a property expression that evaluates to True or False, i.e. the state of a facet
  • Policy: a condition to be checked and/or enforced
  • Making use of existing policies:
  • MS-SQL 2008 offers several pre-configured/pre-defined policies that we can use to evaluate the health of our databases. For example, we can use the "Database Auto Close" policy to check which databases within the instance are non-compliant.
  • Continued on next slide

  34. Policy Based Management (Continued)
  • Example: on-demand evaluation of an existing policy on the current instance
  • To perform on-demand evaluation of a policy, expand Policy Management in SSMS, right-click Policies and then click Evaluate. In the Evaluate Policies dialog box, click the browse button and select File as the source, then select the desired policy from the list. Click the Evaluate button to see the evaluation results. You can also click the Apply button to make the changes.
  • Example: on-demand evaluation of an existing policy on multiple instances
  • To perform on-demand evaluation of a policy on multiple instances:
  • Create a Central Management Server (using the Registered Servers tool window)
  • Create a server group
  • Register the instances
  • Right-click the central management server:
  • Click Evaluate Policies
  • Select the source policy
  • Click Evaluate
  • View the evaluation results
  • You can also apply the changes if you wish
  • Continued on next slide

  36. Policy Based Management (Continued)
  • Example: scheduled evaluation of an existing policy
  • To perform scheduled evaluation of a policy:
  • Expand Policy Management in SSMS
  • Right-click Policies and click Import Policy
  • Select the appropriate policy and click OK
  • Right-click the newly imported policy and click Properties
  • For the evaluation mode, select "On Schedule" and create the schedule
  • Click OK
  • Notice that a SQL Server Agent job is created that will run as per the schedule and evaluate the policy
  • The logs can be viewed by right-clicking the policy and clicking "View History"
  • Note: View History will always show the "Failed" icon as long as the database fails the evaluation test.
  • Continued on next slide

  37. Policy Based Management (Continued)
  • As with existing policies, we can also create our own policy, evaluate it, and deploy it on the instance. This is particularly useful when we want to enforce standards, for example that all tables in the database should be prefixed with "tbl".
  • To create a custom policy, follow these steps:
  • Expand Policy Management in SSMS
  • Expand Conditions, right-click it and click New Condition
  • In the New Condition dialog box, provide a name and select "Multipart Name" for the facet
  • Select:
  • @Name for the field
  • LIKE for the operator
  • 'tbl%' for the value, then click OK
  • Now right-click Policies and click New Policy. Provide a name for the policy.
  • In the Check Condition section, select the condition you created in the previous step
  • Select Every Table in Every Database for "Against Targets", then click OK
  • You can now evaluate this newly created policy against your database, as shown in the previous step
  • NOTE: You can also attach this policy to the database by right-clicking the database -> Policies -> Categories -> select your policy -> Evaluation -> On change: Prevent -> click OK. After this, try to create a table whose name does not start with 'tbl' and observe the output: the transaction will be aborted.

  38. 9. Using PowerShell in MS-SQL Server 2008
  • PowerShell is the .NET-based automation engine that Microsoft shipped in November 2006. It can be embedded into .NET applications but is usually used as a command-line shell and scripting language. PowerShell install packages are available in 32-bit and 64-bit versions for Windows 2003, Windows XP and Windows Vista. It is an installable feature in Windows Server 2008, i.e. it is part of the operating system install. It is incorporated into a number of Microsoft products, including Exchange 2007 and several System Center products such as Operations Manager 2007, Data Protection Manager 2007 and Virtual Machine Manager. A number of third-party vendors, including Quest, Special Operations Software, SDM Software, IBM, Citrix and VMware, have produced PowerShell add-ins for their products or incorporated PowerShell directly into their products. PowerShell is now part of Microsoft's Common Engineering Criteria and will be incorporated into all major products.
  • To use PowerShell, open it from SQL Server Management Studio
  • Samples:
  • You can navigate through the instances like a file system
  • You can invoke SQL commands using the Invoke-Sqlcmd cmdlet, as in:
  • Invoke-Sqlcmd -Query "SELECT * FROM T1"
  • You can format the output by piping it to Format-Table -AutoSize
