SVC03 Development Best Practices and Patterns for Using Microsoft SQL Azure Databases Tony Petrossian, Principal Program Manager
Agenda • Service Recap • Server Management • Security and Access Control • Connectivity and Tracing • Latency and Connectivity Patterns • Dealing with Large Amounts of Data • Getting Data In and Out
SQL Azure Recap • SQL Azure provides capacity on demand • Create a “server” • Create database within your server • Server lifecycle controlled via Portal • Portal login using LiveID • Database lifecycle via normal TSQL • Authenticate using ‘SQL login’ over TDS+SSL
SQL Azure Network Topology (architecture diagram) • Applications use standard SQL client libraries: ODBC, ADO.Net, PHP, … • The application connects from the Internet into the Azure cloud over TDS (tcp), crossing the security boundary • A load balancer (LB) forwards 'sticky' sessions to the TDS protocol tier • Gateways: TDS protocol gateways that enforce AUTHN/AUTHZ policy and proxy to the backend SQL nodes • Backend SQL nodes provide scalability and availability: fabric, failover, replication, and load balancing
Servers • Each SQL Azure server provides • Geo-location (has a unique DNS name) • A zone for administration policy • A point of billing and reporting aggregation • Where should I create my server? • Best practice: co-locate server with Windows Azure app role (if using) to reduce latency • When should I create a new server? • Trade off between geo/admin/billing
Server Management • Through the Portal (HTTP) • Add/Drop server • Establish admin credentials • View usage reports • Network access configuration • Through the Master Database (TDS) • Fine-tune firewall settings through code • User logins • Usage and metrics reporting (billing) • Create/Drop databases (diagram: SQL Azure server hosting the master DB and user DBs)
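A minimal sketch of the master-database path, assuming a placeholder server, admin login, and database name (ADO.Net, System.Data.SqlClient):

// Sketch: create a user database by connecting to the master database.
// Server name, credentials, and database name are placeholder values.
string masterConnStr =
    "Data Source=myserver.database.windows.net;Initial Catalog=master;" +
    "User ID=admin@myserver;Password=...;Encrypt=True;";

using (SqlConnection conn = new SqlConnection(masterConnStr))
{
    conn.Open();
    using (SqlCommand cmd = conn.CreateCommand())
    {
        // CREATE DATABASE runs as its own batch against master
        cmd.CommandText = "CREATE DATABASE MyAppDb";
        cmd.ExecuteNonQuery();
    }
}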
Server: Network Access Control • Each server defines a set of firewall rules • Determines access policy based on client IP • By default, there is NO ACCESS to server • Controlled using • TSQL API against Master DB: sys.firewall_rules, sys.sp_set_firewall_rule, sys.sp_delete_firewall_rule • Portal UX
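As a hedged illustration, the firewall procedures can be invoked from code over a connection to the master database (masterConnStr, the rule name, and the IP range below are placeholder values):

// Sketch: add a firewall rule via sys.sp_set_firewall_rule on the master database.
// masterConnStr, the rule name, and the IP range are placeholder values.
using (SqlConnection conn = new SqlConnection(masterConnStr))
{
    conn.Open();
    using (SqlCommand cmd = conn.CreateCommand())
    {
        // Arguments are passed positionally: rule name, start IP, end IP
        cmd.CommandText = "EXEC sys.sp_set_firewall_rule @name, @startIp, @endIp";
        cmd.Parameters.AddWithValue("@name", "OfficeNetwork");
        cmd.Parameters.AddWithValue("@startIp", "203.0.113.1");
        cmd.Parameters.AddWithValue("@endIp", "203.0.113.254");
        cmd.ExecuteNonQuery();
    }
}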
Security: AUTHN and AUTHZ • SQL Azure uses SQL authentication (UID/PWD) • Authorization model fully compatible with SQL Server • The admin role has permissions to • CREATE/DROP database • CREATE/DROP/ALTER login • GRANT/REVOKE rights • Modify server firewall settings
Server: Billing and Reporting • sys.bandwidth_usage: bandwidth usage in KB • sys.database_usage: database instance count by SKU
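A small sketch of reading these views from the master database; the SELECT * keeps the example independent of the exact column list (masterConnStr is a placeholder connection string):

// Sketch: read the billing/reporting views from the master database.
// masterConnStr is a placeholder connection string to the master database.
using (SqlConnection conn = new SqlConnection(masterConnStr))
{
    conn.Open();
    using (SqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "SELECT * FROM sys.bandwidth_usage";
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                for (int i = 0; i < reader.FieldCount; i++)
                    Console.Write("{0}={1}  ", reader.GetName(i), reader[i]);
                Console.WriteLine();
            }
        }
    }
}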
Connecting to SQL Azure • SQL Azure connection strings follow normal SQL Server syntax • Except for an unusual username format • Format of username for authentication: • ADO.Net: Data Source=server.database.windows.net;User ID=user@server;Password=password;... • ODBC: Driver={SQL Server Native Client 10.0};Server=server.database.windows.net;Uid=user@server;Pwd=password;... • Applications connect directly to a database • "Initial Catalog=<db>" in connection string • No support for context switching (no USE <db>)
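Putting the pieces together, a minimal ADO.Net sketch (server, database, user, and password are placeholder values):

// Sketch: build a connection string that targets a specific database.
// Server, database, user, and password are placeholder values.
SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder
{
    DataSource = "server.database.windows.net",
    InitialCatalog = "mydb",        // connect directly to the database; no USE <db>
    UserID = "user@server",         // note the user@server format
    Password = "password",
    Encrypt = true                  // TDS over SSL
};

using (SqlConnection conn = new SqlConnection(builder.ConnectionString))
{
    conn.Open();
    // ... issue commands against mydb
}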
Built-in Connection Management • Connections may drop due to: • Network connectivity blips • Idle or long running transactions • Idle > 5 minutes • Long running transactions > 5 minutes • Throttling (taking too many resources) • Measured by IO load and CPU utilization • Database failover activity • Load balancing used to ensure ‘resource fairness’ • DOS protection may deny connectivity: • If too many failed connection attempts are made • Your server is not impacted by such attacks
Application Design Topics • Most-applicable SQL Best Practices • Connection Pooling • Query Parameterization • Batching • Scaling with data and load • Sharding • Building copies • Deploying and uploading data • Bulk copy • SSIS • Sync
Use Pooled Connections • Increases efficiency by removing repeated logins

// When pooling, use the connection and return it immediately.
// Do not hold it for a long time – the pool ensures fast turnaround.

// One-second use
using (SqlConnection conn = new SqlConnection(…))
{
    conn.Open();
    using (SqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = …;
        …
    }
}

// Later: a fresh connection from the pool
using (SqlConnection conn = new SqlConnection(…))
{
    conn.Open();
    …
}
Connections: Retry on Failure • Connections can drop for a variety of reasons • Idleness • Transient (network) errors • Intentional throttling • First step: reconnect immediately • Handles idleness and transient disconnects • The gateway handles connection retry for the app • Connections are attempted for ~30s before failure • What to do on connection failure? • Wait (10 seconds), then retry • Change your workload if throttled • Server health can be checked via the Portal • TSQL APIs will come in later releases
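A hedged sketch of that advice, assuming a simple wait-and-retry policy (the attempt count and the 10-second delay are illustrative choices, not prescribed values; uses System.Data.SqlClient and System.Threading):

// Sketch: open a connection with a simple wait-and-retry policy.
// The attempt count and the 10-second delay are illustrative choices.
static SqlConnection OpenWithRetry(string connStr, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        SqlConnection conn = new SqlConnection(connStr);
        try
        {
            conn.Open();        // the gateway already retries for ~30s internally
            return conn;
        }
        catch (SqlException)
        {
            conn.Dispose();
            if (attempt >= maxAttempts) throw;
            Thread.Sleep(TimeSpan.FromSeconds(10));   // wait, then retry
        }
    }
}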
Connection Pattern • while(true) • { • using(SqlConnection connection = newSqlConnection(connStr)) • { • try • { • connection.Open(); • using (SqlCommandcmd = connection.CreateCommand()) • { • cmd.CommandText = @"SetBCPJobStartTime"; • cmd.CommandType = CommandType.StoredProcedure; • cmd.Parameters.Add(newSqlParameter(@"@BCPJobId", BCPJobId)); • cmd.ExecuteNonQuery(); • } • } • catch (Exceptionexc) • { • // deal with error • } • } • // more stuff • // ….. • }
Tracing Connectivity Problems • Each session is assigned a unique 'sessionId' • Tracks session state and service errors • Retrievable from CONTEXT_INFO() • Save it with each new connection • If you need support, support personnel will greatly appreciate having that value
Tracing Helper Pattern • Guid? sessionId = null; • using (SqlConnection conn = new SqlConnection(…)) • { • // Grab sessionId from new connection • using (SqlCommandcmd = conn.CreateCommand()) • { • conn.Open(); • cmd.CommandText = • "SELECT CONVERT(NVARCHAR(36), CONTEXT_INFO())"; • sessionId = new Guid(cmd.ExecuteScalar().ToString()); • } • // continue processing • ... • }
Maximize your performance • Batching: push logic to Server • Use stored procedures and batching • Limit number of round trips to server • Example: batch 10 statements vs. 10 round-trips • Parameterized queries • Parameterize queries (limits compiles) • Declare all parameters, type and length • Lack of parameter size leads to cache bloat
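As a small, hedged illustration of the batching point: several parameterized statements sent in a single round trip (the table, columns, and parameter sizes are placeholder values; uses System.Text and System.Data):

// Sketch: ten inserts in one batch/round trip instead of ten separate calls.
// Table name, column names, and parameter sizes are placeholder values.
using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();
    using (SqlCommand cmd = conn.CreateCommand())
    {
        StringBuilder sql = new StringBuilder();
        for (int i = 0; i < 10; i++)
        {
            sql.AppendFormat("INSERT INTO dbo.tbl (c1, c2) VALUES (@p{0}, @q{0});", i);
            cmd.Parameters.Add("@p" + i, SqlDbType.Int).Value = i;
            cmd.Parameters.Add("@q" + i, SqlDbType.NVarChar, 128).Value = "value " + i;
        }
        cmd.CommandText = sql.ToString();   // one batch, one round trip
        cmd.ExecuteNonQuery();
    }
}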
Declare Parameter Lengths!

// Length inferred: leads to cache bloat
cmd.CommandText = "SELECT c1 FROM dbo.tbl WHERE c2 = @1";
cmd.Parameters.Add("@1", SqlDbType.NVarChar).Value = "1";
…
cmd.Parameters.Add("@1", SqlDbType.NVarChar).Value = "22";

// Two plans end up in the cache:
(@1 nvarchar(1)) SELECT c1 FROM dbo.tbl WHERE c2 = @1
(@1 nvarchar(2)) SELECT c1 FROM dbo.tbl WHERE c2 = @1

// Length supplied: no cache bloat
cmd.CommandText = "SELECT c1 FROM dbo.tbl WHERE c2 = @1";
cmd.Parameters.Add("@1", SqlDbType.NVarChar, 128).Value = "1";

// A single plan is cached:
(@1 nvarchar(128)) SELECT c1 FROM dbo.tbl WHERE c2 = @1
Maximize elasticity benefits • SQL Azure balances databases across machines • Divide your data into smaller chunks • Makes for better load-balancing • Ensures we can place your data on the most appropriate servers • Use shorter transactions • Ensures we can respond to issues faster • Avoid impacting others sharing the same box • This helps you avoid being throttled by SQL Azure
Shards: Managing Lots of Data (diagram) • The application spreads its data across many shard databases (DB1–DB6), each placed on a different node of the SQL Azure cluster, alongside the server's master DB
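A hedged sketch of routing a request to a shard, assuming a simple hash of the customer key over a fixed list of shard databases (the shard names and the modulo mapping are illustrative assumptions, not a prescribed scheme):

// Sketch: map a customer key to one of several shard databases on the same server.
// Shard names (DB1..DB6) and the modulo mapping are illustrative assumptions.
static string GetShardConnectionString(int customerId)
{
    string[] shards = { "DB1", "DB2", "DB3", "DB4", "DB5", "DB6" };
    string db = shards[Math.Abs(customerId) % shards.Length];

    return "Data Source=server.database.windows.net;" +
           "Initial Catalog=" + db + ";" +      // connect directly to the shard database
           "User ID=user@server;Password=...;Encrypt=True;";
}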
Copies: Managing Lots of Read Access (diagram) • The application load-balances read traffic across multiple database copies (DBC1–DBC6), each placed on a different node of the SQL Azure cluster
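A minimal sketch of spreading reads across such copies, assuming the application keeps the list of copy names and picks one round-robin (the copy names and the round-robin policy are illustrative assumptions; uses System.Threading):

// Sketch: pick a database copy round-robin for read traffic.
// Copy names (DBC1..DBC6) and the round-robin policy are illustrative assumptions.
static int nextCopy;

static string GetReadCopyConnectionString()
{
    string[] copies = { "DBC1", "DBC2", "DBC3", "DBC4", "DBC5", "DBC6" };
    int i = Interlocked.Increment(ref nextCopy) & int.MaxValue;   // stay non-negative
    string db = copies[i % copies.Length];

    return "Data Source=server.database.windows.net;" +
           "Initial Catalog=" + db + ";" +
           "User ID=user@server;Password=...;Encrypt=True;";
}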
Getting Data In and Out • Go see Mark Scurrell's session: Using the Microsoft Sync Framework to Connect Apps to the Cloud (SVC23) • SQL Azure supports standard SQL data import and export patterns • Use bulk loading patterns where possible • BCP – console .EXE bulk load/export tool • SSIS – SQL Server Integration Services • Bulk APIs in ODBC and ADO.Net • SQL Azure supports data synchronization • With on-premises DBs and client stores
Getting Large Data into SQL Azure • Always good advice: • Break batches up into smaller, consumable chunks • Add retry and tracing logic to ensure a robust resume in the face of failures (see the sketch after the bulk copy example below)
Data Import: ADO.Net Bulk Copy API

// Bulk data import (conn is a connection string; SqlBulkCopy opens its own connection)
using (SqlBulkCopy bulk = new SqlBulkCopy(conn)
{
    DestinationTableName = "dbo.data",
    BatchSize = 2000,          // rows per batch (transaction size)
    BulkCopyTimeout = 10000,   // operation timeout, in seconds
    NotifyAfter = 1000         // rows between progress callbacks
})
{
    bulk.SqlRowsCopied += new SqlRowsCopiedEventHandler(myProgressCallback);
    bulk.WriteToServer(sourceDataReader);
}
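Combining that API with the chunking-and-retry advice above, a hedged sketch (the chunk source delegate, chunk count, batch size, and retry policy are illustrative assumptions):

// Sketch: load a large data set in smaller chunks, retrying a failed chunk once.
// The getChunkReader delegate, chunk count, and retry policy are illustrative assumptions.
static void LoadInChunks(string connStr, int totalChunks,
                         Func<int, IDataReader> getChunkReader)
{
    for (int chunk = 0; chunk < totalChunks; chunk++)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                using (IDataReader reader = getChunkReader(chunk))
                using (SqlBulkCopy bulk = new SqlBulkCopy(connStr)
                {
                    DestinationTableName = "dbo.data",
                    BatchSize = 2000
                })
                {
                    bulk.WriteToServer(reader);
                }
                break;                                   // chunk done; move to the next one
            }
            catch (SqlException)
            {
                if (attempt >= 2) throw;                 // give up after one retry
                Thread.Sleep(TimeSpan.FromSeconds(10));  // wait, then resume this chunk
            }
        }
    }
}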
SSIS demo
Data Export/Import: BCP.EXE

REM BCP example
SET SRV=somesrv.database.windows.net
SET LOGIN=mylogin@somesrv
SET PW=something
SET S_DR=C:\flats

bcp TPCH2.dbo.supplier in %S_DR%\supplier.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.nation in %S_DR%\nation.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.region in %S_DR%\region.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.customer in %S_DR%\customer.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.part in %S_DR%\part.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"

bcp TPCH2.dbo.supplier out %S_DR%\supplier.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.nation out %S_DR%\nation.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.region out %S_DR%\region.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.customer out %S_DR%\customer.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
bcp TPCH2.dbo.part out %S_DR%\part.tbl -c -U %LOGIN% -P %PW% -S %SRV% -t "|"
End to End Demo: Azure BCP Loader (diagram) • Browser loads blobs into Azure through the WebRole (PutBlob(), NewJob() into the Jobs and Blobs stores) • A WorkerRole picks up work (GetJob(), GetBlob()) and BCPs the data into the target DB • Reduced latency, improved throughput
Summary • Many SQL Server patterns apply to SQL Azure • Use SQL best practices wherever possible • Patterns discussed: • Connectivity (to database, not server) • Tracing and support • Batching, Pooling and Parameterization • Getting data in and out
YOUR FEEDBACK IS IMPORTANT TO US! Please fill out session evaluation forms online at MicrosoftPDC.com
Learn More On Channel 9 • Expand your PDC experience through Channel 9 • Explore videos, hands-on labs, sample code and demos through the new Channel 9 training courses channel9.msdn.com/learn Built by Developers for Developers….