
Esat Erkec

A case study of SQL Query tuning in SQL Server

Gaining experience in SQL query tuning can be difficult and complicated for database developers and administrators. For this reason, in this article we will work through a case study and learn, step by step, how to tune the performance of a problematic query. In this way, we will gain a practical understanding of how to approach query performance issues.

Prerequisites

In this article, we will use the AdventureWorks2017 sample database. Because this database is not big enough for meaningful performance tests, we will also use the Create Enlarged AdventureWorks Tables script to obtain enlarged versions of the SalesOrderHeader and SalesOrderDetail tables. After installing the AdventureWorks2017 database, we can execute the table-enlarging script.

Case Study: SQL Query tuning without creating a new index

Imagine that you are employed as a full-time database administrator in a company that is still using SQL Server 2017. You have received an e-mail from the software development team complaining about the performance of the following query.
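(The query itself appears as a screenshot in the original article. Based on the details discussed below, the dbo.ufnGetStock scalar function, a LEN filter on the CreditCardApprovalCode column, and an ORDER BY clause, a representative reconstruction looks roughly like this.)

    SELECT SOH.SalesOrderID,
           SOH.OrderDate,
           SOD.ProductID,
           dbo.ufnGetStock(SOD.ProductID) AS StockQty   -- scalar UDF invoked per row
    FROM Sales.SalesOrderHeaderEnlarged SOH
        INNER JOIN Sales.SalesOrderDetailEnlarged SOD
            ON SOH.SalesOrderID = SOD.SalesOrderID
    WHERE LEN(SOH.CreditCardApprovalCode) > 10
    ORDER BY SOH.OrderDate;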

Your objective is to improve the performance of the above query without creating a new index on the tables, but you are allowed to re-write the query.

The first step of SQL query tuning: Identify the problems

Firstly, we will enable the actual execution plan in SSMS and execute the problematic query. Using the actual execution plan is the best approach for analyzing a query because the actual plan includes accurate statistics and runtime information about the query. However, if a query takes a very long time to run, we can refer to the estimated execution plan instead. With that said, let's examine the select operator of the execution plan.

  • Interpreting execution plans of T-SQL queries
  • Main Concepts of SELECT operators in SQL Server execution plans

The ElapsedTime attribute indicates the execution time of a query, and from this value we can see that the query completed in 142 seconds. For this query, we also see the UdfElapsedTime attribute, which indicates how long the database engine spent invoking the user-defined functions in the query. For this particular query the two elapsed times are very close, so we can deduce that the user-defined function is likely a major problem.

Select operator details in the execution plan

Another point to take into consideration for this query is parallelism. The Estimated Subtree Cost value exceeds the Cost Threshold for Parallelism setting of the server, but the query optimizer does not generate a parallel execution plan because of the scalar function: scalar functions prevent the query optimizer from generating a parallel plan.

Why does a query not generate a parallel execution plan?

The last problem with this query is a tempdb spill issue, which is indicated by warning signs in the execution plan.

Analyze an execution plan for SQL query tuning

Outdated statistics, poorly written queries, and ineffective index usage can all cause tempdb spill issues.

Improve the performance of the scalar function in a query

Scalar functions can be performance killers for queries, and that is exactly the case for our sample query. Scalar functions are invoked by SQL Server for every row of the result set. Another problem related to scalar functions is the black-box problem: the query optimizer has no idea about the code inside a scalar function, so it does not consider the cost impact of the scalar function on the query.

A new feature announced with SQL Server 2019, Scalar UDF Inlining, can help overcome most of the performance issues associated with scalar functions. However, since we are using an earlier version of SQL Server, we should inline the scalar function's logic into the query explicitly where possible. The common method is to transform the scalar function into a subquery and attach it to the query with the help of the CROSS APPLY operator. When we look inside the ufnGetStock function, we can see that it sums product quantities by ProductID for one specific LocationID.
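For reference, the body of this function in the AdventureWorks sample database is essentially the following:

    CREATE FUNCTION [dbo].[ufnGetStock](@ProductID [int])
    RETURNS [int]
    AS
    BEGIN
        -- Returns the stock level for the product at one specific location
        DECLARE @ret int;

        SELECT @ret = SUM(p.[Quantity])
        FROM [Production].[ProductInventory] p
        WHERE p.[ProductID] = @ProductID
            AND p.[LocationID] = '6';

        IF (@ret IS NULL)
            SET @ret = 0;

        RETURN @ret;
    END;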

Scalar functions affect SQL query tuning negatively

We can transform and inline the ufnGetStock scalar function as shown below. In this way, we ensure that our sample query can run in parallel, and it will be faster than the first version of the query.
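A sketch of the rewrite, moving the function's logic into a CROSS APPLY subquery (the surrounding column list follows the reconstruction shown earlier):

    SELECT SOH.SalesOrderID,
           SOH.OrderDate,
           SOD.ProductID,
           ISNULL(Stock.StockQty, 0) AS StockQty   -- replicates the function's NULL-to-zero handling
    FROM Sales.SalesOrderHeaderEnlarged SOH
        INNER JOIN Sales.SalesOrderDetailEnlarged SOD
            ON SOH.SalesOrderID = SOD.SalesOrderID
        CROSS APPLY (SELECT SUM(p.Quantity) AS StockQty
                     FROM Production.ProductInventory p
                     WHERE p.ProductID = SOD.ProductID
                           AND p.LocationID = '6') AS Stock
    WHERE LEN(SOH.CreditCardApprovalCode) > 10
    ORDER BY SOH.OrderDate;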

This query completed in 71 seconds, and when we look at the execution plan, we now see a parallel execution plan. However, the tempdb spill issue persists. This clearly shows that we need to expend more effort and try new methods to overcome the tempdb spill problem.

Tempdb spill issue affects SQL query tuning negatively

Think more creatively for SQL query tuning

To get rid of the tempdb spill issue, we will create a temporary table and insert all rows into it. Temporary tables offer very flexible usage, so we can add a computed column that replaces the LEN function in the WHERE clause. The insert query is shown below.
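A sketch of the temporary table and the insert; the column list is an assumption based on the reconstruction above:

    CREATE TABLE #SalesOrder
    (
        SalesOrderID            int,
        OrderDate               datetime,
        ProductID               int,
        StockQty                int,
        CreditCardApprovalCode  varchar(15),
        ApprovalCodeLen AS LEN(CreditCardApprovalCode)   -- computed column replaces LEN() in the WHERE clause
    );

    INSERT INTO #SalesOrder WITH (TABLOCK)   -- TABLOCK allows a parallel insert
        (SalesOrderID, OrderDate, ProductID, StockQty, CreditCardApprovalCode)
    SELECT SOH.SalesOrderID,
           SOH.OrderDate,
           SOD.ProductID,
           ISNULL(Stock.StockQty, 0),
           SOH.CreditCardApprovalCode
    FROM Sales.SalesOrderHeaderEnlarged SOH
        INNER JOIN Sales.SalesOrderDetailEnlarged SOD
            ON SOH.SalesOrderID = SOD.SalesOrderID
        CROSS APPLY (SELECT SUM(p.Quantity) AS StockQty
                     FROM Production.ProductInventory p
                     WHERE p.ProductID = SOD.ProductID
                           AND p.LocationID = '6') AS Stock;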

When we analyze this query, we can see the TABLOCK hint after the INSERT statement. The purpose of this hint is to enable the parallel insert option so that we can gain more performance. This can be seen in the execution plan.

SQL query tuning and parallel insert

In this way, we inserted 1,286,520 rows into the temporary table in just one second. However, the temporary table still holds more data than we need, because we did not apply the filter on CreditCardApprovalCode character length (greater than 10) during the insert operation. At this point, we apply a little trick and delete the rows whose character length is less than or equal to 10. After the insert statement, we add the following delete statement so that only the qualifying records remain in the temp table.
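The delete, using the computed column (again a sketch based on the temp table above):

    DELETE FROM #SalesOrder
    WHERE ApprovalCodeLen <= 10;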

SQL Query tuning: Using indexes to improve sort performance

When we design an effective index for a query that includes an ORDER BY clause, the execution plan does not need a sort operator because the relevant index already returns the rows in the required order. Building on this idea, we can create a non-clustered index that satisfies the sort requirement. The important point about this SQL query tuning practice is that the index must actually remove the sort operator, and its benefit must outweigh its overhead. The following index helps eliminate the sort operation in the execution plan.
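Assuming the result set is ordered by OrderDate, an index on the temporary table (so the no-new-index constraint on the base tables still holds) might look like this:

    CREATE NONCLUSTERED INDEX IX_Sort
    ON #SalesOrder (OrderDate)
    INCLUDE (SalesOrderID, ProductID, StockQty, CreditCardApprovalCode);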

Now, we execute the following query and then examine the execution plan of the select statement.
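(The final select, again reconstructed:)

    SELECT SalesOrderID, OrderDate, ProductID, StockQty, CreditCardApprovalCode
    FROM #SalesOrder
    ORDER BY OrderDate;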

Improve sort operator performance with an index

As we can see in the execution plan, the database engine used the IX_Sort index to access the records, and it did not need a sort operator because the rows are already in sorted order. In the properties of the index scan operator, we see an attribute named Scan Direction.

Non-clustered index scan direction

The scan direction attribute tells us that SQL Server used the b-tree structure to read the rows from beginning to end at the leaf level. At the same time, this index helps us overcome the tempdb spill issue.

Non-clustered index structure and scan direction

Finally, we see that the query execution time was reduced from 220 seconds to 33 seconds.

In this article, we learned practical details about SQL query tuning, and these techniques can help when you try to solve a query performance problem. In our case study, the problematic query contained three main problems. These are:

  • Scalar function problem
  • Using a serial execution plan
  • Tempdb spill issue

First, we transformed the scalar function into a subquery and implemented it in the query with the CROSS APPLY operator. In the second step, we eliminated the tempdb spill problem by using a temporary table and an index on it. As a result, the performance of the query improved significantly.


SQL DBA School

Case Studies and Real-World Scenarios

Case Study 1: Query Optimization

A financial institution noticed a significant performance slowdown in their central database application, affecting their ability to serve customers promptly. After monitoring and analyzing SQL Server performance metrics, the IT team found that a specific query, part of a core banking operation, took much longer than expected.

Using SQL Server’s Query Execution Plan feature, they found that the query was doing a full table scan on a large transaction table. The team realized the query could be optimized by adding an index on the columns involved in the WHERE clause. After adding the index and testing, the query’s execution time dropped significantly, resolving the application slowdown.

Case Study 2: TempDB Contention

An online retail company was experiencing sporadic slowdowns during peak times, which affected its website's responsiveness. Performance monitoring revealed that the tempdb database was experiencing latch contention, a common performance problem.

The company’s DBA team divided the tempDB into multiple data files equal to the number of logical cores, up to eight, as recommended by Microsoft. This reduced contention and improved the performance of operations using the tempDB.
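A minimal sketch of this kind of change; file names, paths, and sizes below are illustrative, not the company's actual values:

    -- Add equally sized tempdb data files (one per logical core, up to eight)
    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 8GB, FILEGROWTH = 512MB);

    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdev3.ndf', SIZE = 8GB, FILEGROWTH = 512MB);

    -- Keeping every file the same size lets proportional fill spread allocations evenly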

Case Study 3: Inefficient Use of Hardware Resources

A software development company was experiencing poor performance on their SQL Server, despite running on a high-end server with ample resources. Performance metrics showed that SQL Server was not utilizing all the available CPU cores and memory.

Upon investigation, the team found that SQL Server was running on default settings, which did not allow it to utilize all available resources. By adjusting SQL Server configuration settings, such as max degree of parallelism (MAXDOP) and cost threshold for parallelism, they were able to allow SQL Server to better use the available hardware, significantly improving server performance.
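As a rough sketch, such settings are typically adjusted with sp_configure; the values below are illustrative and depend on core count and workload:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    EXEC sp_configure 'max degree of parallelism', 8;        -- cap a single parallel query at 8 cores
    EXEC sp_configure 'cost threshold for parallelism', 50;  -- parallelize only expensive queries
    RECONFIGURE;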

Case Study 4: Database Locking Issues

A large manufacturing company’s ERP system started to experience slowdowns that were affecting their production lines. The IT department, upon investigation, found that there were blocking sessions in their SQL Server database, causing delays.

Using SQL Server’s built-in reports for “All Blocking Transactions” and “Top Sessions,” they found a poorly designed stored procedure holding locks for extended periods, causing other transactions to wait. After refactoring the stored procedure to hold locks for as short a time as possible, the blocking issue was resolved, and the system’s performance returned to normal.

These case studies represent common scenarios in SQL Server performance tuning. The specifics can vary, but the process of identifying the problem, isolating the cause, and resolving the issue remains the same.

Case Study 5: Poor Indexing Strategy

A hospital’s patient records system began to experience performance issues over time. The system was built on a SQL Server database and took longer and longer to pull up patient records. The IT team noticed that the database had grown significantly larger over the years due to increased patient volume.

They used SQL Server’s Dynamic Management Views (DMVs) to identify the most expensive queries in terms of I/O. The team found that the most frequently used queries lacked appropriate indexing, causing SQL Server to perform costly table scans.

They worked on a comprehensive indexing strategy, including creating new indexes and removing unused or duplicate ones. They also set up periodic index maintenance tasks (rebuilding or reorganizing) to keep the indexes optimized. After these changes, the time to retrieve patient records improved dramatically.

Case Study 6: Outdated Statistics

An e-commerce platform was dealing with sluggish performance during peak shopping hours. Their SQL Server-based backend was experiencing slow query execution times. The DBA team found that several execution plans were inefficient even though there were appropriate indexes.

After further investigation, they discovered that the statistics for several large tables in the database were outdated. SQL Server uses statistics to create the most efficient query execution plans; with outdated statistics, it was generating poor execution plans, leading to performance degradation.

The team updated the statistics for these tables and set up an automatic statistics update job to run during off-peak hours. This change brought a noticeable improvement in the overall system responsiveness during peak hours.
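A sketch of the kind of statement such a job would run (the table name is hypothetical):

    -- Refresh statistics on a critical table with a full scan...
    UPDATE STATISTICS dbo.SalesOrders WITH FULLSCAN;

    -- ...or refresh every statistics object in the database during off-peak hours
    EXEC sp_updatestats;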

Case Study 7: Memory Pressure

A cloud-based service provider was experiencing erratic performance issues on their SQL Server databases. The database performance would degrade severely at certain times, affecting all their customers.

Performance monitoring revealed that SQL Server was experiencing memory pressure during these periods. It turned out that the SQL Server instance was hosted on a shared virtual machine, and other applications used more memory during specific times, leaving SQL Server starved for resources.

The team decided to move SQL Server to a dedicated VM where it could have all the memory it needed. They also tweaked the ‘min server memory’ and ‘max server memory’ configurations to allocate memory to SQL Server optimally. This reduced memory pressure, and the erratic performance issues were solved.

Case Study 8: Network Issues

A multinational company with several branches worldwide had a centralized SQL Server-based application. The branch offices complained about slow performance, while the head office had no issues.

Upon investigation, it turned out to be a network latency issue. The branches that were geographically far from the server had higher latency, which resulted in slow performance. The company used a Content Delivery Network (CDN) to cache static content closer to remote locations. They also implemented database replication to create read replicas in each geographical region. This reduced network latency and improved the application performance for all branches.

These examples demonstrate the wide range of potential SQL Server performance tuning issues. The key to effective problem resolution is a thorough understanding of the system, systematic troubleshooting, and the application of appropriate performance-tuning techniques.

Case Study 9: Bad Parameter Sniffing

An insurance company’s SQL Server database was experiencing fluctuating performance. Some queries ran fast at times, then slowed down unexpectedly. This inconsistent behavior impacted the company’s ability to process insurance claims efficiently.

After studying the execution plans and the SQL Server’s cache, the DBA team discovered that the issue was due to bad parameter sniffing. SQL Server uses parameter sniffing to create optimized plans based on the parameters passed the first time a stored procedure is compiled. However, if later queries have different data distributions, the initial execution plan might be suboptimal.

To resolve this, they used OPTIMIZE FOR UNKNOWN query hint for the stored procedure parameters, instructing SQL Server to use statistical data instead of the initial parameter values to build an optimized plan. After implementing this, the fluctuating query performance issue was resolved.
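A sketch of the hint in place, using a hypothetical procedure and table:

    CREATE OR ALTER PROCEDURE dbo.GetClaimsByState @StateCode char(2)
    AS
    BEGIN
        SELECT ClaimID, ClaimAmount, ClaimDate
        FROM dbo.Claims
        WHERE StateCode = @StateCode
        OPTION (OPTIMIZE FOR UNKNOWN);  -- plan from average density, not the first sniffed value
    END;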

Case Study 10: Inadequate Disk I/O

An online gaming company started receiving complaints about slow game loading times. The issue was traced back to their SQL Server databases. Performance metrics showed that the disk I/O subsystem was a bottleneck, with high disk queue length and disk latency.

Upon investigation, they found that all their databases were hosted on a single, slower disk. To distribute the I/O load, they moved their TempDB and log files to separate, faster SSD drives. They also enabled Instant File Initialization (IFI) for data files to speed up the creation and growth of data files. These changes significantly improved disk I/O performance and reduced game loading times.

Case Study 11: SQL Server Fragmentation

A logistics company’s SQL Server database began to experience slower data retrieval times. Their system heavily relied on GPS tracking data, and they found that fetching this data was becoming increasingly slower.

The DBA team discovered high fragmentation on the GPS tracking data table, which had frequent inserts and deletes. High fragmentation can lead to increased disk I/O and degraded performance. They implemented a routine maintenance plan that reorganized or rebuilt indexes depending on the fragmentation level, and they tuned fill factor settings to reduce future fragmentation. This greatly improved data retrieval times.

Case Study 12: Excessive Compilation and Recompilation

A web hosting provider had a SQL Server database with high CPU usage, even though no heavy queries were running and the server was not low on resources.

The DBA team found that the issue was due to excessive compilations and recompilations of queries. SQL Server compiles queries into execution plans, which can be CPU intensive. When queries are frequently compiled and recompiled, it can lead to high CPU usage.

They discovered that the application used non-parameterized queries, which led SQL Server to compile a new plan for each query. They worked with the development team to modify the application to use parameterized queries or stored procedures, allowing SQL Server to reuse existing execution plans and thus reducing CPU usage.

These cases emphasize the importance of deep knowledge of SQL Server internals, observant monitoring, and a systematic approach to identifying and resolving performance issues.

Case Study 13: Database Auto-growth Misconfiguration

A social media company faced performance issues on its SQL Server database during peak usage times. Their IT team noticed that the performance drops coincided with auto-growth events on the database.

SQL Server databases are configured by default to grow automatically when they run out of space. However, this operation is I/O intensive and can cause performance degradation if it happens during peak times.

The team decided to manually grow the database during off-peak hours to a size that could accommodate several months of data growth. They also configured auto-growth to a fixed amount rather than a percentage, to avoid increasingly large growth operations as the database size increased. This prevented auto-growth operations from occurring during peak times, improving overall performance.
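Sketched with a hypothetical database and file name:

    -- Pre-grow the data file during off-peak hours...
    ALTER DATABASE SocialDB
    MODIFY FILE (NAME = SocialDB_Data, SIZE = 500GB);

    -- ...and switch auto-growth from a percentage to a fixed increment
    ALTER DATABASE SocialDB
    MODIFY FILE (NAME = SocialDB_Data, FILEGROWTH = 1024MB);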

Case Study 14: Unoptimized Cursors

A travel booking company’s SQL Server application was suffering from poor performance. The application frequently timed out during heavy load times, frustrating their users.

Upon analysis, the DBA team found that the application heavily used SQL Server cursors. Cursors perform poorly compared to set-based operations as they process one row at a time.

The team worked with the developers to refactor the application code to use set-based operations wherever possible. They also ensured that the remaining cursors were correctly optimized. The change resulted in a significant improvement in application performance.

Case Study 15: Poorly Configured SQL Server Instance

An IT service company deployed a new SQL Server instance for one of their clients, but the client reported sluggish performance. The company’s DBA team checked the server and found it was not correctly configured.

The server was running on the default SQL Server settings, which weren’t optimized for the client’s workload. The team performed a series of optimizations, including:

  • Configuring the ‘max server memory’ option to leave enough memory for the OS.
  • Setting ‘max degree of parallelism’ to limit the number of processors used for parallel plan execution.
  • Enabling ‘optimize for ad hoc workloads’ to improve the efficiency of the plan cache.

After these changes, the SQL Server instance ran much more efficiently, and the client reported a noticeable performance improvement.

Case Study 16: Lack of Partitioning in Large Tables

A telecommunications company stored call records in a SQL Server database. The call records table was huge, with billions of rows, which caused queries to take a long time to run.

The DBA team decided to implement table partitioning. They partitioned the call records table by date, a standard filter condition in their queries. This allowed SQL Server to eliminate irrelevant partitions and only scan the necessary data when running queries. As a result, query performance improved dramatically.
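A simplified sketch of date partitioning (monthly boundaries shown, names hypothetical):

    CREATE PARTITION FUNCTION pfCallDate (date)
        AS RANGE RIGHT FOR VALUES ('2023-01-01', '2023-02-01', '2023-03-01');

    CREATE PARTITION SCHEME psCallDate
        AS PARTITION pfCallDate ALL TO ([PRIMARY]);

    CREATE TABLE dbo.CallRecords
    (
        CallID      bigint NOT NULL,
        CallDate    date   NOT NULL,
        DurationSec int,
        CONSTRAINT PK_CallRecords PRIMARY KEY (CallDate, CallID)  -- partition column in the key
    ) ON psCallDate (CallDate);

With this layout, a query filtering on CallDate touches only the matching partitions instead of the whole table.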

In all these cases, thorough investigation and an in-depth understanding of SQL Server’s features and best practices led to performance improvement. Regular monitoring and proactive optimization are crucial to preventing performance problems and ensuring the smooth operation of SQL Server databases.

Case Study 17: Inappropriate Data Types

An educational institution’s student management system, built on a SQL Server database, suffered from slow performance when dealing with student records. The IT department discovered that the database design included many columns with data types that were larger than necessary.

For instance, student ID numbers were stored as NVARCHAR(100) even though they were always 10-digit numbers. This wasted space and slowed down queries due to the increased data size. The IT team redesigned the database schema to use more appropriate data types and converted the existing data. The database size was significantly reduced, and query performance improved.

Case Study 18: Lack of Database Maintenance

A software firm’s application was facing intermittent slow performance issues. The application was built on a SQL Server database which had not been maintained properly for a long time.

The DBA team discovered that several maintenance tasks, including index maintenance and statistics updates, had been neglected. High index fragmentation and outdated statistics were causing inefficient query execution. They implemented a regular maintenance plan, including index defragmentation and statistics updates, which helped improve the query performance.

Case Study 19: Deadlocks

A stock trading company faced frequent deadlock issues in their SQL Server database, affecting their trading operations. Deadlocks occur when two or more tasks permanently block each other, each holding a lock on a resource that the other tasks are trying to lock.

Upon reviewing the deadlock graph (a tool provided by SQL Server to analyze deadlocks), the DBA team found that specific stored procedures accessed tables in different orders. They revised the stored procedures to access tables in the same order, and they introduced error-handling logic to retry the operation in case of a deadlock. This reduced the occurrence of deadlocks and improved the application’s stability.
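A sketch of such retry logic; error 1205 is the deadlock-victim error:

    DECLARE @retries int = 3;
    WHILE @retries > 0
    BEGIN
        BEGIN TRY
            BEGIN TRANSACTION;
            -- ... access tables in the same order as every other procedure ...
            COMMIT TRANSACTION;
            SET @retries = 0;                 -- success: leave the loop
        END TRY
        BEGIN CATCH
            IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
            IF ERROR_NUMBER() = 1205 AND @retries > 1
                SET @retries = @retries - 1;  -- chosen as deadlock victim: retry
            ELSE
                THROW;                        -- any other error: re-raise
        END CATCH;
    END;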

Case Study 20: Improper Use of SQL Server Functions

A retail company’s inventory management system was suffering from poor performance. The DBA team, upon investigation, discovered that a critical query was using a scalar-valued function that contained a subquery.

Scalar functions can cause performance issues by forcing SQL Server to perform row-by-row operations instead of set-based ones. They refactored the query to eliminate the scalar function and replaced the subquery with a join operation. This change significantly improved the performance of the critical query.

In all these situations, the DBA teams first had to understand the problem, investigate the cause, and apply appropriate techniques to resolve the issues. Understanding SQL Server internals and keeping up with its best practices is vital for the smooth functioning of any application built on SQL Server.

Case Study 21: Excessive Use of Temp Tables

A media company faced a slow response time in its content management system (CMS). A SQL Server database powered this CMS. The system became particularly sluggish during peak hours when content-related activities surged.

Upon investigating, the DBA team found that several stored procedures excessively used temporary tables for intermediate calculations. While temporary tables can be handy, their excessive use can increase I/O on tempDB, leading to slower performance.

The team revised these stored procedures to minimize the use of temporary tables. Wherever possible, they used table variables or derived tables, which often have lower overhead. After the optimization, the CMS significantly improved performance, especially during peak hours.

Case Study 22: Frequent Table Scans

An e-commerce company experienced a gradual decrease in its application performance. The application was backed by a SQL Server database, which was found to be frequently performing table scans on several large tables upon investigation.

Table scans can be resource-intensive, especially for large tables, as they involve reading the entire table to find relevant records. Upon closer examination, the DBA team realized that many of the queries issued by the application did not have appropriate indexes.

The team introduced well-thought-out indexes on the tables and made sure the application queries were written to utilize these indexes. After these adjustments, the application performance improved significantly, with most queries executing much faster due to the reduced number of table scans.

Case Study 23: Unoptimized Views

A financial institution noticed slow performance in their loan processing application. This application relied on several complex views in a SQL Server database.

On review, the DBA team found that these views were not optimized. Some views were nested within other views, creating multiple layers of complexity, and some were returning more data than needed, including columns not used by the application.

They flattened the views to remove the unnecessary nesting and adjusted them to return only the required data. They also created indexed views for the ones most frequently used. These optimizations significantly improved the performance of the loan processing application.

Case Study 24: Log File Management Issues

A data analytics firm was facing a slowdown in their SQL Server-based data processing tasks. On investigation, the DBA team discovered that the log file for their central database was becoming extremely large, causing slow write operations.

The team found that the recovery model for the database was set to Full, yet no transaction log backups were being taken. In the Full recovery model, the transaction log continues to grow until a log backup is taken. They set up regular transaction log backups to control the log file size, and they moved the log file to a faster disk to improve write speed. These changes helped speed up the data processing tasks.
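A sketch of the recurring log backup (database name and path are hypothetical); scheduled, for example, through a SQL Server Agent job, this is what keeps a Full-recovery log file in check:

    BACKUP LOG CentralDB
    TO DISK = N'E:\SQLBackups\CentralDB_log.trn'
    WITH COMPRESSION;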

In all these situations, systematic problem identification, root cause analysis, and applying the appropriate solutions were vital to improving SQL Server performance. Regular monitoring, preventive maintenance, and understanding SQL Server’s working principles are crucial in maintaining optimal database performance.

Case Study 25: Locking and Blocking Issues

A healthcare institution’s patient management system, running on a SQL Server database, was encountering slow performance. This was especially noticeable when multiple users were updating patient records simultaneously.

Upon investigation, the DBA team identified locking and blocking as the root cause. In SQL Server, when a transaction modifies data, locks are placed on the data until the transaction is completed to maintain data integrity. However, excessive locking can lead to blocking, where other transactions must wait until the lock is released.

To reduce the blocking issues, the team implemented row versioning-based isolation levels (like Snapshot or Read Committed Snapshot Isolation). They also optimized the application code to keep transactions as short as possible, thus reducing the time locks were held. These steps significantly improved the system’s performance.
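A sketch of enabling RCSI (the database name is hypothetical; the switch needs a moment of exclusive access to the database):

    ALTER DATABASE PatientDB
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;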

Case Study 26: Outdated Statistics

An online marketplace experienced slow performance with its product recommendation feature. The feature relied on a SQL Server database that contained historical sales data.

The DBA team identified that the statistics on the sales data table were outdated. SQL Server uses statistics to create efficient query plans. However, if the statistics are not up-to-date, SQL Server may choose sub-optimal query plans.

The team implemented a routine job to update statistics more frequently. They also enabled the ‘Auto Update Statistics’ option on the database to ensure statistics were updated automatically when necessary. This led to an immediate improvement in the recommendation feature’s performance.

Case Study 27: Non-Sargable Queries

A sports statistics website saw a decrease in its website performance, especially when visitors were querying historical game statistics. A SQL Server database backed their site.

Upon reviewing the SQL queries, the DBA team found several non-sargable queries. These queries cannot take full advantage of indexes due to how they are written (e.g., using functions on the column in the WHERE clause).

The team worked with the developers to rewrite these queries in a sargable manner, ensuring they could fully use the indexes. This led to a substantial increase in query performance and improved the website’s speed.
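A classic before-and-after illustration of such a rewrite, with hypothetical names:

    -- Non-sargable: the function wrapped around the column hides any index on GameDate
    SELECT GameID, HomeScore, AwayScore
    FROM dbo.GameStats
    WHERE YEAR(GameDate) = 2019;

    -- Sargable rewrite: a range predicate on the bare column can seek on an index
    SELECT GameID, HomeScore, AwayScore
    FROM dbo.GameStats
    WHERE GameDate >= '2019-01-01'
      AND GameDate <  '2020-01-01';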

Case Study 28: Over-Normalization

An HR application backed by a SQL Server database ran slowly, particularly when generating reports. The database schema was highly normalized, following the principles of reducing data redundancy.

However, the DBA team found that over-normalization led to excessive JOIN operations, resulting in slow query performance. They implemented denormalization in certain areas, introducing calculated and redundant fields where it made sense. This reduced the need for some JOIN operations and improved the application’s overall performance.

These cases show that performance troubleshooting in SQL Server involves understanding various components and how they interact. Addressing performance problems often requires a comprehensive approach, combining database configuration, query tuning, hardware adjustments, and, occasionally, changes to application design.

Case Study 29: Poor Query Design

A manufacturing company’s inventory management system was experiencing slow performance, especially when generating specific reports. The system was built on a SQL Server database.

The DBA team found that some queries used in report generation were poorly designed. They used SELECT * statements, which return all columns from the table, even though only a few columns were needed. This caused unnecessary data transfer and slowed down the performance.

The team revised these queries to fetch only the necessary columns. They also made other optimizations, such as avoiding unnecessary nested queries and replacing correlated subqueries with more efficient JOINs. These changes significantly improved the performance of the report generation process.

Case Study 30: Inefficient Indexing

A logistics company’s tracking system, running on a SQL Server database, was experiencing slow performance. Users were complaining about long loading times when tracking shipments.

Upon investigation, the DBA team discovered that the main shipment table in the database was not optimally indexed. Some critical queries didn’t have corresponding indexes, leading to table scans, while some existing indexes were barely used.

The DBA team created new indexes based on the query patterns and removed the unused ones. They also kept the indexing balanced, as excessive indexing can hurt performance by slowing down data modifications. After these indexing changes, the tracking system’s performance noticeably improved.

Case Study 31: Network Latency

A multinational corporation used a SQL Server database hosted in a different geographical location from the main user base. Users were experiencing slow response times when interacting with the company’s internal applications.

The IT team identified network latency as a critical issue. The physical distance between the server and the users was causing a delay in data transfer.

To solve this, they used SQL Server’s Always On Availability Groups feature to create a secondary replica of the database closer to the users. The read-only traffic was then directed to this local replica, reducing the impact of network latency and improving application response times.

Case Study 32: Resource-Intensive Reports

A fintech company ran daily reports on their SQL Server database during business hours. These reports were resource-intensive and caused the application performance to degrade when they were running.

The DBA team offloaded the reporting workload to a separate reporting server using SQL Server’s transaction replication feature. This ensured that the resource-intensive reports didn’t impact the performance of the primary server. They also scheduled the reports during off-peak hours to minimize user impact. This significantly improved the overall application performance during business hours.

These case studies underline the necessity of a proactive and comprehensive approach to managing SQL Server performance. Regular monitoring, appropriate database design, optimized queries, and a good understanding of how the database interacts with hardware and network can go a long way in maintaining optimal performance.

Case Study 33: Application with Heavy Write Operations

A social media application powered by a SQL Server database was facing slow performance due to a high volume of write operations from user posts, likes, and comments.

The DBA team found that the frequent write operations were causing high disk I/O, slowing down the application performance. They decided to use In-Memory OLTP, a feature in SQL Server designed for high-performance transactional workloads, by migrating the most frequently accessed tables to memory-optimized tables.

The team also introduced natively compiled stored procedures for the most common operations. In-memory OLTP significantly improved the write operation speed and overall application performance.

Case Study 34: Large Transactional Tables with No Archiving

A telecom company’s billing system was experiencing performance degradation over time. The system was built on a SQL Server database and retained years of historical data in the main transactional tables.

The DBA team found that the large size of the transactional tables was leading to slow performance, especially for queries involving range or full table scans. They introduced a data archiving strategy, moving older data to separate archive tables and keeping only recent data in the main transactional tables.

This reduced the transactional tables’ size, leading to faster queries and improved performance. In addition, it made maintenance tasks such as backups and index rebuilds quicker and less resource-intensive.

Case Study 35: Suboptimal Storage Configuration

A gaming company’s game-state tracking application was experiencing slow response times. A SQL Server database backed the application.

Upon investigation, the DBA team discovered that the database files were spread across multiple disks in a way that was not optimizing I/O performance. Some of the heavily used database files were located on slower disks.

The team reconfigured the storage, placing the most frequently accessed database files on SSDs (Solid State Drives) to benefit from their higher speed. They also ensured that data files and log files were separated onto different disks to balance the I/O load. After these adjustments, the application’s performance improved noticeably.

Case Study 36: Inefficient Use of Cursors

A government department’s record-keeping system, built on a SQL Server database, ran slow. The system was particularly sluggish when executing operations involving looping over large data sets.

The DBA team identified that the system used SQL Server cursors to perform these operations. Cursors are database objects used to manipulate rows a query returns on a row-by-row basis. However, they can be inefficient compared to set-based operations.

The team rewrote these operations to use set-based operations, replacing cursors with joins, subqueries, or temporary tables. These changes significantly improved the efficiency and performance of the data looping operations.

Each case study presents a unique scenario and solution, highlighting that SQL Server performance tuning can involve many factors. From the application design to the database schema, from the hardware configuration to the SQL Server settings – each aspect can significantly impact performance. By taking a methodical approach to identifying and addressing performance bottlenecks, it is possible to achieve substantial improvements.

Case Study 37: Use of Entity Framework without Optimization

A logistics company’s web application, backed by a SQL Server database, was experiencing slow load times. The application was built using .NET’s Entity Framework (EF), which allows developers to interact with the database using .NET objects.

Upon review, the DBA team found that the Entity Framework was not optimally configured. For instance, “lazy loading” was enabled, which can lead to performance problems due to excessive and unexpected queries.

The team worked with developers to make necessary optimizations, like turning off lazy loading and using eager loading where appropriate, filtering data at the database level instead of the application level, and utilizing stored procedures for complex queries. After these optimizations, the web application’s performance significantly improved.

Case Study 38: Poorly Defined Data Types

An e-commerce platform was noticing slow performance when processing transactions. The platform’s backend was a SQL Server database.

The DBA team discovered that some of the columns in the transaction table were using data types larger than necessary. For instance, a column storing a small range of values used an INT data type when a TINYINT would suffice.

They adjusted the data types to match the data being stored more closely. This reduced the storage space and memory used by these tables, resulted in faster queries, and improved overall performance.

Case Study 39: Fragmented Indexes

A banking application was experiencing slow response times during peak usage hours. The application’s data was stored in a SQL Server database.

Upon reviewing the database, the DBA team found that the indexes on several critical tables were heavily fragmented. Index fragmentation can happen over time as data is added, updated, or deleted, leading to decreased query performance.

The DBA team implemented a regular maintenance plan to rebuild or reorganize fragmented indexes. They also adjusted some indexes’ fill factors to leave more free space and reduce future fragmentation. These steps led to improved query performance and faster response times for the banking application.
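Sketched with a hypothetical index; a common rule of thumb is to reorganize between roughly 5% and 30% fragmentation and rebuild above that:

    -- Lightweight operation, always online
    ALTER INDEX IX_Accounts_CustomerID ON dbo.Accounts REORGANIZE;

    -- Full rebuild, leaving 10% free space per page to absorb future inserts
    ALTER INDEX IX_Accounts_CustomerID ON dbo.Accounts
    REBUILD WITH (FILLFACTOR = 90);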

Case Study 40: Misconfigured Memory Settings

A CRM system was running slow, especially during data-heavy operations. The system was running on a SQL Server database.

Upon checking the SQL Server settings, the DBA team found that the maximum server memory was not correctly configured. The server was not utilizing the available memory to its full potential, which can impact SQL Server’s performance.

The team adjusted the memory settings to allow SQL Server to use more of the available memory, leaving enough memory for the operating system and other applications. This allowed more data to be kept in memory, reducing disk I/O and improving SQL Server performance.

These case studies further illustrate that performance tuning in SQL Server requires a multifaceted approach involving the database system and the related applications. Regular monitoring and maintenance and a good understanding of SQL Server’s working principles are essential in ensuring optimal database performance.

Case Study 41: Underutilized Parallelism

An analytics company was struggling with slow data processing times. They had a SQL Server database running on multi-core processors, but performance was not what they expected.

The DBA team found that the server’s parallelism settings were not optimally configured. The “max degree of parallelism” (MAXDOP) setting, which controls how many processors SQL Server can use for single query execution, was set to 1, which meant SQL Server was not fully utilizing the available cores.

The team adjusted the MAXDOP setting to a more appropriate value considering the number of available cores and the workload characteristics. This allowed SQL Server to execute large queries more efficiently by spreading the work across multiple cores, improving data processing times.

Case Study 42: Bad Parameter Sniffing

An insurance company’s application was experiencing sporadic slow performance. The application was built on a SQL Server database and used stored procedures extensively.

Upon investigation, the DBA team discovered that the performance issues were due to “bad parameter sniffing.” SQL Server can create sub-optimal execution plans for stored procedures based on the parameters of the first execution, which may not work for subsequent executions with different parameters.

The team implemented the OPTION (RECOMPILE) query hint for the problematic stored procedures to force SQL Server to generate a new execution plan for each execution. They also used parameter masking for some procedures. This helped avoid bad parameter sniffing and improved the application’s performance consistency.

Case Study 43: Auto-Shrink Enabled

A retail company’s inventory management system, backed by a SQL Server database, was experiencing performance problems, slowing down irregularly.

The DBA team found that the “auto-shrink” option was enabled on the database. Auto-shrink can cause performance issues because it is resource-intensive and can lead to index fragmentation.

The team disabled auto-shrink and implemented a proper database size management strategy, manually shrinking the database only when necessary and immediately reorganizing indexes afterward. This resolved the irregular performance slowdowns and stabilized the system’s performance.

Case Study 44: Tempdb Contention

A travel booking website was noticing performance degradation during peak hours. Their system was built on a SQL Server database.

Upon review, the DBA team found signs of contention in tempdb, a system database used for temporary storage. Tempdb contention can slow down the system as queries wait for resources.

The team implemented several measures to reduce tempdb contention, including configuring multiple equally sized tempdb data files, adding more tempdb files, and using trace flag 1118 to change how SQL Server allocates extents. These steps helped alleviate the tempdb contention and improved the system’s peak performance.

These case studies showcase that SQL Server performance tuning is dynamic, requiring ongoing adjustments and a deep understanding of SQL Server’s various features and settings. By monitoring the system closely and being ready to investigate and address issues promptly, you can ensure your SQL Server databases run efficiently and reliably.

Case Study 45: Locking and Blocking

A healthcare company’s patient record system, powered by a SQL Server database, was experiencing slow performance during high user activity periods.

Upon investigation, the DBA team found high locking and blocking. This was due to a few long-running transactions that were locking critical tables for a significant amount of time, preventing other transactions from accessing these tables.

The DBA team optimized the problematic transactions to make them more efficient and faster. They also implemented row versioning by enabling Read Committed Snapshot Isolation (RCSI) on the database to allow readers not to block writers and vice versa. This alleviated the locking and blocking issue and led to a significant improvement in performance.

Case Study 46: Over-normalization

An e-commerce website was experiencing slow load times, particularly in product categories and search pages. The company’s product catalog was stored in a SQL Server database.

Upon review, the DBA team found that the database schema was overly normalized. While normalization is generally a good practice as it reduces data redundancy, in this case, it led to an excessive number of query joins, causing slower performance.

The DBA team worked with the developers to denormalize the database schema slightly. They created computed columns for frequently calculated fields and materialized views for commonly executed queries with multiple joins. These changes reduced the number of joins required in the queries and improved the website’s performance.

Case Study 47: Suboptimal Statistics

A software company’s project management application was running slow. The application was built on a SQL Server database.

Upon checking the database, the DBA team found that the statistics were not up-to-date on several large tables. Statistics in SQL Server provide critical information about the data distribution in a table, which the query optimizer uses to create efficient query execution plans.

The team set up a maintenance job to regularly update statistics on the database tables. They also adjusted the “auto update statistics” option to ensure that statistics are updated more frequently. These steps helped the query optimizer generate more efficient execution plans, improving query performance.

Case Study 48: Improper Use of Functions in Queries

A media company’s content management system was experiencing slow response times. The system was built on a SQL Server database.

The DBA team identified several frequently executed queries using scalar functions on columns in the WHERE clause. This practice prevents SQL Server from effectively using indexes on those columns, leading to table scans and slower performance.

The team rewrote the queries to avoid using functions on indexed columns in the WHERE clause, allowing SQL Server to use the indexes efficiently. This significantly improved the performance of these queries and the overall response time of the system.

As these case studies illustrate, various issues can affect SQL Server performance. Addressing them requires a good understanding of SQL Server, a methodical approach to identifying problems, and collaboration with other teams, such as developers, to implement optimal solutions.

Case Study 49: Excessive Use of Temp Tables

A finance firm’s risk assessment software, built on SQL Server, was experiencing slower performance. The software was executing numerous calculations and transformations, using temp tables extensively.

Upon reviewing the operations, the DBA team found that the excessive use of temp tables led to high I/O operations and caused contention in tempdb. They also found that some temp tables were unnecessary as the same operations could be achieved using more straightforward queries or table variables, which have a lower overhead than temp tables.

The DBA team and developers collaborated to refactor the procedures to reduce the use of temp tables. They replaced temp tables with table variables where possible and sometimes rewrote queries to avoid needing temporary storage. This reduced the load on tempdb and improved the software’s performance.

Case Study 50: High Network Latency

An international company was experiencing slow performance with its distributed applications. These applications interacted with a centralized SQL Server database in their headquarters.

Upon investigation, the DBA team found that network latency was a significant factor causing the slow performance. The network latency was exceptionally high for the company’s overseas offices.

To address this, they implemented SQL Server’s data compression feature to reduce the amount of data sent over the network. They also combined application-level caching with local read-only replicas for the overseas offices. This reduced the impact of network latency and improved application performance.

Case Study 51: Large Data Loads During Business Hours

A manufacturing company’s ERP system was experiencing slow performance during specific periods of the day. A SQL Server database backed the system.

The DBA team found that large data loads were being run during business hours, impacting the system’s performance. These data loads were locking tables and consuming significant server resources.

The team rescheduled the data loads to off-peak hours, ensuring minimal impact on business users. They also optimized the data load processes using techniques such as bulk insert and minimally logged operations to make them run faster and consume fewer resources.

Case Study 52: Inefficient Code

A software company’s internal tool was running slow. The tool was built on a SQL Server database and used stored procedures extensively.

The DBA team found that some of the stored procedures were written inefficiently. There were instances of cursor use where set-based operations would be more appropriate, and some procedures called other procedures in a loop, causing many executions.

The team worked with developers to optimize the stored procedures. They replaced cursors with set-based operations and unrolled loops where possible, reducing the number of procedure executions. They also added appropriate indexes to support the queries in the stored procedures. These changes improved the code’s efficiency and the tool’s overall performance.

These case studies underscore that SQL Server performance issues can arise from different areas – from inefficient code to infrastructure factors like network latency. Keeping a keen eye on performance metrics, proactively managing server resources, and maintaining efficient database code are all part of the toolkit for managing SQL Server performance.



Microsoft SQL Server Case Study

Tom Jenkins

Microsoft SQL Server

Microsoft SQL Server® is a market-leading, enterprise-level database solution used by a large number and variety of applications to host their databases and store their data. Microsoft SQL Server is an incredibly powerful, scalable and robust solution; however, it is its very robustness that often lulls customers into a false sense of security.

As with anything in life, things can go wrong, and this is true with SQL Server. Your valuable data can be lost for a number of reasons, such as hardware failure, theft, fire, flood or user error, so it is worth planning for such events to make recovery as painless as possible.

With SQL Server, there are many ways to improve recovery from data loss, such as mirroring, transaction log shipping and Always On high availability, all of which offer differing levels of protection at a variety of price points. Here we will look at the simplest and most cost-effective solution for an SME to protect their data: a decent backup.

A Real-Life Example

Before we look at how we should implement SQL Server backups, let us look at a real-life example of how a good backup strategy works.

In this particular example, the customer was running Microsoft SQL Server Standard to host their Microsoft Dynamics® NAV database. SQL Server was running on its own dedicated server, with the Dynamics NAV database configured to use the full recovery model, full backups running nightly, and log backups running hourly during the working day to a network share on a different server.

On this particular day, the customer’s IT manager decided to test the UPS protecting the SQL Server by unplugging it from the wall, something he had diligently been doing on a regular basis. This time, however, the UPS failed and the server immediately lost power. The server was powered up and at first all seemed to be OK, until after about an hour it became apparent that the G/L Entry table (a somewhat important table in a NAV database) was corrupt. The customer in question was a distribution company with a number of shipments that needed to go out of the door before the end of the day, so the prospect of recovering the database at that point in time was not very appealing. After a short discussion with Dynamics Consultants, we made a small tweak to the setup to allow them to continue processing warehouse transactions without needing to write to the G/L Entry table, allowing them to continue to ship orders for the rest of the day.

This still left the customer with a corrupt database, now with a number of shipments processed, as well as other database activity, since the corruption had happened. However, because their database was configured with the full recovery model, we were able to restore to a fresh database the last full backup prior to the failure and all transaction log backups since, including a final log backup taken before disabling the damaged database. In doing so, we left the customer with an uncorrupted database, no data loss, and an extremely relieved IT manager.

Backup Strategy

So, what can we learn from this? Firstly, don’t test your UPS during the working day; but more importantly, make sure you have an appropriate backup strategy. (Note: many VM-level backup strategies would not have worked in the above situation, as they would also have backed up the corruption.)

So what is the correct backup strategy? There is no single right answer, as it depends very much on the level of database updates, what the database is used for, the size of the database, and the business’s assessment of the risks associated with a failure in terms of acceptable downtime and data loss. As a starting point, though, an SME should consider the following for a production database (a sketch of the corresponding backup commands follows the list).

  • Full recovery model
  • Backups to a separate network server
  • Backups to an offsite location
  • Daily full and hourly log backups
  • Backup encryption
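As a rough sketch, the daily full and hourly log backups might look like this (names, paths, and the pre-created certificate are illustrative):

    -- Daily full backup, compressed and encrypted with a previously created certificate
    BACKUP DATABASE NAVProd
    TO DISK = N'\\backupserver\sql\NAVProd_full.bak'
    WITH COMPRESSION,
         ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = BackupCert);

    -- Hourly log backup during the working day
    BACKUP LOG NAVProd
    TO DISK = N'\\backupserver\sql\NAVProd_log.trn'
    WITH COMPRESSION;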

If you are unsure about your database backups, then you should seek advice from an experienced SQL Server administrator, or alternatively attend our SQL Server Basics course to gain a good working overview.

Find out More

If you would like to find out more about SQL Server, why not join one of our extremely popular SQL training courses? Based at our comfortable offices on the outskirts of Southampton, Hampshire, our expert consultants have 4.5-star rated reviews.

SQL Server Training >

Tom Jenkins

Tom is one of the founding directors of Dynamics Consultants. He has worked on ERP / CRM systems since 1995, initially as an end user and later as a developer/consultant. Before founding Dynamics Consultants, Tom worked in a management role for a machinery importer / reseller, where his work included inventory management and purchase control, systems development, and IT project management.


SQL Server Case Study

Established in 1972, the port’s primary business is offloading foreign crude oil from tankers, and storing and distributing the inventory to refineries throughout the Gulf Coast and Midwest. As the single largest point of entry for crude oil coming into the U.S., the client must serve its customers 24 hours a day, seven days a week.

With significant SQL Server infrastructure, including a custom Oil Management System, internal SharePoint, and various third-party applications, the client needed a strategic partner who could serve as an extension of its IT team to:

  • Set up and review its disaster recovery solution, SQL Server maintenance plans, administrative auditing, and basic SQL Server performance
  • Develop centrally managed and automated backups and SQL Servers to address critical issues quickly and effectively
  • Build consistent SQL database index maintenance plans and integrity checks for all staff

Using a checklist-based approach, Sparkhound met with senior staff to review all production SQL Servers and bring all systems up to the current SQL Server service pack level for long-term success. Sparkhound also:

  • Built custom-developed SQL scripts that work alongside the enterprise backup software for backup redundancy, plus automated maintenance plans for the client’s Oil Management System
  • Implemented auditing and custom-developed SQL Server traces to fulfill government standards
  • Trained system administrators and developers on SQL Server best practices
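The integrity checks and index maintenance mentioned above typically come down to commands along these lines. This is only a minimal sketch; the database, table, and index names are hypothetical.

-- Regular integrity check of the whole database.
DBCC CHECKDB (N'OilManagementSystem') WITH NO_INFOMSGS;

-- Index maintenance: rebuild heavily fragmented indexes...
ALTER INDEX IX_Shipments_ShipDate ON dbo.Shipments REBUILD;

-- ...or reorganize lightly fragmented ones.
ALTER INDEX IX_Shipments_ShipDate ON dbo.Shipments REORGANIZE;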

By utilizing Sparkhound’s SQL Server consultant, the client received a Microsoft-certified expert who understood the complexity of the client’s mission-critical data, ensured all disaster recovery and security auditing requirements were met, and delivered a seamless knowledge transfer post-implementation.



Contains solutions for #8WeekSQLChallenge case studies https://8weeksqlchallenge.com/

sharkawy98/sql-case-studies

8 Week SQL Challenge

This repository contains solutions for the #8WeekSQLChallenge: interesting real-world case studies that let you apply and enhance your SQL skills across many use cases. I used Microsoft SQL Server to write the SQL queries that solve these case studies.

Table of Contents

  • Case study 1
  • Case study 2
  • Case study 3
  • Some interesting queries from my solutions

SQL skills gained (a short example follows the list)

  • Data cleaning & transformation
  • Aggregations
  • Ranking (ROW_NUMBER, DENSE_RANK)
  • Analytics (LEAD, LAG)
  • CASE WHEN statements
  • UNION & INTERSECT
  • DATETIME functions
  • Data type conversion
  • TEXT functions, text and string manipulation
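To give a flavor of the ranking and analytic functions above, here is a minimal sketch against a hypothetical sales table (table and column names are illustrative, not taken from the challenge data):

-- For each customer, rank orders by spend and fetch the previous order date.
SELECT
    customer_id,
    order_date,
    order_total,
    DENSE_RANK() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS spend_rank,
    LAG(order_date) OVER (PARTITION BY customer_id ORDER BY order_date) AS previous_order_date
FROM dbo.sales;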

Case Study #1: Danny's Diner

Case Study #2: Pizza Runner

Case Study #3: Foodie-Fi

Case Study #4: Data Bank

Some interesting queries


Case Study: Wichita Public Schools Modernizing School Analytics on Azure

While keeping student and educator data secure

About Wichita Public Schools

Wichita Public Schools provides K-12 public education in Wichita, Kansas.

Tags: Case Study , Power BI , Microsoft SQL Server , Azure Synapse Analytics , Azure Data Factory , Azure Data Lake , PeopleSoft ERP , Will Hutchinson

Case Study: Williams-Sonoma - KPI Retail Analytics on Azure and Tableau

Leveraging KPI Retail Analytics on Azure and Tableau to provide quicker and more meaningful retail reporting

About Williams-Sonoma, Inc.

Williams-Sonoma, Inc. is a specialty retailer of high-quality products for the home, with $4.4 billion in revenue. These products, representing eight distinct merchandise strategies (cookware, tools, electrics, cutlery, tabletop and bar, outdoor, and furniture), are marketed through seven e-commerce websites, eight direct mail catalogs, and 600+ stores in 44 US states and 10 countries.

Tags: Tableau, Microsoft Azure, Williams-Sonoma, KPI Retail Analytics, Microsoft SQL Server


AdventureWorks sample databases

This article provides direct links to download AdventureWorks sample databases, and instructions for restoring them to SQL Server, Azure SQL Database, and Azure SQL Managed Instance.

For more information about samples, see the Samples GitHub repository.

Prerequisites

  • SQL Server or Azure SQL Database
  • SQL Server Management Studio (SSMS) or Azure Data Studio

Download backup files

Use these links to download the appropriate sample database for your scenario.

  • OLTP data is for most typical online transaction processing workloads.
  • Data Warehouse (DW) data is for data warehousing workloads.
  • Lightweight (LT) data is a lightweight and pared down version of the OLTP sample.

If you're not sure what you need, start with the OLTP version that matches your SQL Server version.

Additional files can be found directly on GitHub:

  • SQL Server 2014 - 2022
  • SQL Server 2012
  • SQL Server 2008 and 2008R2

Restore to SQL Server

You can use the .bak file to restore your sample database to your SQL Server instance. You can do so with a RESTORE statement, or through the graphical interface (GUI) in SQL Server Management Studio (SSMS) or Azure Data Studio.

  • SQL Server Management Studio (SSMS)
  • Transact-SQL (T-SQL)
  • Azure Data Studio

If you're not familiar with using SQL Server Management Studio (SSMS), see connect & query to get started.

To restore your database in SSMS, follow these steps:

Download the appropriate .bak file from one of links provided in the download backup files section.

Move the .bak file to your SQL Server backup location. This location varies depending on your installation location, instance name, and version of SQL Server. For example, the default location for a default instance of SQL Server 2022 (16.x) is C:\Program Files\Microsoft SQL Server\MSSQL16.MSSQLSERVER\MSSQL\Backup.

Open SSMS and connect to your SQL Server instance.

Right-click Databases in Object Explorer > Restore Database... to launch the Restore Database wizard.


Select Device and then select the ellipses (...) to choose a device.

Select Add and then choose the .bak file you recently moved to the backup location. If you moved your file to this location but can't see it in the wizard, it's likely that SQL Server, or the user signed in to SQL Server, doesn't have permission to read the file in this folder.

Select OK to confirm your database backup selection and close the Select backup devices window.

Check the Files tab to confirm the Restore as location and file names match your intended location and file names in the Restore Database wizard.

Select OK to restore your database.

For more information on restoring a SQL Server database, see Restore a database backup using SSMS.

You can restore your sample database using Transact-SQL (T-SQL). An example restoring AdventureWorks2022 is provided below, but the database name and installation file path can vary depending on your environment.

To restore AdventureWorks2022 on Windows, modify values as appropriate to your environment, and then run the following Transact-SQL (T-SQL) command:
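A representative command is shown below; the path assumes the default backup folder of a default SQL Server 2022 instance, so adjust it to your environment.

USE [master];
RESTORE DATABASE [AdventureWorks2022]
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL16.MSSQLSERVER\MSSQL\Backup\AdventureWorks2022.bak'
WITH FILE = 1, NOUNLOAD, STATS = 5;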

To restore AdventureWorks2022 on Linux, change the Windows filesystem path to a Linux path, and then run the following Transact-SQL (T-SQL) command:
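A representative command is shown below; the logical file names are assumptions (verify them with RESTORE FILELISTONLY), and the paths use the default mssql directories.

RESTORE DATABASE [AdventureWorks2022]
FROM DISK = N'/var/opt/mssql/backup/AdventureWorks2022.bak'
WITH MOVE 'AdventureWorks2022' TO '/var/opt/mssql/data/AdventureWorks2022.mdf',
     MOVE 'AdventureWorks2022_log' TO '/var/opt/mssql/data/AdventureWorks2022_log.ldf',
     FILE = 1, NOUNLOAD, STATS = 5;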

If you're not familiar with using Azure Data Studio, see connect & query to get started.

To restore your database in Azure Data Studio, follow these steps:

Open Azure Data Studio and connect to your SQL Server instance.

Right-click on your server and select Manage.

Select Restore.

On the General tab, fill in the values listed under Source:

  • Under Restore from, select Backup file.
  • Under Backup file path, select the location where you stored the .bak file.

This step autopopulates the rest of the fields, such as Database, Target database, and Restore to.

Select Restore to restore your database.

Deploy to Azure SQL Database

You have two options to view sample Azure SQL Database data. You can use a sample when you create a new database, or you can deploy a database from SQL Server directly to Azure using SSMS.

To get sample data for Azure SQL Managed Instance instead, see restore Wide World Importers to SQL Managed Instance.

Deploy new sample database

When you create a new database in Azure SQL Database, you can create a blank database, restore from a backup, or select sample data to populate your new database.

Follow these steps to add sample data to your new database:

Connect to your Azure portal.

Select Create a resource in the top left of the navigation pane.

Select Databases and then select SQL Database.

Fill in the requested information to create your database.

On the Additional settings tab, choose Sample as the existing data under Data source.

Select Create to create your new SQL Database, which is the restored copy of the AdventureWorksLT database.

Deploy database from SQL Server

SSMS allows you to deploy a database directly to Azure SQL Database. This method doesn't currently provide data validation, so it's intended for development and testing and shouldn't be used for production.

To deploy a sample database from SQL Server to Azure SQL Database, follow these steps:

Connect to your SQL Server in SSMS.

If you haven't already done so, restore the sample database to SQL Server.

Right-click your restored database in Object Explorer > Tasks > Deploy Database to Microsoft Azure SQL Database...

Follow the wizard to connect to Azure SQL Database and deploy your database.

Creation scripts

Instead of restoring a database, you can alternatively use scripts to create the AdventureWorks databases regardless of version.

The following scripts can be used to create the entire AdventureWorks database:

  • AdventureWorks OLTP Scripts Zip
  • AdventureWorks DW Scripts Zip

Additional information about using the scripts can be found on GitHub.

Related content

  • Database Engine Tutorials
  • Quickstart: Connect and query a SQL Server instance using SQL Server Management Studio (SSMS)
  • Quickstart: Use Azure Data Studio to connect and query SQL Server



Rename Column in SQL Server

SQL Server is a widely used relational database management system (RDBMS) that allows users to create and manage databases effectively. Renaming a column is a common task, usually required when users want to change a database schema. In this article, we will explore different methods to rename columns in SQL Server.

There are two methods through which we can rename columns of a database in SQL Server, explained below:

  • Using the sp_rename system stored procedure
  • Using ALTER statement

Let’s set up an environment

To understand how we can rename columns in SQL Server, we will consider the table Customers as shown below:
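As a working example, assume a Customers table along these lines (a minimal sketch, with columns inferred from the steps later in the article):

CREATE TABLE Customers (
    CustomerID   INT PRIMARY KEY,
    CustomerName VARCHAR(100),
    State        VARCHAR(50)
);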

1. Using the sp_rename System Stored Procedure

To rename a column of a database in SQL Server, we can use the sp_rename system stored procedure. The sp_rename procedure is a built-in system stored procedure that allows the users to rename various database objects like tables, columns, views, and indexes. Following is the syntax to use the sp_rename system stored procedure:
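-- Generic form (the placeholders are described below):
EXEC sp_rename 'table_name.old_column_name', 'new_column_name', 'COLUMN';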

  • table_name: is the name of the table containing the column you want to rename.
  • old_column_name: is the current name of the column.
  • new_column_name: is the desired new name for the column.
  • COLUMN: an additional parameter specifying that the object being renamed is a column.

To change the column name from State to Residential State in the Customers table above, we have to run the following query:
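-- Rename Customers.State to "Residential State":
EXEC sp_rename 'Customers.State', 'Residential State', 'COLUMN';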

Explanation: The EXEC sp_rename command renames the State column in the Customers table to Residential State. The resulting table schema reflects this change, displaying the new column name Residential State instead of State, with all data remaining intact.

2. Using the ALTER Statement

To rename a column in SQL Server, we can also try the ALTER TABLE command. ALTER is a Data Definition Language (DDL) command used to update the schema of tables, views, and indexes. Following is the standard syntax to rename a column using the ALTER command:
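-- Standard SQL form (not accepted by SQL Server, as explained below):
ALTER TABLE table_name RENAME COLUMN old_column_name TO new_column_name;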

  • old_column_name: is the current name of the column that you want to rename.

To change the column name from CustomerName to Users in the Customers table above, we would run the following query:
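-- Attempted rename; this fails on SQL Server, which doesn't support RENAME COLUMN:
ALTER TABLE Customers RENAME COLUMN CustomerName TO Users;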

Explanation: The ALTER TABLE Customers RENAME COLUMN CustomerName TO Users command attempts to rename the CustomerName column to Users. However, SQL Server does not support this syntax; use sp_rename instead for renaming columns in SQL Server.

In conclusion, SQL Server provides efficient tools for renaming columns, primarily using the sp_rename system stored procedure. While ALTER TABLE RENAME COLUMN is a standard SQL syntax in some database systems, SQL Server relies on sp_rename for this operation. Users can effectively manage and update their database schemas using these methods.


Microsoft Dynamics 365 Blog


Microsoft and LinkedIn release the 2024 Work Trend Index on the state of AI at work  

For our fourth annual Work Trend Index, out today, we partnered with LinkedIn for the first time on a joint report so we could provide a comprehensive view of how AI is not only reshaping work, but the labor market more broadly.

2024 release wave 1 kicks off with hundreds of AI-powered capabilities for Microsoft Dynamics 365 and Microsoft Power Platform  

Introducing Microsoft Copilot for Finance: transform finance with next-generation AI in Microsoft 365

Microsoft Copilot for Sales and Copilot for Service are now generally available


Work smarter with Microsoft Copilot in Dynamics 365 Business Central  

In the quickly changing world of AI, Microsoft Dynamics 365 Business Central is leading the way with innovations that have equipped more than 30,000 small and medium-sized businesses to succeed. Powered by next-generation AI, Microsoft Copilot in Dynamics 365 Business Central introduces new ways to streamline workflows, boost productivity, and unlock creativity.


Enabling fast, flexible, cost-effective service with Microsoft Copilot in Dynamics 365 Field Service  

Fast, efficient service: it’s what everybody wants. And today’s field service organizations are answering the call by adopting next-generation AI technologies that can help them be more flexible and responsive to customers while also driving revenue, reducing overtime, and ensuring more predictable arrival and completion times.


Early adopters of Microsoft Copilot in Dynamics 365 Guides recognize the potential for productivity gains  

The integration of Microsoft Copilot into Dynamics 365 Guides brings generative AI to this mixed reality solution. Copilot for Dynamics 365 Guides transforms frontline operations, putting AI in the flow of work and giving skilled and knowledge workers access to relevant information where and when they need it.


2024 release wave 1: Transforming experiences with Microsoft Copilot and Dynamics 365  

In this extraordinary age of AI, we find ourselves on the brink of a profound revolution. Companies are looking for generative AI to solve longstanding problems around customer connection, loyalty, and seller productivity.


Microsoft unveils AI features for manufacturers at Hannover Messe 2024  

At Hannover Messe, the world’s leading industrial trade fair, organizations across engineering, digital technologies, energy, and more will gather to demonstrate solutions for high-performance, sustainable industries. Microsoft is honored to attend this year’s event to showcase how Microsoft Dynamics 365 helps manufacturers.


Introducing new Microsoft Copilot capabilities to optimize Dynamics 365 Field Service operations    

Delivering exceptional service is key for building customer preference and loyalty. Today, we’re introducing new capabilities for Microsoft Copilot in Dynamics 365 Field Service that help service managers and technicians efficiently find the information they need to resolve issues right the first time while keeping customers updated at every step of the process.


AI-powered innovations enhance customer service with 2024 release wave 1  

We’re excited to announce the general availability of new and enhanced experiences in Microsoft Dynamics 365 Customer Service as part of our 2024 release wave 1 cadence. This release focuses on extending Microsoft Copilot capabilities by infusing generative AI into customer, agent, and supervisor experiences.


New Microsoft Dynamics 365 and Microsoft Copilot innovations for supply chain, sales, and service join the 2024 release wave 1  

Sellers, service agents, and supply chain professionals share a common goal: delivering quality goods and services to customers on time, every time. Today, we’re announcing new experiences for Microsoft Dynamics 365 that help professionals across business functions to collaboratively solve challenges, streamline workflows, and focus on what matters most—key factors for transformative customer experiences.


Revolutionizing marketing workflows with Copilot in Dynamics 365 Customer Insights  

In the ever-evolving landscape of generative AI, a copilot isn’t just a companion that makes the tasks you’re already doing at work easier; it’s quickly becoming a transformative force reshaping the very core of how things are done.


Forrester TEI study shows 315% ROI when modernizing customer service with Microsoft Dynamics 365 Customer Service  

We are pleased to share the results of a March 2024 Forrester Consulting Total Economic Impact (TEI) Study commissioned by Microsoft. Forrester calculates Dynamics 365 Customer Service delivered benefits of $14.70 million over three years to a composite organization.


Explore the next wave of AI innovation at the Microsoft Business Applications Launch Event  

Join Microsoft product leaders and engineers on April 10, 2024 for an in-depth look at the latest AI features and capabilities in Dynamics 365 and Microsoft Power Platform.


Microsoft is a Leader in The Forrester Wave™: Customer Service Solutions, Q1 2024   

Most organizations find it’s no longer good enough to just measure successful service engagements solely on whether a customer issue is resolved. Instead, they aim to deliver personalized, fast service experiences at every touchpoint through all engagement channels.


Customer Case Study: Fujitsu Composite AI and Semantic Kernel

Matthew Bolaños

May 21st, 2024

Japanese multinational Fujitsu, a pioneer of information and communications technology, has been transforming industries with innovative solutions since 1935. With a workforce of 124,000 dedicated professionals across 50 countries, Fujitsu is committed to building trust and fostering sustainability through its groundbreaking technologies.

A diverse portfolio that includes everything from IT services to server equipment has a new member: AI. Fujitsu’s AI solutions (branded as Fujitsu Kozuchi) are broken into seven areas, including:

  • Generative AI
  • Predictive Analysis

With the help of Semantic Kernel, we’ve been able to stack these technologies together to better solve customer needs from a single platform.

A new frontier: Fujitsu Composite AI

Fujitsu Composite AI is a unique combination of AI technologies that can understand abstract business problems through chat-style dialogue. It automatically analyzes a situation, searching for and proposing specific solutions based on past data.


From ambiguous to automatic with Semantic Kernel

Semantic Kernel is an SDK that, as the documentation says, lets you “actually do something productive.” By using it to connect Composite AI component technologies, we can address real customer needs.

Our pipeline breaks down ambiguous instructions and automatically combines multiple AIs to create an advanced model capable of delivering a solution. In fact, if the required model doesn’t exist, one that is bespoke and solution-fit will be generated.

Real problems, real solutions: Composite AI case studies

Fujitsu Composite AI is already being applied to real customer data, creating efficiencies and solutions for issues that were previously cumbersome or resource-intensive.

Nakayama Transportation

Composite AI powers Nakayama Transportation’s automated vehicle dispatch system. The system analyzes the driving and restricted time of the drivers, generating an efficient plan while complying with laws and regulations.

Nakayama Unyu (Nakayama Transportation) has given the AI solution high praise for its ability to manage both vehicle dispatching and working hours in a single tool.

Results: The time it takes to create a dispatch plan has dropped from several hours to 10 minutes.

Fujitsu Customer Support

Internally, we’ve used Composite AI to accurately predict customer support requirements and optimize resource allocation. The platform analyzes incident management logs, predicts the future of any incident (how many days before it is resolved), and suggests staffing allocation.

Results: The new incident management system is 25% more efficient than the previous system.

Thoughts from the Semantic Kernel team

Semantic Kernel’s PM Matthew Bolaños had this to say about the Composite AI integration:

“I was very impressed with the solution that Fujitsu has implemented. It’s great to see how they have been able to use Semantic Kernel to improve the customer experience. It’s positive to see they’ve leveraged Semantic Kernel to integrate their AI technology as Composite AI and apply it to several real business fields.”

By enhancing Fujitsu AI technologies with Semantic Kernel, we can design a flexible solution pipeline and solve customers’ real problems. Composite AI automatically combines the most appropriate AI tech for any given task. In the future, we plan not only to deepen our collaboration with Semantic Kernel but also to explore integrating with Microsoft Fabric as a data source for Composite AI. We believe this integration has the potential to greatly enhance our capabilities and provide even more value to our customers.

We’ve only scratched the surface of its orchestration capabilities. Their importance will only grow as we continue to explore, innovate, and use these features more extensively.

Join us at Microsoft Build!

Learn more about Fujitsu, the Fujitsu AI (Kozuchi) platform, and Composite AI. For a more in-depth look, check out the whitepaper for Composite AI.

Learn more about Semantic Kernel.



Azure AI Studio


Learn why Microsoft was named a Leader in the 2023 Gartner® Magic Quadrant™ for Strategic Cloud Platform Services (SCPS).


Frequently asked questions:

  • Who should use Azure AI Studio?
  • Can I use models other than ChatGPT in Azure OpenAI Service?
  • Is prompt flow the Microsoft equivalent of LangChain?
  • How is prompt injection handled, and how do I ensure no malicious code is running from prompt injection?
  • Is there fine-tuning in Azure AI Studio?


[ 1 ] Gartner, Magic Quadrant for Cloud AI Developer Services, Jim Scheibmeir, and 4 more, 22 April 2024.

