{"url": "https://blog.devart.com/10-new-odbc-drivers-for-marketing-planning-collaboration-services.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ODBC](https://blog.devart.com/category/odbc) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) [What’s New](https://blog.devart.com/category/whats-new) 10 New ODBC Drivers for Marketing, Planning and Collaboration Services, and More Released By [DAC Team](https://blog.devart.com/author/dac) November 10, 2022 [0](https://blog.devart.com/10-new-odbc-drivers-for-marketing-planning-collaboration-services.html#respond) 2950 We are excited to announce the release of new ODBC Drivers. Devart ODBC drivers allow easy access to these sources from various ETL, BI, reporting, and database management tools and programming languages on x32-bit and x64-bit Windows. The drivers fully support standard ODBC API functions and data types and enable fast access to live data from anywhere. Cloud CRM [Freshworks CRM](https://www.devart.com/odbc/freshsales/) Marketing [AfterShip](https://www.devart.com/odbc/aftership/) [DEAR Inventory](https://www.devart.com/odbc/dearinventory/) [Delighted](https://www.devart.com/odbc/delighted/) [Mailjet](https://www.devart.com/odbc/mailjet/) [Sendinblue](https://www.devart.com/odbc/sendinblue/) Planning and Collaboration Tools [Confluence Cloud](https://www.devart.com/odbc/confluence/) [Jira](https://www.devart.com/odbc/jira/) [Jira Service Management](https://www.devart.com/odbc/jiraservice/) Workflow Automation [Podio](https://www.devart.com/odbc/podio/) A complete list of all ODBC drivers you can find on our [website](https://www.devart.com/odbc/) . 
"} {"url": "https://blog.devart.com/11-reasons-why-you-need-dbforge-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", 
"content": "[Products](https://blog.devart.com/category/products) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) 11 Reasons Why You Need dbForge SQL Complete By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) September 30, 2022 [0](https://blog.devart.com/11-reasons-why-you-need-dbforge-sql-complete.html#respond) 3046 Okay, so you deal with SQL code in SSMS every day? Then you most probably would not mind doubling your output without investing too much raw effort. Instead, you would like to work smart, work with convenience, and still not feel tired by the end of the day. If that’s correct, we’ve got a solution that will definitely help you get it right, and it’s called [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) . With all that in mind, we have singled out 11 rather compelling reasons why you wouldn’t want to miss SQL Complete in your daily work with SQL Server databases. Context-aware code completion Customizable formatting Predefined and custom SQL snippets Intelligent refactoring Debugging of SQL code directly from SSMS Document recovery Multiple operations with data in the results grid Tab coloring Simplified navigation across large SQL statements Extra features Free Express Edition Now let’s talk about each of them in detail. 1. You get the best context-aware IntelliSense-like code completion that money can buy The superior-to-IntelliSense code completion that you get with SQL Complete is a sure way to either double your produced output or get everything done twice as fast. There’s everything you might need for quick and effortless SQL coding—context-aware suggestions, instant expansion of statements, quick info on database objects, instant syntax check, and simplified code navigation. 2. 
Your code is kept as consistent as ever with highly customizable formatting The built-in SQL Formatter makes it exceptionally fast and easy for you to read, understand, review, and share code with your teammates. Moreover, unified code formatting standards help everyone stay on the same page and thus become more productive. Your tasks, be it code reviews or troubleshooting, get done much faster. The features delivered by the Formatter include predefined and custom formatting profiles, wizard-aided bulk formatting, noformat tags, database identifier case synchronization, word recognition in CamelCase identifiers, and automated formatting. 3. It's never been easier to reuse your SQL code with predefined and custom snippets Who wouldn't want to eliminate repetitive coding in daily work? You can do it using the rich collection of snippets that you get with SQL Complete. The built-in Snippets Manager will help you create and manage custom snippets, apply and modify the predefined ones, as well as group, sort, relocate, and share them via GitHub. Say no to wasted time. 4. You can level up code quality with the built-in intelligent refactoring tools First off, you can safely rename database objects (including temporary ones) without affecting the existing dependencies: SQL Complete automatically finds and corrects all references to the renamed objects. Next, you can quickly rename aliases and variables in your queries. In both cases, SQL Complete finds, highlights, and renames all occurrences of any specified alias or variable automatically to make your code cleaner and more readable. Finally, SQL Complete enables you to find invalid objects (such as those that reference non-existent objects) across multiple databases. 5. 
You can effectively debug your T-SQL scripts, stored procedures, triggers, and functions directly from SSMS As if all that wasn't enough, SQL Complete further extends the capabilities of SSMS with a T-SQL Debugger that helps you easily troubleshoot complex queries, stored procedures, triggers, and functions directly in the IDE. You can observe the runtime behavior of database objects, detect and identify logic errors, break or suspend query execution in order to examine objects, and use watches to evaluate and edit variables in your scripts. 6. Your output is always safe with document recovery Never lose a line of your code with the session restoration features delivered by SQL Complete. These features help you minimize (or, rather, entirely avoid) loss of your SQL code and data, recover document sessions in a couple of clicks, and act quickly in case of emergency. So whenever SSMS crashes, the power suddenly goes out, or you accidentally close an SQL document without saving, it will take you only moments to restore everything and resume your work. 7. You can perform multiple operations with your data in the results grid Yet another worthwhile feature that you get with SQL Complete is a set of flexible and versatile operations with data that can be easily performed right in the SSMS results grid. Let's cover them briefly. 
Data visualization does what it says, just check the screenshot below; note that visualization applies to the Hexadecimal, Text, XML, HTML, Rich Text, PDF, JSON, Image, and Spatial formats Data search helps you find matching data, entire words, and regular expressions Data copying lets you select data from a cell, a range of cells, or an entire table and copy it to the clipboard or to a CSV, XML, HTML, or JSON file Data aggregation is useful when it comes to calculating sums and average values in the retrieved data sets; this is far quicker and simpler than using spreadsheets Finally, you can easily generate scripts from the results grid based on your table data with the following statements: INSERT, INSERT #tmpTable, WHERE IN(), WHERE AND OR, and UPDATE 8. Your work with multiple connections becomes easier with tab coloring Tab coloring is a handy feature that lets you color-code your servers and databases. This way, you can easily keep an eye on which connection your current tab is using, be it Development, Sandbox, Testing, Production, or a custom one; you are free to manage them any way you like. For your convenience, SQL Complete labels tabs as well as status bars of SQL documents. Your connections and databases are also labeled with vertical lines in Object Explorer. 9. You will definitely enjoy the benefits of simplified navigation across large SQL statements The simplified navigation we're talking about is provided by a feature called Document Outline . You can view and navigate across your current document structure in the Document Outline window and sync the structure with the text of your code in a matter of moments by right-clicking the code and selecting Synchronize Document Outline. Simple, easy, and convenient. 10. You can empower yourself further with a set of productivity-enhancing extra features Besides everything that's been mentioned above, there are still many pleasant surprises awaiting you in SQL Complete. 
These include SQL query history (where you can find, view, and edit previously executed queries), transaction reminders (which notify you about uncommitted transactions), and execution warnings (which prevent accidental dropping of database objects or deletion of your data). All of these contribute to your overall performance and help you keep things in check. 11. If basic code completion and formatting features are really all you need, you can opt for the free Express Edition The basic features in question include completion for SELECT, INSERT, UPDATE, EXEC, and DELETE statements, smart filtering in the suggestion list, parameter information for procedures and functions, as well as fundamental SQL formatting functionality. All this makes SQL Complete Express a perfect solution for newcomers to SQL development and for people who work in non-profit organizations, such as churches, municipalities, or educational institutions. These are the 11 main reasons why we believe that SQL Complete can easily become your indispensable assistant, and you don't have to take our word for it. Simply [download SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) for a FREE 2-week trial and give it a go! 
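As a rough illustration of the script generation mentioned in reason 7, the INSERT variant boils down to turning each grid row into a statement. The hypothetical Python sketch below is ours, not SQL Complete's actual output format; it only shows the idea of quoting values and doubling embedded single quotes as T-SQL requires.

```python
def generate_insert(table, rows):
    # Turn each row (a dict of column -> value) into an INSERT statement,
    # quoting strings and doubling embedded single quotes for T-SQL.
    statements = []
    for row in rows:
        columns = ", ".join(row)
        values = ", ".join(
            "NULL" if value is None
            else str(value) if isinstance(value, (int, float))
            else "'" + str(value).replace("'", "''") + "'"
            for value in row.values()
        )
        statements.append(f"INSERT INTO {table} ({columns}) VALUES ({values});")
    return "\n".join(statements)

print(generate_insert("actor", [{"actor_id": 1, "last_name": "O'Hara"}]))
# INSERT INTO actor (actor_id, last_name) VALUES (1, 'O''Hara');
```

The real feature, of course, works directly on the selected grid cells and also offers the INSERT #tmpTable, WHERE IN(), WHERE AND OR, and UPDATE variants.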
"} {"url": "https://blog.devart.com/14-new-odbc-drivers-for-cloud-data-warehouses-and-services-released.html", "product_name": "Unknown", "content_type": "Blog", "content": "14 New ODBC Drivers for Cloud Data Warehouses and Services Released By [DAC Team](https://blog.devart.com/author/dac) August 10, 2022 We are excited 
to announce the release of new ODBC Drivers. Devart ODBC drivers provide easy access to data from these services in various ETL, BI, reporting, and database management tools and programming languages on 32-bit and 64-bit Windows, Linux, and macOS. The drivers fully support standard ODBC API functions and data types and enable fast access to live data from anywhere. Cloud Data Warehouses [Azure Synapse Analytics](https://www.devart.com/odbc/sqlsynapse/) [QuestDB](https://www.devart.com/odbc/questdb/) [Snowflake](https://www.devart.com/odbc/snowflake/) Cloud CRM [Pipedrive](https://www.devart.com/odbc/pipedrive/) Communication [Slack](https://www.devart.com/odbc/slack/) Ecommerce [WooCommerce](https://www.devart.com/odbc/woocommerce/) Help Desk [Zendesk](https://www.devart.com/odbc/zendesk/) Marketing [ActiveCampaign](https://www.devart.com/odbc/activecampaign/) [EmailOctopus](https://www.devart.com/odbc/emailoctopus/) [Klaviyo](https://www.devart.com/odbc/klaviyo/) [Marketo](https://www.devart.com/odbc/marketo/) Payment Processing [Square](https://www.devart.com/odbc/square/) Project Management [Asana](https://www.devart.com/odbc/asana/) Other Applications [WordPress](https://www.devart.com/odbc/wordpress/) You can find a complete list of all ODBC drivers on our [website](https://www.devart.com/odbc/). 
"} {"url": "https://blog.devart.com/27-new-data-sources-supported-in-devart-ssis-components-2-0.html", "product_name": "Unknown", "content_type": "Blog", "content": "27 New Data Sources Supported in Devart SSIS Components 2.0! By [dotConnect Team](https://blog.devart.com/author/dotconnect) October 4, 2021 Devart is glad to announce the release of SSIS Data Flow Components 2.0 with support for 27 new data sources. We have added support for the Snowflake cloud data warehouse and a number of cloud applications, including several new inventory management solutions (such as DEAR Inventory and Zoho Inventory) and new cloud CRMs (such as Pipedrive and Insightly CRM). SSIS Data Flow Components now also support new cloud app categories, for example Ads and Conversion, with such widely used tools as Google Analytics, Google Ads, and Twitter Ads. See the full list of the sources below: Supported Cloud Data Warehouses [Snowflake](https://www.devart.com/ssis/snowflake.html) Supported Cloud CRMs [Freshworks CRM](https://www.devart.com/ssis/freshworks-crm.html) [HubSpot](https://www.devart.com/ssis/hubspot.html) [Insightly CRM](https://www.devart.com/ssis/insightly-crm.html) [NetSuite](https://www.devart.com/ssis/) [Pipedrive](https://www.devart.com/ssis/pipedrive.html) [Streak](https://www.devart.com/ssis/streak.html) Supported Ads & Conversion Applications [Google Ads](https://www.devart.com/ssis/google-ads.html) [Google Analytics](https://www.devart.com/ssis/google-analytics.html) [Twitter Ads](https://www.devart.com/ssis/twitter-ads.html) Supported Cloud Accounting Applications [Zoho Books](https://www.devart.com/ssis/zoho-books.html) Supported Cloud Marketing Applications [ActiveCampaign](https://www.devart.com/ssis/activecampaign.html) [EmailOctopus](https://www.devart.com/ssis/emailoctopus.html) [SendPulse](https://www.devart.com/ssis/sendpulse.html) Supported Communication Applications [Slack](https://www.devart.com/ssis/slack.html) Supported Ecommerce Applications [DEAR 
Inventory](https://www.devart.com/ssis/dear-inventory.html) [ShipStation](https://www.devart.com/ssis/shipstation.html) [Zoho Inventory](https://www.devart.com/ssis/zoho-inventory.html) [Zoho Invoice](https://www.devart.com/ssis/zoho-invoice.html) Supported Helpdesk Applications [Freshdesk](https://www.devart.com/ssis/freshdesk.html) [Zoho Desk](https://www.devart.com/ssis/zoho-desk.html) Supported Payment Processing Applications [Stripe](https://www.devart.com/ssis/stripe.html) Supported Project Management Applications [Asana](https://www.devart.com/ssis/asana.html) [Jira](https://www.devart.com/ssis/jira.html) [Podio](https://www.devart.com/ssis/podio.html) Other Applications [WordPress](https://www.devart.com/ssis/wordpress.html) [Zoho People](https://www.devart.com/ssis/zoho-people.html) 
"} {"url": "https://blog.devart.com/4-ways-to-test-an-odbc-connection.html", "product_name": "Unknown", "content_type": "Blog", "content": "How to Test an ODBC Connection: 4 Practical Methods By [DAC Team](https://blog.devart.com/author/dac) January 3, 2025 
Are you looking for different ways to test an [ODBC](https://blog.devart.com/oledb-vs-odbc-which-driver-to-choose.html) connection? In this article, we will discuss 4 easy ways to test connectivity and show how to do it with examples. In our case, the test environment is a laptop running Windows 11. MySQL 8, PostgreSQL 14, and the ODBC drivers for these databases are installed in advance. We will test the connection to different data sources using the following tools: ODBC Data Source Administrator (64-bit) Excel ODBC Query (DQY) File PowerShell Script .Net Code using C# Wrapping up So, let's dive in. Using ODBC Data Source Administrator (64-bit) You may already know this method. Here, we are going to access the Sakila MySQL sample database. But first, let's create a MySQL user with permissions on this database. Here's the code: USE sakila;\nCREATE USER 'edwin'@'localhost' IDENTIFIED BY 'e@10-4ward'; \nGRANT ALL ON sakila.* TO 'edwin'@'localhost'; To run this code, you can use any MySQL database administration tool. Then, you have to create the data source name (DSN). For this, run the ODBC Data Source Administrator (64-bit) and create a DSN using the MySQL Connector/ODBC or the [Devart ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) . Configure the DSN with the credentials we created above. Figure 1 . The ODBC DSN configuration for the Sakila database. Now, let's test this DSN. Click the Test Connection button as shown in Figure 2. You should see a success message. Figure 2 . Successful connection message for the Sakila database. But is there a way to test it outside the ODBC Data Source Administrator? Let's move on to the following method. ODBC Test Connection Using a DQY File This method is not just a connection test. A simple query will also run here, and its result will be displayed in Excel. This is quite handy if you have Excel installed on your computer. 
Before we proceed, you should install the following on your machine: Microsoft Excel PostgreSQL Sample database with the structure mentioned in the next subsection [Devart ODBC Driver for PostgreSQL](https://www.devart.com/odbc/postgresql/) Creating the DQY File and Testing a Connection We have a PostgreSQL sample database called sample . We added a table called person with the following structure: CREATE TABLE IF NOT EXISTS public.person\n(\n \"ID\" bigint NOT NULL,\n \"LastName\" character varying(20) COLLATE pg_catalog.\"default\" NOT NULL,\n \"FirstName\" character varying(20) COLLATE pg_catalog.\"default\" NOT NULL,\n \"MiddleName\" character varying(20) COLLATE pg_catalog.\"default\",\n \"birthDate\" date,\n CONSTRAINT person_pkey PRIMARY KEY (\"ID\")\n) Then we filled the table with the names of 4 actors from Marvel's Avengers. We also granted ourselves user permissions on the database. The User ID is edwin and the password is e@10-4ward . Now, let's test it with a DQY file. First, create an Excel ODBC Query (DQY) file: open any text editor and paste the code below. XLODBC\n1\nDRIVER=Devart ODBC Driver for PostgreSQL;Data Source=localhost;Database=sample;User ID=edwin;Password=e@10-4ward;Schema=public\nSELECT * FROM person; Save it to a file named test-postgres-connectivity.dqy . Note the DQY file extension. Before we let Excel open this file, let's describe what's in it. Line 1 : The file header. It must be XLODBC, or Excel will consider the file invalid or damaged. Line 2 : We couldn't find any documentation on this line. We tried typing 1, 100, X, and a blank space, and Excel didn't have any problem with it; but when the line is removed, Excel can't open the file. So, for simplicity, use 1. Line 3 : A connection string without a DSN (because we didn't set one up in the ODBC Data Source Administrator (64-bit)). In our example, it uses the driver, database, and credentials we prepared earlier. Line 4 : The query. 
You can use any query that is valid for the database. Now, let's run it: double-click the DQY file. Excel will open it and block the connection; if you continue by clicking Yes , the 4 records from the person table will be displayed. You will see the successful output as shown below. Figure 3 . The result of the successful ODBC test connection using a DQY file. ODBC Test Connection Using PowerShell Another way to test an ODBC connection is to use PowerShell. What you need: PowerShell The PowerShell script (see below) The Sakila-Connectivity-DSN we created earlier Testing the Connection Using the PowerShell Script Below is a simple PowerShell script that uses the same MySQL DSN from Figure 1. $conn = New-Object System.Data.Odbc.OdbcConnection(\"DSN=Sakila-Connectivity-DSN\")\n$conn.open()\n$cmd = $conn.CreateCommand()\n$cmd.CommandText = \"SELECT COUNT(*) AS RecordCount FROM actor\"\n$reader = $cmd.ExecuteReader()\n$reader.Read()\n$reader[0]\n$reader.Close()\n$conn.Close() The script above uses the .Net ODBC connection object and the Sakila-Connectivity-DSN . It connects to the Sakila database, counts the number of rows in the actor table, and displays the count using a DataReader. See the console output in Figure 4. Figure 4 . Using PowerShell to test ODBC DSN and MySQL connectivity. If there is any problem, it will show up in the console. Figure 4 shows 201 rows from the actor table, so we're good here. You can also turn this script into a function that takes a DSN and a table as parameters, but we'll leave that as an exercise. ODBC Test Connection Using .Net and C# Finally, the fourth method uses code. This may seem cumbersome, but sooner or later you may need to use the DSN in code anyway. To give you more value, the code below can test any DSN for any SQL database and will also count the rows in a table you specify. You can paste it into your own Visual Studio project. 
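(Before moving on, a quick aside on the DQY method described earlier: its four-line layout, the XLODBC header, a placeholder line, a DSN-less connection string, and a query, is easy to generate from a script. The Python helper below is a hypothetical sketch of ours; the connection string and query are the article's PostgreSQL examples.)

```python
def build_dqy(connection_string: str, query: str) -> str:
    # Assemble the four-line DQY layout: the XLODBC header, the
    # undocumented second line (1 works), a DSN-less ODBC connection
    # string, and the query to run.
    return "\n".join(["XLODBC", "1", connection_string, query]) + "\n"

dqy = build_dqy(
    "DRIVER=Devart ODBC Driver for PostgreSQL;Data Source=localhost;"
    "Database=sample;User ID=edwin;Password=e@10-4ward;Schema=public",
    "SELECT * FROM person;",
)
# Write the file; double-clicking it then opens Excel as described above.
with open("test-postgres-connectivity.dqy", "w") as f:
    f.write(dqy)
```
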
What you will need: Visual Studio 2022 with Windows Forms project template System.Data.Odbc NuGet package (you need to add this dependency to the project). A couple of DSNs you want to test. The Visual Studio Project and Code Look at the screenshot of the app below. Figure 5 . The ODBC DSN Test app. The lines of code are in the Test button click event. So, check it out below. private void btnTest_Click(object sender, EventArgs e)\n{\n // DSN and table are required\n if(txtDSN.Text != \"\" & txtTable.Text != \"\") \n {\n // create an ODBC connection\n var odbcConnection = new OdbcConnection(\"DSN=\" + txtDSN.Text);\n try\n {\n odbcConnection.Open(); // open the connection\n\n using (var cmd = odbcConnection.CreateCommand())\n {\n cmd.CommandText = @\"SELECT COUNT(*) \n as RecordCount FROM \" \n + txtTable.Text;\n // execute the query\n var reader = cmd.ExecuteReader(); \n\n reader.Read();\n if (reader.HasRows)\n {\n // show the result\n MessageBox.Show(\"Table \" \n + txtTable.Text \n + \" has \" + reader[0].ToString() \n + \" records\",\"Result\",\n MessageBoxButtons.OK, \n MessageBoxIcon.Information);\n }\n // close the DataReader and Connection\n reader.Close();\n odbcConnection.Close();\n }\n }\n catch (Exception err)\n {\n // display any runtime error\n MessageBox.Show(err.Message,\"Error\"\n , MessageBoxButtons.OK\n , MessageBoxIcon.Error);\n }\n finally\n {\n // dispose the connection object\n odbcConnection.Dispose();\n }\n }\n else\n {\n // show validation message\n MessageBox.Show(\"DSN and Table to Test are required!\"\n , \"Validation\"\n , MessageBoxButtons.OK\n , MessageBoxIcon.Exclamation);\n }\n} The code above also uses a try-catch-finally block. This will catch errors in case you tried a 32-bit DSN or a non-existent DSN, or entered a non-existent table or view. But why do we need to do a query? Isn’t a connection test enough? Let’s dive in. Testing with Limited Permissions What if you have limited access to a database and you don’t know it yet? 
Consider the permissions for another database. We will be using the same MySQL account as earlier (see Figure 1). Check out Figure 6. Figure 6 . Granting SELECT permission on one table and showing the results. As you can see in Figure 6, only the client table has SELECT permission. The SHOW GRANTS command shows the result of the GRANT SELECT. Now, we prepare the DSN, as shown in Figure 7. Figure 7 . DSN configuration for a database with limited permissions. Then, we test the connectivity, starting with the table that has SELECT permission. Check out Figure 8. Figure 8 . The app successfully read and displayed the number of rows in the client table. As expected, this goes well. Then, test the billing table, where there is no permission. See the result in Figure 9. Figure 9 . Permission denied on the billing table. As expected, this is not allowed. So, when you see an error like the one in Figure 9, it's time to talk to your database administrator and ask for more permissions. You can try this out with another 64-bit DSN. Wrapping up So, what is the point? Sometimes it pays to test the connection with a query: you can see whether you have table permissions before charging into coding.

| Method | Key steps | Output verification | Example database |
| --- | --- | --- | --- |
| ODBC Data Source Administrator (64-bit) | Create user permissions. Configure the DSN in the ODBC Administrator. Test the connection via the Test Connection button. | Success message displayed in the ODBC Administrator. | Sakila (MySQL) |
| Excel ODBC Query (DQY) file | Create a DQY file with a connection string and query. Open the file in Excel. Allow the connection when prompted. | Query results displayed in Excel (e.g., 4 records from the person table). | sample (PostgreSQL) |
| PowerShell script | Write a PowerShell script using an ODBC connection. Execute a query (e.g., a row count). Run the script in PowerShell. | Row count displayed in the PowerShell console (e.g., 201 rows in the actor table). | Sakila (MySQL) |
| .Net code using C# | Create a Windows Forms project in Visual Studio. Use an ODBC connection in code. Execute a query and handle errors. | Message box shows the row count or an error (e.g., 'Table has X records'). | Any SQL database |

Check the table above for the key steps of each method described in this article, and feel free to return to it whenever you need to test an ODBC connection. "} {"url": 
"https://blog.devart.com/5-simple-tips-for-postgresql-data-access-components-in-2022.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [How To](https://blog.devart.com/category/how-to) 5 Simple Tips for PostgreSQL Data Access Components in 2025 By [DAC Team](https://blog.devart.com/author/dac) June 15, 2022 [0](https://blog.devart.com/5-simple-tips-for-postgresql-data-access-components-in-2022.html#respond) 5901 In this article, we would like to introduce one of the rock-solid sets of data access components: PgDAC, the PostgreSQL Data Access Components. PgDAC is a library of components that delivers [native connectivity to PostgreSQL](https://www.devart.com/pgdac/) from C++Builder and Delphi (including Community Edition) as well as Lazarus. PgDAC is designed to help programmers create swift and clean PostgreSQL database-powered applications without deploying any additional libraries or using libpq.dll to connect. Furthermore, it provides direct access to PostgreSQL without a PostgreSQL client, presenting itself as an efficient alternative to standard connectivity solutions like the Borland Database Engine or dbExpress drivers. Additionally, PgDAC can work with PostgreSQL directly via TCP/IP without involving the client. PgDAC is not just about connecting to PostgreSQL databases: it covers a wide range of PostgreSQL features such as SSL connections, working with large objects, composite and geometric types, intervals, and notices. It also provides complete support for fast record insertion, async notifications, PostgreSQL sequences, and more. Delphi and C++Builder developers appreciate components and libraries that support cross-platform options.
All major GUI frameworks are supported by PgDAC, so you can develop FPC applications for Windows, macOS, Linux, iOS, Android, and FreeBSD on x64 and x86 platforms. PgDAC also allows the creation of cross-platform FMX solutions and LCL applications. Let's consider a real-world demo application. First, you need to [download](https://www.devart.com/pgdac/download.html) the PgDAC Trial and install it for evaluation purposes (with some limitations). In this demo, we used RAD Studio 11 Alexandria. After installing, you will see a new menu. If you click PgDAC, you can see information about it. Figure 1. Information about PgDAC. Let's create a new multi-device application with Delphi. Select the type of application; in our example, we have selected Blank Application. Click OK. Figure 2. Creating a new multi-device application. In the list under Palette, you can see the TPgConnection, TPgQuery, TPgTable, TPgDataSource, TPgUpdateSQL, and stored procedure components, which are the basic components for the PostgreSQL connection and fetching processes. Several other valuable components, like PgScript and PgMetadata, help you run scripts and obtain metadata about database objects. Figure 3. The basic components for the PostgreSQL connection and fetching processes. To connect to a local PostgreSQL database server, double-click the TPgConnection component and enter the connection information in the dialog box that appears (see Figure 5). Let's consider an example. In pgAdmin, we previously created a my_db database and a demo table named 'table'. Figure 4. A created demo table "table" in the my_db database in pgAdmin. To connect to this localhost server, enter its name in the PgConnection dialog box as shown in Figure 5. The username is postgres. Then enter the password and select the name of your database from the dropdown list. Click Connect and then OK. Figure 5. Creating a connection to a PostgreSQL database.
Now we have connected to our database, so when we run the application, it connects right away because we have already set the Connected property. Next, we need to write one line of code in the close event, so that when we close the application, it closes the database server connection. Figure 6. Killing the connection with the database server. Now let's select a PgQuery. In the Properties, PgConnection is already selected. In the SQL property, we're going to type some SQL code: SELECT * FROM "table" (note that table is a reserved word in PostgreSQL, so a table literally named 'table' must be double-quoted). After inserting the code, click OK. Then set the Active property to True in the Properties. Figure 7. SQL query activation. To write the SQL query, you can also use the Code Editor in the Object Inspector, as shown in Figure 8. Figure 8. Creating an SQL query using the PgQuery component. Now we have a data source component here; we just need to show the data to the user with our UI controls. Right-click and open the LiveBindings Designer, and as you can see, we already have the fields available. Figure 9. The LiveBindings Designer window. Click the LiveBindings Wizard and select Link a grid with a data source. Then click Next, select TStringGrid, click Next, and as an existing source, select PgQuery. Add a data source navigator if you need it, then close the wizard. Now we just need to make some changes. Right-click the grid and select Quick Edit. Choose the alignment and layout and close the window. The navigator should be inside the form, not inside the string grid component, so drag it to the right place. Make additional changes if you need to. Click Save and then Run. Figure 10. The result of the connection to the database. As you can see, we have connected to our database. You can also add new data by pressing +. In our example, we have added a row with the literal values a, b, and c. Figure 11. Adding new values. Click Save and Close. If you open pgAdmin, just select the data and click Run. As you can see, the updated information is available inside our demo table. Figure 12.
Updated information in the table in pgAdmin. To learn more about PostgreSQL Data Access Components, please check out the [webpage](https://www.devart.com/pgdac/), where you can find all the essential information along with extensive documentation. Tags [connect to postgresql](https://blog.devart.com/tag/connect-to-postgresql) [pgadmin](https://blog.devart.com/tag/pgadmin) [pgdac](https://blog.devart.com/tag/pgdac) [PostgreSQL](https://blog.devart.com/tag/postgresql) [what's new delphi dac](https://blog.devart.com/tag/whats-new-delphi-dac) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/a-look-back-at-pass-summit-2015.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) A Look Back at PASS Summit 2015 By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) November 5, 2015 [0](https://blog.devart.com/a-look-back-at-pass-summit-2015.html#respond) 2727 PASS Summit 2015 ended a little more than a week ago, and we are already looking forward to what's next in 2016. [PASS Summit](https://www.pass.org/default.aspx) is the world's largest conference for Microsoft SQL Server and BI professionals, where participants have a great chance to attend a wide variety of technical sessions presented by world-recognized experts and industry leaders. It is also a great opportunity to talk face to face with developers from leading software development companies and get the latest news firsthand. We were first-timers at the event, and we really enjoyed it! We would like to thank all participants who came to our exhibition booth, asked questions, and shared opinions! It was a great pleasure to talk to you and present our new features and products to you. We made friends with leading professionals from across the world, and we appreciate it! If you haven't had a chance to talk to us at the exhibition, please feel free to contact us via email or [social networks](https://www.facebook.com/DevartSoftware/). Our doors are always open! Our special thanks to [Mr. Al Shuler](https://www.linkedin.com/in/al-shuler-2404656), a Partner Development Manager at PASS, for his assistance in preparing for the event.
Tags [devart](https://blog.devart.com/tag/devart) [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart "} {"url": "https://blog.devart.com/a-new-look-of-differences-in-compare-bundle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) A New Look of Differences in Compare Bundle!
By [dbForge Team](https://blog.devart.com/author/dbforge) September 2, 2019 [0](https://blog.devart.com/a-new-look-of-differences-in-compare-bundle.html#respond) 4102 We are thrilled to announce a huge update of our SQL Server comparison tool pack, [dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/). The new versions of [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) and [dbForge Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/) include a number of new features that add extra convenience when working with data and schema differences. In addition, the core features of the tools have been significantly redesigned and improved to make your experience with them as good as it gets. dbForge Schema Compare for SQL Server Object Filter The new feature allows filtering objects right in the Comparison Document. Applying advanced filters makes the analysis of schema comparison results more effective, informative, and bespoke. The feature also allows applying multiple filters and creating custom filters with the Filter Editor, which can be saved and used for further comparisons. Redesigned Objects Text Diff Control The new coloring model for diff highlighting simplifies the analysis of schema comparison results and speeds up the whole schema comparison process. Redesigned Generate Comparison Report Window The window has been completely redesigned, with all the report generation options neatly regrouped. In addition, users can select how script diffs will look in their reports. Redesigned Schema Comparison Report in HTML HTML reports feature a new smooth design and have become far more informative: apart from information about objects, the HTML reports now include the actual script differences. Redesigned Schema Comparison SSMS Add-in Window The window has become more functional and features the Copy to Target, Swap, and Copy to Source buttons.
If you use the [dbForge Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/) add-in along with Schema Compare, you can select a certain Source Control repository as Source or Target. By the way, to learn how to involve dbForge Schema Compare in the CI process, feel free to watch [this video](https://youtu.be/hllTzoXvoO8). dbForge Data Compare for SQL Server Redesigned Data Compare Control Viewing exact data differences has become smoother and clearer: the grid tabs are now more informative, with crisper highlighting of data differences. Redesigned Data Comparison Report Window The window has been completely redesigned, with all the report generation options neatly regrouped. Redesigned Data Comparison Report in CSV Data Compare for SQL Server now generates several report files in CSV format. One of them contains summary results, and the rest contain specific data diff details. Redesigned Data Comparison SSMS Add-in Window The window has become more functional and features the Copy to Target, Swap, and Copy to Source buttons. The window also allows selecting a scripts folder as Source or Target. Tell Us What You Think [Download](https://www.devart.com/dbforge/sql/compare-bundle/download.html) the updated dbForge Compare Bundle and try out the new and updated features of our data and schema diff tools. We are looking forward to your [feedback](https://www.devart.com/dbforge/sql/compare-bundle/feedback.html) as we go on crafting fine tools for you. The updated dbForge Schema Compare for SQL Server and dbForge Data Compare for SQL Server are also available as components of [dbForge Developer Bundle for SQL Server](https://www.devart.com/dbforge/sql/developer-bundle/).
Tags [data compare](https://blog.devart.com/tag/data-compare) [developer bundle](https://blog.devart.com/tag/developer-bundle) [Schema Compare](https://blog.devart.com/tag/schema-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/absolute-trophy-champion-dbforge-studio-for-mysql-got-nine-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [MySQL
Tools](https://blog.devart.com/category/products/mysql-tools) Absolute Trophy Champion—dbForge Studio for MySQL Got Nine Awards By [dbForge Team](https://blog.devart.com/author/dbforge) January 24, 2022 [0](https://blog.devart.com/absolute-trophy-champion-dbforge-studio-for-mysql-got-nine-awards.html#respond) 2423 The festive season has come to an end, but Devart is still receiving presents. We have great news to share: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) has earned nine awards from G2, and its collection now counts nine new badges. According to G2 user reviews, dbForge Studio for MySQL attracts users with many features, among which are data and schema compare functionality, advanced code completion and refactoring tools, a visual query builder, robust database administration utilities, and much more. You can check all the reviews on the [dbForge Studio for MySQL Reviews & Product Details](https://www.g2.com/products/dbforge-studio-for-mysql-2018-12-04/reviews) page. [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a universal tool for managing, administering, and developing MySQL and MariaDB databases. It has everything you need to create and execute queries, develop and debug stored routines, automate database object management, compare and synchronize databases, analyze table data, and much more. Therefore, it is not surprising that users choose dbForge Studio for MySQL for their work. All these awards prove that the Devart team develops quality and reliable products.
Tags [Awards](https://blog.devart.com/tag/awards) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/add-sensitivity-classification-command-in-sql-server-2019.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server
Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) ADD SENSITIVITY CLASSIFICATION Command in SQL Server 2019 By [dbForge Team](https://blog.devart.com/author/dbforge) September 4, 2020 [0](https://blog.devart.com/add-sensitivity-classification-command-in-sql-server-2019.html#respond) 3564 For a database administrator, everyday practice involves running multiple operations targeted at ensuring database security and integrity. Thus, the sensitive data stored in the database shouldn't be overlooked under any circumstances. In light of this, we are excited to demonstrate the new ADD SENSITIVITY CLASSIFICATION command introduced in SQL Server 2019, which allows adding sensitivity classification metadata to database columns. What’s the big deal about data protection? Numerous types of applications store sensitive information: user data such as credit card numbers, passwords, healthcare information, IDs, and SSNs, as well as application data such as credentials, trade secrets, and certificates. A leak or breach of such information can lead to horrific consequences, as companies might be forced to pay millions of dollars in damage compensation to customers and financial institutions. To be compliant with regulations for personal data such as the GDPR or for healthcare data (HIPAA), companies need to adopt best practices of data security and protection. What the command offers The most sensitive data in your database mainly refers to business, financial, healthcare, or personal information. To establish a high level of data protection in your organization, the key steps are to discover the sensitive data and then to classify it. This is where the new command shows to its best advantage. Above all, it will help you meet the standards for data privacy and the requirements for regulatory compliance.
Additionally, with its help, you can implement several scenarios to monitor (audit) and alert on anomalous access to sensitive data. Finally, you will be able to toughen the security of databases containing highly sensitive data and manage access to them. Summarizing the above, one of the pivotal points in compliance practice is to know which data has to be secured, classify this data, give access only to the limited number of people allowed to view or modify it, and continuously monitor access to your sensitive data to know all access patterns. How the ADD SENSITIVITY CLASSIFICATION command works For starters, let me remind you that a similar feature—Data Discovery and Classification—was introduced in SSMS v17.5. Like the ADD SENSITIVITY CLASSIFICATION command, the SSMS wizard allows classifying data and labeling it with sensitivity tags. To learn the details and the differences between the two, refer to our article about [SQL Data Discovery and Classification in SSMS.](https://blog.devart.com/manage-your-sensitive-data-with-sql-data-discovery-and-classification-in-ssms.html) Let’s now talk about the ADD SENSITIVITY CLASSIFICATION command in greater detail. This section deals with the most important tasks related to discovering, classifying, and labeling columns that contain sensitive data in your database, along with viewing the current classification state of your database. Below are the three metadata attributes used in the classification of sensitive data: Label is the main classification attribute. Its task is to define the sensitivity level of the data stored in the column. You can mark your data as Public, General, Confidential, Highly Confidential, etc. Information Type gives an additional description of the type of data stored in the database column. It indicates the field your sensitive data refers to, whether it is Banking, Contact Info, Credentials, Financial, or the like.
Rank defines the sensitivity rank and ranges from none to critical, as shown below:

SQL syntax

To add sensitivity classification to a database object, simply apply the following syntax:

```sql
ADD SENSITIVITY CLASSIFICATION TO
    <object_name> [, ...n ]
    WITH ( <sensitivity_label_option> [, ...n ] )

<object_name> ::=
{
    [schema_name.]table_name.column_name
}

<sensitivity_label_option> ::=
{
    LABEL = string |
    LABEL_ID = guidOrString |
    INFORMATION_TYPE = string |
    INFORMATION_TYPE_ID = guidOrString |
    RANK = NONE | LOW | MEDIUM | HIGH | CRITICAL
}
```

Let's consider the following example: As well as that, SQL Server 2019 introduced the sys.sensitivity_classifications system catalog view, which returns information types and sensitivity labels. You can use it to manage database classifications and to generate reports, with the limitation that classification is supported only for columns. Use the following query to review all classified columns with their corresponding classifications:

```sql
SELECT
    SCHEMA_NAME(sys.all_objects.schema_id) AS SchemaName,
    sys.all_objects.name AS [TableName],
    sys.all_columns.name AS [ColumnName],
    [Label],
    [Information_Type]
FROM sys.sensitivity_classifications
LEFT JOIN sys.all_objects
    ON sys.sensitivity_classifications.major_id = sys.all_objects.object_id
LEFT JOIN sys.all_columns
    ON sys.sensitivity_classifications.major_id = sys.all_columns.object_id
    AND sys.sensitivity_classifications.minor_id = sys.all_columns.column_id
```

See the example output below: Database audit Database audit involves analyzing and tracking the activity of database users related to database security, access and usage, and data creation, change, or deletion. Auditing is an essential part of database security because oftentimes database administrators and consultants have to make sure the permission to access data is given only to those who need it and not otherwise. There's no denying that the most critical part of any organization is its data.
There can be many users who have permission to manipulate data, and it's extremely important that confidential and restricted data not be edited by unauthorized users. By applying the ADD SENSITIVITY CLASSIFICATION command, you can quickly and easily detect the most vulnerable data and classify it. After that, the classification state is added to the audit log, which helps monitor access to sensitive data for compliance and auditing purposes. ADD SENSITIVITY CLASSIFICATION in dbForge SQL Complete The Devart team is committed to keeping up with the latest changes in SQL Server, so support for the new command was added to SQL Complete with [the v6.6 release](https://blog.devart.com/embrace-data-protection-and-regulatory-requirements-with-sql-complete-suggestions.html). The tool is a superior solution for SQL database development, management, and administration with great auto-completion functionality, and support for the ADD SENSITIVITY CLASSIFICATION command is one of its beneficial updates. Let's see how the new feature works in SQL Server Management Studio (SSMS) with [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/). SQL Complete 6.6 (and later versions) enables you to easily classify database columns according to data sensitivity level by prompting sensitivity labels that show the vulnerability of the data in the database column. Aside from the sensitivity label, a column can have another attribute, Information Type, which provides additional granularity to the type of data stored in the database column. Again, quick and comprehensive prompts by SQL Complete significantly facilitate data classification. In the suggestion window, SQL Complete marks columns containing personal or confidential information according to the GDPR with black or red circles, depending on the sensitivity degree.
ADD SENSITIVITY CLASSIFICATION in dbForge Studio for SQL Server [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) —a comprehensive integrated development environment (IDE) designed for SQL Server database management and development—also obtained support for the ADD SENSITIVITY CLASSIFICATION command with the [release of v7.0](https://blog.devart.com/dbforge-tools-for-sql-server-7-0-big-release-overview.html) , bringing a new level of efficiency to data classification processes. This feature seamlessly integrates sensitivity label prompts into the completion list and quick info, enabling swift and accurate data classification directly within the SQL Server environment. By supporting the ADD SENSITIVITY CLASSIFICATION command, dbForge Studio for SQL Server v7.0 (and later versions) simplifies the process of SQL Server data classification. The tool prompts sensitivity labels and information types, allowing users to efficiently tag columns based on data sensitivity levels and information types. This enhances data security, compliance, and auditing capabilities, enabling organizations to better protect sensitive data and meet regulatory requirements. Summary On that note, let’s outline the main capabilities and advantages the new feature has to offer. The ADD SENSITIVITY CLASSIFICATION command is a powerful enhancement introduced in SQL Server 2019 that will improve database security and compliance with data protection rules. With its help, you can easily discover the columns that contain potentially sensitive data, create reports as well as add the classification metadata. By using the command, you are sure to facilitate database audit and bring access to your sensitive data under control. 
Tags [ADD SENSITIVITY CLASSIFICATION](https://blog.devart.com/tag/add-sensitivity-classification) [sql command](https://blog.devart.com/tag/sql-command) [sql complete](https://blog.devart.com/tag/sql-complete) [sql data classification](https://blog.devart.com/tag/sql-data-classification) [sql sensitivity classification](https://blog.devart.com/tag/sql-sensitivity-classification) [sql server 2019](https://blog.devart.com/tag/sql-server-2019) [dbForge Team](https://blog.devart.com/author/dbforge) "}
Adding Timestamp to a Filename

By [dbForge Team](https://blog.devart.com/author/dbforge) | January 23, 2015

Summary: This article describes how to add the time and date to a filename using the command line.

Sometimes it is crucial to append the time and date to the name of a file. For example, we might want a separate log file for each execution of a data synchronization script. To prevent files from being overwritten, we specify the date and time in the name of each log file.

Generally, the solution is quite simple: we use the %date% and %time% built-in variables, which return the date and time according to the regional settings. In our case, the variables returned the following values:

```
echo %date%
Thu 01/22/2015

echo %time%
16:20:08.80
```

In most cases we do not need the entire date and time as shown above. In our case, we want to output the date in MM/DD/YY format and the time in HH:MM:SS format. To achieve this, we create a .bat file and derive new variables with the SET command:

```
echo %date%
SET mm=%date:~4,2%
SET dd=%date:~7,2%
SET yy=%date:~12,2%
echo %time%
SET hh=%time:~0,2%
SET min=%time:~3,2%
SET ss=%time:~6,2%
```

Note: the digits after the tilde specify the starting position and the number of characters to extract. For example, %date:~4,2% means that we want two characters, starting after the fourth character (numbering begins from 0).
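As an aside not found in the original article, the same position-and-length extraction exists in bash as ${var:offset:length}, which may help relate the cmd syntax above to Unix-like shells:

```shell
# The same offset/length idea in bash (an aside, not part of the
# original Windows batch workflow): ${var:offset:length}
d="Thu 01/22/2015"
mm="${d:4:2}"    # two characters starting at index 4
dd="${d:7:2}"
yy="${d:12:2}"
echo "${mm}${dd}${yy}"   # prints 012215
```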
Here is what the command line returned after executing our .bat file:

```
SET mm=01
SET dd=22
SET yy=15
SET hh=18
SET min=09
SET ss=12
```

Now all we have to do is put it together in the required script after the /log argument:

```
/log:"D:\dbForge\logfile_%mm%%dd%%yy%_%hh%%min%%ss%.log"
```

The execution of the .bat file should produce a name like the following:

```
logfile_012215_181259.log
```

There is also an alternative way to perform the task without a .bat file and the SET command. In this case, you modify the system variables right in the script. For example:

```
/log:"D:\dbForge\logfile%date:~4,2%%date:~7,2%%date:~12,2%_%time:~0,2%%time:~3,2%%time:~6,2%.log"
```

To sum everything up, let's use the timestamp code in practice: we compare and synchronize data through the command line by means of [dbForge Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/) and add a timestamp to the log file name.
ADO.NET vs Dapper: Comparison Guide for .NET Developers

By [Dereck Mushingairi](https://blog.devart.com/author/dereckm) | April 29, 2025

In .NET, data access has evolved, but finding the right tool still comes down to control vs. convenience. You have to decide: do you prefer to write every query yourself, or move faster with something easier to maintain? Can you manage the boilerplate, or would you rather work with leaner syntax? For many .NET developers, the answers point to one of two popular tools: ADO.NET or Dapper.

ADO.NET gives you complete control, but with boilerplate and manual overhead. Dapper, on the other hand, simplifies development with less friction but offloads SQL, relationships, and state management to you. And for teams that want the flexibility to use both, tools like [dotConnect](https://www.devart.com/dotconnect/) unify access without sacrificing performance.

In this article, we compare ADO.NET and Dapper, covering performance, structure, and the complexity each tool expects you to manage. Let's dive in!

Table of contents

- What is ADO.NET?
- What is Dapper?
- Key differences between ADO.NET and Dapper
- ADO.NET vs Dapper: performance benchmarking
- Choosing between ADO.NET and Dapper
- How Devart's dotConnect can help you choose the right data access strategy
- Conclusion

What is ADO.NET?

[ADO.NET](https://www.devart.com/dotconnect/what-is-ado-net.html) (ActiveX Data Objects for .NET) is Microsoft's low-level framework for working directly with relational data sources. It's built into the .NET Framework and .NET Core, and it remains one of the most direct ways to connect to, query, and manipulate data in SQL Server, Oracle, MySQL, and other databases.

At its core, ADO.NET is about explicit control. You manage connections, craft your own SQL commands, and decide precisely how data flows between your application and the database. There's no middle layer trying to interpret your intent: what you write gets executed.

Here are its key building blocks:

- SqlConnection / OleDbConnection: these classes handle opening and closing the pipeline between your app and the database. You provide the connection string, and ADO.NET handles the handshake.
- SqlCommand: executes raw SQL queries or stored procedures. Supports parameterization for security and performance.
- SqlDataReader: a high-performance, forward-only cursor that reads one row at a time directly from the database stream. Lightweight and fast, which makes it well suited to read-heavy operations.
- DataSet / DataTable: these in-memory objects let you work with data in a disconnected state. Ideal when you need to manipulate or batch-process data before syncing back to the database.

Since the early 2000s, ADO.NET has powered countless enterprise applications, especially where performance, transactional precision, or complex queries are non-negotiable. If you're working on a .NET application that needs fine-tuned data access with no overhead, it's your best bet. But expect to write more code and own the logic end to end.

What is Dapper?
[Dapper](https://www.devart.com/dotconnect/what-is-dapper-orm.html) is a lightweight micro-ORM (Object-Relational Mapper) for .NET that simplifies data access while keeping you close to the SQL. Created by the team at Stack Overflow, it was designed to reduce the repetitive code required when using raw ADO.NET, without introducing the overhead of a full-featured ORM. While it doesn't provide features like change tracking or relationship management, Dapper is valued for what it does do: make ADO.NET-based code faster to write and easier to read, without hiding how your data is queried.

Dapper works by extending IDbConnection with a set of helper methods. The most commonly used are:

- .Query<T>() – executes a SQL query and maps the result to a list of objects.
- .Execute() – runs commands like INSERT, UPDATE, or DELETE without expecting a mapped result.

Instead of manually looping through a SqlDataReader, Dapper uses reflection to map results directly to C# objects, making your code cleaner and easier to maintain. Now, let's see how Dapper stacks up against ADO.NET in practice.

Key differences between ADO.NET and Dapper

Both ADO.NET and Dapper let you access data in .NET applications, but how you use them, and what you gain or give up, varies significantly. Let's take a closer look at the differences between [ADO.NET vs Dapper](https://blog.devart.com/best-postgresql-orm-in-dotnet.html).

Performance

ADO.NET is the baseline for performance. It talks directly to the database engine with zero abstraction. You manage everything: the connection, the commands, the data reader. If you're optimizing down to milliseconds, or processing large datasets in tight loops, ADO.NET gives you the edge.

Dapper sits on top of ADO.NET, adding lightweight mapping via reflection. While that sounds expensive, in practice the overhead is minimal. For most use cases, like loading API data or querying business logic, Dapper is fast enough.
And the time you save writing and maintaining code usually outweighs the small performance trade-off.

Flexibility and control

ADO.NET gives you full, low-level control. You can stream massive result sets, build dynamic SQL on the fly, wrap multi-step operations in fine-grained transactions, or tap into any corner of the database engine you need. Nothing is hidden. Nothing is automatic. That's a strength, if you're willing to manage the complexity.

Dapper simplifies things. It's built for clean, predictable queries with straightforward mappings. You still write SQL and manage transactions, but when things get dynamic, like custom mappings or complex multi-type joins, you may need workarounds or to drop back to raw ADO.NET.

Ease of use

ADO.NET is verbose. Even basic queries involve multiple steps: setting up connections, commands, readers, and manual mapping. It's powerful, but repetitive, and that repetition is where bugs tend to hide.

Dapper reduces most of that to one or two lines. It lets you focus on what you want from the database, not how to get it out. Instead of building everything from scratch, you use intuitive methods like .Query() and .Execute(). It's easier to write, easier to read, and easier to maintain over time, especially in projects with a lot of standard data access logic.

Pro tip: using Dapper with tools like [dotConnect](https://www.devart.com/dotconnect/) gives you extra productivity boosts, such as multi-database support and better developer tooling, without sacrificing simplicity.

ADO.NET vs Dapper in code: querying a list of players

Here's a quick example that highlights the difference in verbosity and developer experience between ADO.NET and Dapper.

```csharp
// Define the Player model used in both examples
public class Player
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```

ADO.NET code

This approach manually connects to a database, runs a SQL query, reads each row, and maps the results to a list of Player objects.
```csharp
using (var connection = new SqlConnection(connectionString))
{
    var command = new SqlCommand("SELECT Id, Name FROM Players", connection);
    connection.Open();
    var reader = command.ExecuteReader();
    var players = new List<Player>();
    while (reader.Read())
    {
        players.Add(new Player
        {
            Id = reader.GetInt32(0),
            Name = reader.GetString(1)
        });
    }
}
```

Dapper code

This does the same thing in a single line, thanks to Dapper's extension methods and automatic object mapping.

```csharp
using (var connection = new SqlConnection(connectionString))
{
    var players = connection.Query<Player>("SELECT Id, Name FROM Players").ToList();
}
```

ADO.NET vs Dapper: feature comparison

| Aspect | ADO.NET | Dapper | Which is better? |
|---|---|---|---|
| Performance | Direct, zero abstraction. Ideal for high-speed, low-level operations. | Near-native performance for most use cases. Minor overhead from mapping. | ADO.NET – slightly faster in micro-optimized scenarios |
| Bulk insert handling | Efficient via SqlBulkCopy or custom logic. | Slower for large batch inserts unless paired with third-party libraries. | ADO.NET – better suited for high-volume inserts |
| Performance consistency | Stable and predictable across workloads. | Generally fast, but may show slight variability in edge cases. | ADO.NET – more consistent under all workloads |
| Ease of use | Verbose; requires manual setup and teardown for every query. | Minimal boilerplate. Clean, intuitive syntax. | Dapper – much easier and faster to implement |
| Flexibility & control | Full access to SQL, transactions, and streaming. | Great for straightforward queries; more complex scenarios may need workarounds. | ADO.NET – greater control for edge-case operations |
| Scalability | Ideal for complex, high-throughput enterprise systems. | Scales well for typical services; may need tuning for large workloads. | Neutral – depends on architecture |
| Learning curve | Steeper; requires understanding of all DB interactions. | Easier for SQL-savvy developers; less overhead to get started. | Dapper – better for onboarding and fast prototyping |
| Object mapping | Manual; developers must map fields to objects. | Automatic via reflection. | Dapper – cleaner and faster for mapping |
| Ecosystem / extensions | Built-in, stable core API. | Depends on third-party libraries for advanced features like mapping or bulk ops. | ADO.NET – more complete out of the box |
| Ideal for | Legacy systems, data-heavy platforms, performance-critical workloads. | CRUD-heavy APIs, microservices, rapid development environments. | Neutral – based on project type |

ADO.NET vs Dapper: performance benchmarking

When performance is a deciding factor, numbers matter more than assumptions. To get a clearer picture, developer Matthew Jones ran [benchmark tests](https://exceptionnotfound.net/dapper-vs-entity-framework-vs-ado-net-performance-benchmarking/) comparing [Dapper vs ADO.NET](https://blog.devart.com/ado-net-vs-entity-framework.html) using identical SQL Server queries across 10 runs. The goal: to measure how fast each method handled realistic database operations, without optimizing for one over the other.

The test scenarios included:

- Player by ID – a basic lookup using a primary key.
- Players per Team – a filtered list query.
- Teams per Sport – a more complex join involving multiple entities.

Here are the Dapper vs ADO.NET benchmarking results:

| Query type | ADO.NET | Dapper |
|---|---|---|
| Player by ID | 0.013 ms | 0.047 ms |
| Players per Team | 1.03 ms | 1.01 ms |
| Teams per Sport | 8.84 ms | 7.94 ms |

Query execution speed

ADO.NET remains the gold standard for raw performance, especially for low-level, repetitive operations where microseconds matter. However, Dapper comes extremely close, and in more complex queries like Teams per Sport it actually outperformed ADO.NET. This may be due to the use of SqlDataAdapter in ADO.NET versus Dapper's more efficient query handling and mapping pipeline. Still, the Dapper vs ADO.NET performance difference is often negligible.
In most real-world applications, Dapper delivers 95% of the speed with far less code, a trade-off many developers are happy to make.

Overhead and memory efficiency

Both tools are lightweight in terms of memory usage. ADO.NET gives you full control, meaning memory usage is entirely in your hands. Dapper adds a layer of object mapping using reflection, but it's highly efficient. It's worth noting that Dapper may incur a small performance hit on its first use, similar to EF and other ORMs, due to the initialization of its internal mapping logic. After the initial run, performance stabilizes and remains fast.

Scalability

When dealing with massive data loads, bulk imports, or streaming, ADO.NET gives you every knob and lever for precise tuning. Dapper, while slightly less flexible, handles high-throughput APIs and service-layer queries very well, especially when query patterns are clear and predictable.

In practice, ADO.NET excels in performance-critical systems like financial apps, ETL pipelines, and real-time analytics engines. Dapper, on the other hand, is a great fit for CRUD-heavy APIs, microservices, and internal business apps where clean code and development speed matter more than ultra-fine performance tuning.

Pro tip: benchmark results only tell part of the story. For better ADO.NET performance, use SqlDataReader instead of SqlDataAdapter, and consider providers like dotConnect to further optimize execution and connectivity.

Choosing between ADO.NET and Dapper

Choosing a data access strategy in .NET is less about tooling preferences and more about trade-offs between control, performance, maintainability, and delivery speed. Both ADO.NET and Dapper are high-performance options, but they solve different problems. The right choice depends on your system architecture, team skills, and how much abstraction you're prepared to manage.

When to use ADO.NET

ADO.NET is the most direct way to work with a database in .NET.
It's low-level, explicit, and designed for scenarios where abstraction introduces risk. Use ADO.NET when:

- Performance tuning is non-negotiable: avoids reflection and runtime overhead; ideal for systems that demand predictable speed at scale.
- Complex transactions need full control: supports custom transaction boundaries, savepoints, and fine-tuned rollback logic.
- Legacy integration is required: extends existing ADO.NET-heavy systems without introducing unnecessary friction.
- You handle massive datasets or streaming workloads: offers low-level control over how data is read, processed, and optimized for scale.

Yes, it's verbose. But in performance-critical systems, verbosity is often the cost of precision. ADO.NET doesn't make assumptions; it gives you control. You want control? You take responsibility. That said, manual mapping can become brittle: any schema change can break assumptions, leading to runtime issues if column indexes or types are mismatched. Maintenance can quickly become tedious in fast-evolving systems.

When to use Dapper

Dapper is built for developers who want raw performance with minimal code. It strips away ceremony while staying close to the database. Use Dapper when:

- Codebase simplicity is a goal: reduces boilerplate with concise syntax and intuitive mapping.
- SQL is written and managed directly: gives full transparency and control to SQL-fluent teams.
- Endpoints require fast execution: perfect for microservices and APIs where speed matters.
- ORM features are unnecessary: no tracking, no lazy loading; just data in, data out.
- Development timelines are aggressive: quick to implement, easy to maintain.

Dapper isn't just for small apps. It's well suited to mid-sized systems where performance, clarity, and onboarding speed all matter. It does one thing exceptionally well: map query results to objects fast, and then get out of your way. Just be aware that Dapper may introduce a slight warm-up cost on first query execution due to reflection setup.
After that, it runs fast. And while it handles common mapping needs smoothly, more complex use cases (like custom mappings or bulk inserts) may require third-party libraries such as Dapper.FluentMap or Dommel. Also, because Dapper relies on raw SQL, it's up to the developer to manage query correctness, parameter safety, and database alignment. There's no compile-time validation; you find out what breaks at runtime.

How to choose (without overthinking it)

Choose ADO.NET if your project demands full control over data access, must integrate with legacy systems, or operates under strict performance constraints. Choose Dapper if you want performance close to ADO.NET but need to develop quickly and maintain clean, readable data access code.

In practice, many teams combine both. Use [Entity Framework](https://www.devart.com/dotconnect/what-is-entity-framework-core.html) for everyday workflows, Dapper for speed-sensitive queries, and ADO.NET where absolute control is required. The goal is not loyalty to a tool; it's using the right abstraction for the task in front of you.

How Devart's dotConnect can help you choose the right data access strategy

Choosing between ADO.NET and Dapper often comes down to balancing performance, flexibility, and long-term maintainability. But what if you didn't need to choose just one? dotConnect makes that possible: it's an [ADO.NET provider](https://www.devart.com/dotconnect/) that enhances both ADO.NET and Dapper by offering a unified, professional-grade data access solution with advanced features. Here's what makes it work for both sides.

Built for versatility

dotConnect extends ADO.NET with advanced features while offering native support for Dapper right out of the box. This dual compatibility means development teams don't have to trade speed for structure, or limit themselves to a single methodology.
- For ADO.NET users: dotConnect adds asynchronous support, improved design-time tools, and compatibility with popular ORMs like Entity Framework and LINQ to DataSet, making low-level access easier to scale and maintain.
- For Dapper users: dotConnect ensures stable, high-performance connections across multiple database systems, with optimized drivers that simplify cross-database development.

One API, multiple databases

With support for PostgreSQL, Oracle, MySQL, SQLite, SQL Server, and more, dotConnect lets teams standardize their data access stack, even in environments with multiple backends. This reduces integration overhead and accelerates onboarding for new developers.

Why it matters

In real-world .NET applications, it's rarely a question of either/or. Many teams use Dapper for rapid querying and ADO.NET for edge cases requiring full control. dotConnect allows those choices to coexist in a single, cohesive strategy, backed by stable, commercial-grade data providers.

Try it yourself! Download a free trial of [dotConnect](https://www.devart.com/dotconnect/) and explore how it can simplify your data access strategy, regardless of whether you're building for performance, simplicity, or both.

Conclusion

The ADO.NET vs Dapper decision comes down to what fits your project, not which tool is objectively better. Choose ADO.NET when full control, transaction precision, or legacy integration is key. Choose Dapper for fast, maintainable data access in performance-focused, SQL-driven apps. No matter your choice, your data provider plays a critical role. Tools like dotConnect enhance both ADO.NET and Dapper with multi-database support, better connectivity, and smooth integration across projects. Ready to optimize your .NET data layer? [Try dotConnect for free](https://www.devart.com/dotconnect/) and build with more speed, flexibility, and control.

Frequently Asked Questions

How does Dapper compare to ADO.NET when handling complex queries and large datasets?
Dapper can handle complex queries efficiently, but ADO.NET offers greater control for scenarios that involve streaming large datasets, managing memory, or executing multi-step operations. ADO.NET gives developers more flexibility at the cost of added complexity and boilerplate.

In what scenarios is it more beneficial to use Dapper over ADO.NET?

Dapper is ideal for performance-focused applications that rely on straightforward SQL operations, like APIs, microservices, or internal tools. It significantly reduces boilerplate and accelerates development without compromising speed, making it more practical for small to mid-sized projects.

Can Dapper be used alongside ADO.NET in the same project, and if so, how?

Yes. Dapper is built on top of ADO.NET and works naturally in combination with it. You can use Dapper for fast object mapping in most cases while falling back to raw ADO.NET when full control is required, for example in complex transactions or data streaming scenarios.

Are there specific use cases where ADO.NET outperforms Dapper?

ADO.NET outperforms Dapper when low-level database control is needed, such as streaming massive datasets, handling complex transactions, or fine-tuning command execution. It's also preferred in legacy systems where consistency with existing patterns is important.

How does Dapper handle database connections differently compared to ADO.NET?

Dapper relies on the same connection objects as ADO.NET (SqlConnection, etc.), but it abstracts much of the repetitive work. Developers still manage connection lifecycles, but Dapper simplifies command execution and object mapping, reducing manual effort.

How does integrating dotConnect with Dapper affect the performance and scalability of data-driven applications?

dotConnect provides optimized, native database drivers that improve connection stability and query performance, especially under load.
When paired with Dapper, this results in faster execution and greater scalability across high-throughput applications.

How does dotConnect's support for multiple databases compare to Dapper's compatibility with different data sources?

Dapper works with any ADO.NET-compatible provider but doesn't offer built-in support for advanced features across different databases. dotConnect addresses this by offering a consistent API and full-featured support for popular databases like [PostgreSQL](https://www.devart.com/dotconnect/postgresql/), [Oracle](https://www.devart.com/dotconnect/oracle/), [MySQL](https://www.devart.com/dotconnect/mysql/), and more, simplifying cross-database development.
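To illustrate the point made in the FAQ above, that Dapper extends the same connection objects ADO.NET uses, here is a minimal hedged sketch of mixing the two on one connection. The Players table and the connection string are hypothetical placeholders:

```csharp
using System.Data.SqlClient;
using System.Linq;
using Dapper;

// Hypothetical sketch: Dapper and raw ADO.NET sharing one SqlConnection.
class MixedAccess
{
    static void Run(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Dapper for the quick, mapped query...
            var names = connection.Query<string>("SELECT Name FROM Players").ToList();

            // ...and raw ADO.NET where full control is required.
            using (var command = new SqlCommand("SELECT COUNT(*) FROM Players", connection))
            {
                var count = (int)command.ExecuteScalar();
            }
        }
    }
}
```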
What is the Difference Between ADO.NET and Entity Framework Core: Comparison Guide

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) | January 3, 2025

When developers start working on an application (whether web or desktop), they spend a long time thinking about the backend database, its tables, and
their relationships, stored procedures, views, and so on. Along with this, they also need to consider the data schema that the backend will return to the application. For this type of work we can choose among many frameworks, such as DAO, RDO, ADO, ADO.NET, and Entity Framework. Of these, developers most commonly use [Entity Framework Core or ADO.NET](https://blog.devart.com/entity-framework-core-spatials-support-in-dotconnect-ado-net-providers.html), depending on project requirements. After reading this article, you will have a clear understanding of the following topics, supported with clear examples.

Table of contents

- What is ADO.NET?
- Advantages of ADO.NET
- Overview of ADO.NET Architecture
- What is the Entity Framework?
- Advantages of Entity Framework Core
- When should you use ADO.NET or Entity Framework Core?
- Entity Framework Core architecture
- What is the difference between ADO.NET and Entity Framework Core
- Entity Framework vs ADO.NET comparison table
- Benchmark tests: ADO.NET vs Entity Framework Core performance
- How dotConnect supports ADO.NET and Entity Framework Core
- Conclusion

What is ADO.NET?

ADO.NET was created by Microsoft as part of the .NET Framework. With this technology, we can access virtually any type of data source from our applications and fetch the data into our C# and VB.NET code. ADO.NET is a collection of object-oriented classes that provide a rich set of data components; with their help, we can create high-performance, reliable, and scalable database applications.

In the ADO.NET model, the application connects to the data source only when it needs to read or update data. This has a major impact on application development, because in client-server or distributed applications, keeping a connection open all the time is one of the most resource-consuming practices. In reality, we don't need to be connected to a data source all the time.
We only need to connect when we are reading from or writing to the data source. With ADO.NET, we can use SQL queries and stored procedures to perform read, write, update, and delete operations on a data source. We issue SQL through ADO.NET Command objects, which return data in the form of DataReader or DataSet objects. After the connection closes, we can continue working with the DataSet locally and reconnect to the data source only when we need to push updates back. A DataSet is a container for multiple DataTable objects, and the data tables can have relationships among them.

We access the data source and fill the DataSet with the help of data providers. The .NET Framework ships with several data providers, including the SQL Server, OLE DB, and ODBC providers.

XML plays a major role in ADO.NET. The ADO.NET model uses XML to cache data and to transfer it among applications; DataSets use XML schemas to store and exchange data. We can even consume this XML from other applications without interacting with the actual DataSet. Because XML is an industry standard, data can be shared across all kinds of applications and components, and its text-based nature lets us transfer data over many protocols, such as HTTP.

Advantages of ADO.NET

ADO.NET provides many advantages over previous Microsoft data access technologies such as ADO. Some of the most important are as follows:

Single object-oriented API – ADO.NET features a single object-oriented collection of classes. It provides different data providers for different types of data sources, but the programming model is the same for all of them.
So if we implement our code against one data provider and later need to switch to another, we do not need to change the entire data access layer; usually only the class names and connection strings change. Managed Code – The ADO.NET classes are managed classes. They take full advantage of the .NET CLR, such as language independence and automatic resource management. All .NET languages access the same API, so if we know how to use these classes in C#, we will have no problem using them in VB.NET. Another big advantage is that we don't have to worry about allocating and freeing memory: the CLR takes care of it for us. Deployment – Writing database applications with ODBC, DAO, and other earlier technologies and deploying them to client machines used to be a real problem. ADO improved the situation somewhat, except that different versions of MDAC still had to be managed. Now we don't have to think about that: installing the redistributable .NET components takes care of it. XML Support – XML is an industry-standard format and the most widely used way of sharing data among applications over the Internet. In ADO.NET, data is cached and transferred in XML format, so it can be shared with other components and transferred via different protocols, such as HTTP. Performance and Scalability – When developing a web-based application, we always focus on two major concerns: performance and scalability. Transferring data across the Internet is costly due to connection bandwidth limitations and rapidly increasing traffic; using disconnected data cached as XML addresses both of these problems. DataReader versus DataSet – The ADO.NET DataReader retrieves data from a database in read-only mode (it cannot write data back to the data source) and forward-only mode (it cannot read backward or randomly).
We create a DataReader by calling Command.ExecuteReader after creating an instance of the Command object. LINQ to DataSet – The LINQ to DataSet API provides query capabilities over a cached DataSet object using LINQ queries written in C#. LINQ to SQL – The LINQ to SQL API provides queries against relational databases without a middle-layer database library. ADO.NET architecture Microsoft designed ADO.NET so that we can perform operations against different kinds of data sources in the same fashion. For simplicity, we can divide the ADO.NET components into three categories: the disconnected components, the common (shared) components, and the .NET data providers. The disconnected components are built on the ADO.NET architecture and can be used with or without data providers; for example, we can use a DataTable object with or without a provider. The common (shared) components are the base classes for all data providers. The ADO.NET architecture diagram below shows the component model and how the pieces work together. A data provider is a set of components such as Connection, Command, DataAdapter, and DataReader. The Connection is the first component that talks to a data source: with a Connection object, we establish a connection between the application and the data source, and that connection is then referenced by the Command and DataAdapter objects. A Command object executes SQL queries and stored procedures to read, add, update, and delete data in a data source, either directly or via a DataAdapter. A DataAdapter is the bridge between a DataSet and the connection. All data providers share the ADO.NET common components; components like DataSet, DataView, and DataViewManager represent the data on the ADO.NET side.
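The disconnected components described above can be exercised without any database at all. A minimal sketch follows (the table, columns, and data values are made up for illustration); a DataTableReader gives the same forward-only, read-only access pattern that a provider's DataReader offers against a live data source:

```csharp
using System;
using System.Data;

class DisconnectedDemo
{
    static void Main()
    {
        // Build an in-memory DataTable -- no connection or provider involved.
        var customers = new DataTable("Customers");
        customers.Columns.Add("Id", typeof(int));
        customers.Columns.Add("Name", typeof(string));
        customers.Columns.Add("Country", typeof(string));
        customers.Rows.Add(1, "Alice", "DE");
        customers.Rows.Add(2, "Bob", "US");
        customers.Rows.Add(3, "Carol", "DE");

        // A DataSet is a container for one or more DataTable objects.
        var ds = new DataSet("Crm");
        ds.Tables.Add(customers);

        // Forward-only, read-only iteration, as with a provider DataReader.
        using (DataTableReader reader = customers.CreateDataReader())
        {
            while (reader.Read())
                Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
        }

        // A DataView exposes a filtered, sorted view over the same rows.
        var germans = new DataView(customers) { RowFilter = "Country = 'DE'" };
        Console.WriteLine($"German customers: {germans.Count}");
    }
}
```

In a real application, a provider's DataAdapter would fill the DataSet from a data source; everything after that point works exactly as in this in-memory sketch.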
The DataSet normally uses an XML schema to exchange data between applications and data providers. A DataSet is a collection of DataTable objects, and a DataTable represents a database table. We can present one or more views over a DataSet with the DataView and DataViewManager objects and, if required, bind a DataView or DataViewManager directly to data-bound controls like DataGrid or DataList. What is Entity Framework Core? [Entity Framework Core](https://www.devart.com/dotconnect/what-is-entity-framework-core.html) (EF Core) is a cross-platform, open-source ORM (Object-Relational Mapper) for .NET applications, developed by Microsoft. It simplifies data access by allowing developers to work with C# objects instead of writing raw SQL queries. This abstraction eliminates the need to interact directly with database tables or columns, making data operations more intuitive and efficient. As shown in the image below, EF Core sits between the application's business logic (domain classes) and the database. It automatically maps data between objects and relational tables, handling retrieval, updates, and persistence without manual SQL queries, so developers can work with data using object-oriented principles. Advantages of Entity Framework Core Now, let's explore the advantages of Entity Framework Core and how it works. It offers significant improvements over earlier Microsoft-based data access technologies like ADO.NET.
Some of the key benefits include: Modeling – Uses POCO (Plain Old CLR Objects) to define entity models that map to database tables, allowing a code-first approach without requiring an EDMX file. Querying – Supports LINQ-to-Entities for querying data, translating expressions into database-specific SQL. Also allows executing raw SQL queries for advanced database operations. Change Tracking – Automatically tracks modified properties in entity instances, ensuring that only changed data is updated in the database. Saving Data – Executes INSERT, UPDATE, and DELETE operations, applying changes when SaveChanges() or SaveChangesAsync() is called. Concurrency Control – Implements optimistic concurrency to prevent overwriting conflicts when multiple users modify data simultaneously. Transaction Management – Supports automatic transactions for consistent database operations and allows custom transaction handling when needed. Built-in Conventions – Uses a convention-over-configuration approach to reduce setup time, with Fluent API and Data Annotations available for customization. Migrations – Provides built-in migration commands via the .NET CLI or NuGet Package Manager Console, enabling database schema changes without manual SQL scripts. Caching – Implements first-level caching within a DbContext instance, preventing redundant queries. EF Core does not include second-level caching by default. With these capabilities, EF Core streamlines database operations while maintaining flexibility and scalability, making it a preferred ORM for modern .NET applications. When should you use ADO.NET or Entity Framework Core? Choosing the right technology depends on your application’s performance needs, maintainability, and scalability. While ADO.NET provides fine control over database operations, Entity Framework Core (EF Core) offers a more developer-friendly ORM approach. When to use ADO.NET? 
ADO.NET is the best choice in the following scenarios: High-performance applications that require low latency and direct database access. Large-scale data processing where precise control over queries and stored procedures is necessary. Legacy systems already built on ADO.NET, making migration to ORM unnecessary. Scenarios requiring direct SQL execution, such as working with bulk inserts, transactions, and stored procedures. When to use Entity Framework Core? EF Core is a better fit if: Minimizing manual SQL queries accelerates development. Cross-database compatibility is required, as EF Core supports multiple providers. An object-oriented approach to data management is preferable to writing raw SQL. Scalability and maintainability are priorities, with LINQ and entity tracking simplifying data access. Hybrid approaches: Using ADO.NET and Entity Framework Core together A hybrid approach is beneficial in scenarios where: EF Core is used for general CRUD operations and rapid development. ADO.NET is used for performance-critical database tasks, such as bulk data operations and transaction-heavy workloads. dotConnect is leveraged to provide optimized database connectivity for both ADO.NET and EF Core. Entity Framework Core architecture Let’s give a short overview of the different components available under the Entity Framework Core architecture: POCO Entities – EF Core uses Plain Old CLR Objects (POCOs) to define entity models, mapping them directly to database tables using Fluent API or Data Annotations. DbContext & DbSet – The DbContext class acts as the main entry point for querying and saving data. It contains DbSet properties that represent database tables as collections of entities. LINQ-to-Entities (L2E) – Developers use LINQ queries to retrieve, filter, and manipulate data. EF Core translates these LINQ expressions into SQL queries optimized for the underlying database provider. 
Change Tracking – EF Core automatically tracks modifications to entity instances, ensuring only necessary changes are persisted to the database. Transactions & Concurrency Control – EF Core supports automatic transactions and uses Optimistic Concurrency Control to prevent data conflicts. Database Providers – EF Core is database-agnostic, supporting SQL Server, PostgreSQL, MySQL, SQLite, and more. It communicates with databases through custom EF Core database providers. Migration & Schema Management – EF Core allows schema changes using Migrations, enabling database creation, updates, and modifications via CLI or NuGet Package Manager. Raw SQL Support – While EF Core primarily relies on LINQ, it also allows executing raw SQL queries when necessary for performance optimization. This architecture makes EF Core more lightweight, flexible, and efficient than previous versions, making it the preferred ORM for modern .NET applications. The following figure shows the overall architecture of the Entity Framework Core. To expand your possibilities, you can use this powerful [Entity Framework Designer](https://www.devart.com/entitydeveloper/) that automates the process. What is the difference between ADO.NET and Entity Framework Core In this chapter, we will explore the differences between traditional ADO.NET and Entity Framework Core. While both technologies enable database interactions, ADO.NET provides direct control over SQL execution, whereas EF Core abstracts database operations through an ORM approach. Despite their shared purpose, ADO.NET and EF Core differ significantly in performance, flexibility, and ease of use. Below are some of the key [ADO.NET and Entity Framework Core differences](https://blog.devart.com/working-with-data-transfer-objects-in-asp-net-core.html) . Performance ADO.NET is faster because it connects directly to the database without ORM overhead. EF Core translates LINQ queries into SQL, which adds some overhead but improves maintainability. 
ADO.NET is ideal for high-performance applications, while EF Core balances performance with ease of use. Flexibility ADO.NET provides full control over queries, allowing developers to optimize SQL execution manually. EF Core abstracts database operations, making data access simpler but less flexible. EF Core supports raw SQL execution, but ADO.NET remains the better choice for stored procedures and complex queries. Development speed ADO.NET requires manual coding for database interactions, increasing development time. EF Core generates models and relationships automatically, reducing the effort of setting up the data access layer. EF Core is preferred for rapid development, while ADO.NET is better for fine-tuned, custom data access solutions. Code maintainability Debugging ADO.NET requires tracking database queries manually. EF Core maintains entity relationships using Fluent API and Data Annotations, simplifying code structure. EF Core supports automatic schema migrations, making long-term maintenance easier. ADO.NET and Entity Framework have related but quite different feature sets. The comparison table below makes them easier to compare side by side and answers common questions about them (e.g., "Does Entity Framework use ADO.NET?"). Entity Framework vs ADO.NET comparison table Below, we highlight the differences between [Entity Framework vs. ADO.NET](https://blog.devart.com/key-difference-between-ado-net-and-asp-net.html) :

| # | ADO.NET | Entity Framework |
|---|---------|------------------|
| 1 | Establishes a direct connection between relational or non-relational systems and applications. | Provides an object-relational mapping framework on top of the ADO.NET architecture. |
| 2 | Connects directly to the database. | First translates LINQ queries into raw SQL queries, then executes them against the database. |
| 3 | Provides complete control over the data access layer, allowing classes and methods to be built from scratch. | Automatically creates the data model classes and their related database context class. |
| 4 | Debugging is cumbersome, as it requires navigating from the application layer down to the database layer. | Provides a clear relationship between the different data model classes. |
| 5 | More flexible for raw SQL queries and stored procedures because it offers greater control over the database. | Less flexible because it depends on LINQ queries, which return data as entity model types. |

Takeaway: The [difference between Entity Framework and ADO.NET](https://blog.devart.com/net-core-top-practices-for-developers.html) comes down to performance vs. convenience. ADO.NET offers direct control and faster execution, while Entity Framework simplifies data management but adds overhead. Benchmark tests: ADO.NET vs Entity Framework Core performance Performance is critical in database operations, particularly in applications that handle large datasets, real-time processing, or transaction-heavy workloads. To understand ADO.NET vs Entity Framework performance, we refer to a benchmarking study conducted by [Exception Not Found](https://www.exceptionnotfound.net/dapper-vs-entity-framework-vs-ado-net-performance-benchmarking/) , which measured execution times under controlled conditions for three types of queries: fetching a player by ID (single-row retrieval), retrieving all players for a team (multi-row retrieval), and fetching teams for a sport, including all players (complex relational query). The results were as follows:

| Technology | Player by ID | Players for Team | Teams for Sport |
|------------|--------------|------------------|-----------------|
| ADO.NET | 0.013 ms | 1.03 ms | 8.84 ms |
| EF | 0.77 ms | 3.57 ms | 113.45 ms |

Key observations: ADO.NET demonstrated the fastest performance in simple queries, such as fetching a player by ID, with an average execution time of 0.013 ms.
Entity Framework was significantly slower across all query types, particularly in complex queries like “Teams for Sport,” averaging 113.45 ms. Note: These benchmarks are based on specific test conditions and may vary depending on the application’s context and environment. How dotConnect supports ADO.NET and Entity Framework Core dotConnect is a database connectivity solution built on ADO.NET, designed to provide smooth integration with various relational databases. It supports both traditional ADO.NET operations and modern ORMs like Entity Framework Core, allowing developers to balance performance, maintainability, and cross-database compatibility. Key aspects of dotConnect’s integration: Built on ADO.NET Architecture – dotConnect uses standard ADO.NET classes while extending functionality with additional features like advanced connection pooling and performance optimizations. Supports Entity Framework Core – Works as a database provider for EF Core, ensuring compatibility with LINQ-to-Entities, change tracking, and transactions while maintaining database flexibility. Multi-Database Compatibility – Provides support for multiple database systems, including SQL Server, PostgreSQL, MySQL, Oracle, and SQLite, offering a unified approach to data access. Entity Developer (ED) for ORM Modeling – Includes Entity Developer, a powerful tool for modeling and managing EF Core and ADO.NET entity relationships, reducing manual configuration. By utilizing dotConnect, developers gain a flexible, high-performance data access solution that integrates smoothly with ADO.NET and EF Core while providing additional capabilities for efficient database operations. [Try dotConnect](https://www.devart.com/dotconnect/mysql/download.html) today and experience seamless database connectivity tailored to your development needs! Conclusion So, which is better ADO.NET or Entity Framework Core? 
If we want more control over SQL commands and operations through raw SQL queries, ADO.NET is a great choice. If we want to develop the application faster, with clear code maintainability, Entity Framework Core is the better choice. We can also combine both approaches in a single application, using Entity Framework Core for CRUD operations and ADO.NET for reporting queries and bulk data operations. FAQ Should ADO.NET or Entity Framework Core be used? The choice depends on the application's priorities. ADO.NET is the go-to solution for high-performance scenarios, offering direct SQL access and full control over database operations. In contrast, EF Core simplifies development and improves maintainability through its ORM abstraction. A hybrid approach often works best: EF Core for standard CRUD operations, ADO.NET for bulk inserts, complex queries, and transaction-heavy tasks. Is ADO.NET outdated? ADO.NET remains widely used in enterprise applications where performance and control are critical. Microsoft continues to support it, and it serves as the foundation for tools like Dapper, dotConnect, and LinqConnect. Does Entity Framework Core introduce performance overhead? Yes, EF Core adds overhead due to query translation and change tracking. However, optimizations such as compiled queries, no-tracking queries (AsNoTracking()), and proper indexing can reduce the impact. For most applications, the trade-off between ease of development and performance is acceptable. Can ADO.NET and Entity Framework Core be used together? Yes, and many applications do. EF Core internally uses ADO.NET, and both can be combined: EF Core for ORM-based data access, ADO.NET for raw SQL performance. Many teams also integrate Dapper for optimized query execution. How does dotConnect enhance database performance?
dotConnect optimizes ADO.NET and EF Core by improving connection pooling, query execution, and ORM capabilities. It adds: Faster database access through connection pooling. Optimized query execution for high-performance applications. Extended ORM support for EF Core, NHibernate, and LinqConnect. Multi-database compatibility with SQL Server, PostgreSQL, MySQL, and Oracle. Tags [ADO.NET](https://blog.devart.com/tag/ado-net) [ef core](https://blog.devart.com/tag/ef-core) [entity framework](https://blog.devart.com/tag/entity-framework) [orm](https://blog.devart.com/tag/orm) By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam)

Adopt Flawless Continuous Integration into Bamboo with a New dbForge DevOps Plugin By [dbForge Team](https://blog.devart.com/author/dbforge) March 30, 2020 We are glad to inform our SQL Server users that we have just extended the compatibility of [dbForge DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) with the just-released dbForge DevOps Automation Bamboo Plugin for SQL Server. Bamboo Integration The new plugin brings all the power of dbForge DevOps Automation for SQL Server, powered by [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) , right into Bamboo and helps to tune and implement every step of the CI process in a simple and effective way. Along with Bamboo, dbForge DevOps Automation for SQL Server also integrates with Jenkins, TeamCity, and Azure DevOps. Additionally, watch these videos to discover how dbForge products can boost database development.
[How dbForge SQL Complete is involved in the Database DevOps process](https://youtu.be/RNgxe_8InU0) [How to import data to SQL Server database with dbForge Data Pump during the DevOps process](https://youtu.be/R7nq351mlHo) [Creating database documentation during the Continuous Integration workflow](https://youtu.be/S4W0ybixQII) [How to automate database schema changes for the CI process during database deployment](https://youtu.be/hllTzoXvoO8) [dbForge Source Control in the DevOps pipeline](https://youtu.be/reU4ALv2ctg) [Test data generation in the Continuous Integration and Deployment processes](https://youtu.be/G3GNo0i03bk) [Unit Testing for SQL Server Database in DevOps process](https://youtu.be/3A5JEs3Nz0I) Availability [Download](https://www.devart.com/dbforge/sql/database-devops/static/dbforge-devops-automation-for-sqlserver-1.0.29.obr) dbForge DevOps Automation Bamboo Plugin for SQL Server here and [share your thoughts](https://www.devart.com/dbforge/sql/database-devops/feedback.html) about the product. Your feedback helps us to improve the tool according to your needs. dbForge DevOps Automation for SQL Server is a free product that is supplied exclusively as part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) . 
Tags [dbForge DevOps Automation](https://blog.devart.com/tag/dbforge-devops-automation) [devops](https://blog.devart.com/tag/devops) By [dbForge Team](https://blog.devart.com/author/dbforge)
Adventures of CLR Types in .NET Framework By [dbForge Team](https://blog.devart.com/author/dbforge) November 28, 2014 Summary: This article describes an issue that occurred when executing a query that returns a polygon instance of the geography type, and a possible solution. Once I stumbled upon the following query:

DECLARE @g1 GEOGRAPHY;
SET @g1 = geography::Parse('POLYGON ((0 0, 1 0, -1 1, 2 2, 0 0))');
SELECT @g1
go

The query seemed perfectly valid, and SSMS executed it smoothly, but [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) failed to execute it. In SQL Server Management Studio 2014, the query returns the following binary serialized instance of the Geography data type: 0xE6100000020005000000000000000000000000000000000000000000000000000000000000000000 F03F000000000000F03F000000000000F0BF0000000000000040000000000000004000000000000000 00000000000000000001000000010000000001000000FFFFFFFF0000000003 During deserialization of this value, dbForge Studio threw an exception, and I set out to get to the bottom of the issue. Making no question of the value's validity, I decided to write a test .NET application to localize the problem. I was not particularly interested in the polygon itself: the query executed successfully and returned a result, so in theory a simple application should have no issues with it. In practice, however, everything turned out quite differently.
The simple code fragment

SqlConnection connection = new SqlConnection(@"");
connection.Open();
new SqlCommand(@"DECLARE @g1 GEOGRAPHY;
SET @g1 = geography::Parse('POLYGON ((0 0, 1 0, -1 1, 2 2, 0 0))');
SELECT @g1", connection).ExecuteScalar();

that was supposed to create the instance returned by the server threw the same exception I got in dbForge:

System.FormatException occurred
HResult=-2146233033
Message=One of the identified items was in an invalid format.
Source=Microsoft.SqlServer.Types
StackTrace: at Microsoft.SqlServer.Types.GeoData.Read(BinaryReader r)

The situation seemed extremely strange to me. Why is the CLR unable to deserialize a standard instance that was successfully created and serialized on the server? Well, let's take a closer look at this polygon. After the query execution, SSMS showed the binary value; instantiating the type is not required in this case. Let's see what happens if SSMS does instantiate the type. For this, open the Spatial Results window to represent the polygon visually: SSMS will have to load the assembly and deserialize the object. Here is what I ended up with: So, what does it all mean? First of all, it confirms that the polygon is really odd. But! SQL Server and its Management Studio managed to create, serialize, and deserialize the object perfectly. So why does my application fail? The code seems to be exactly the same (parsing and serialization are executed by the CLR type located, in this case, in the Microsoft.SqlServer.Types assembly), and it runs in the same runtime, Common Language Runtime 4.0. But what if the code is different after all? Let's check it out. Assembly loaded by the application Assembly loaded by SSMS It appears that we did load different assemblies! Despite the fact that the metadata features the fully qualified type name with the name and version of the assembly (and it is located in my GAC), for some reason the CLR loaded a different assembly.
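One quick way to see which assembly the runtime actually resolved for a type is to inspect the type's Assembly property. A generic sketch follows (System.String stands in so the snippet runs anywhere; in the real investigation you would substitute a type from Microsoft.SqlServer.Types, e.g. SqlGeography):

```csharp
using System;
using System.Reflection;

class AssemblyCheck
{
    static void Main()
    {
        // Substitute typeof(SqlGeography) when diagnosing the real case;
        // typeof(string) keeps this sketch self-contained.
        Assembly asm = typeof(string).Assembly;

        // FullName includes the exact version the CLR resolved --
        // precisely what differed between SSMS and the test application.
        Console.WriteLine(asm.FullName);
        Console.WriteLine(asm.Location);
    }
}
```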
At this point, I became more interested than ever. Spending some time debugging the .NET Framework code (thanks to Reference Source, it's not hard at all), I stumbled upon the following lines:

System.Data.dll!System.Data.SqlClient.SqlConnection.ResolveTypeAssembly(System.Reflection.AssemblyName asmRef, bool throwOnError)
asmRef.Version = TypeSystemAssemblyVersion;

The assembly version that should be loaded is replaced with the one the Framework considers more correct. The most interesting thing is that, although this value can be managed with a connection string parameter, it accepts only two preset values, 10 and 11. If we need any other version, or for whatever reason want to determine the proper version dynamically, we will hardly find a solution to the problem. The only thing left is a binding redirect. By the way, let's take a look at SSMS.exe.config: So here comes the answer. SSMS redirects all requests for the types assembly to the version that corresponds to SSMS itself, neglecting the version that corresponds to the actual data from the server. I personally consider such behavior odd, but Microsoft knows better :) Anyway, I added these lines to dbForgeSql.exe's configuration, and received: In this case, as in any other specific case, the hack can be considered a workaround. But the question of what to do in the general case, when the proper version becomes known only at execution time, remains rhetorical.
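The redirect found in SSMS.exe.config follows the standard .NET assembly-binding syntax. A reconstructed sketch is shown below (the version range, redirect target, and public key token are illustrative, not copied from an actual SSMS install):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Redirect every requested version of the types assembly
             to the one version this executable ships with. -->
        <assemblyIdentity name="Microsoft.SqlServer.Types"
                          publicKeyToken="89845dcd8080cc91"
                          culture="neutral" />
        <bindingRedirect oldVersion="10.0.0.0-11.0.0.0"
                         newVersion="11.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

This is exactly the kind of fragment the author added to dbForgeSql.exe's config: it forces a single assembly version regardless of what the serialized data on the server was produced with.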
And the last thing, concerning the polygon: it really is not valid, but the SQL Server 2014 documentation states that "… a Polygon instance needs to only contain a ring with any four points to be accepted," whereas the SQL Server 2008 R2 documentation states that "… A Polygon instance of a geography type is accepted only if the instance is valid." Tags [performance](https://blog.devart.com/tag/performance) [SQL Server](https://blog.devart.com/tag/sql-server) [ssms](https://blog.devart.com/tag/ssms) [studio for sql server](https://blog.devart.com/tag/studio-for-sql-server) By [dbForge Team](https://blog.devart.com/author/dbforge)
"https://blog.devart.com/agile-code-review-process-with-review-assistant.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) Agile Code Review Process with Review Assistant By [ALM Team](https://blog.devart.com/author/alm) October 9, 2013 Summary: This article describes a scenario of [Review Assistant](https://www.devart.com/review-assistant/) usage in the agile development process. The peculiarity of this scenario is that every team member is allowed to join a code review. Some time ago we received the following question through our technical support: Hi guys! Our company is currently evaluating Review Assistant, we are using ‘Simple review workflow’ in our project. The issue is there is no way to create review without assigning a reviewer to it. This doesn’t allow us to use agile process, when anyone who is available can join a review. Is there any way to create review in current version? Problem description That company uses Review Assistant in an environment with: Agile development process Highly disciplined team No review moderator (they use simple review workflow) But the team wants to go further. They want: No reviewers assigned when publishing a code review. Every team member able to join a review. That’s why they wrote to the customer support service. Solution Review Assistant has a built-in feature for publishing a code review without assigned reviewers, which is particularly useful for agile teams. It allows any team member to join the review and start reviewing code. ‘No reviewer’ workflow Before creating a new code review, check whether email notifications are properly set up for all project members so that they are notified about a new review without reviewers. 
1. Open the Review Assistant options. 2. Go to the Users tab. 3. Click the name of each project member and check whether they have email notifications set to All events or Only when I’m involved in the New review field. Note: This option is available only to a user with administrative privileges or the owner of the project. However, project members can set email notifications by themselves by going to the My account tab. To enable the ‘review with no reviewers’ option: 1. Open the Review Assistant options. 2. Navigate to the Projects tab and select Allow creating reviews without pre-defined reviewers. Note: This option is available only to a user with administrative privileges or the owner of the project. 3. After enabling the option, you can create a [regular review](https://docs.devart.com/review-assistant/creating-review/creating-regular-review.html) without adding reviewers. We will use a simple review workflow to demonstrate how this all works. 1. Go to the Projects tab and check the Use simple review workflow and Allow creating reviews without pre-defined reviewers options. 2. Check whether the email notifications are configured as mentioned above. 3. Create a regular review without adding a reviewer. 4. With email notifications properly set up, the whole team receives a notification about the new review. 5. Project members can open the link from the email and join the review as reviewers by clicking the corresponding link. As you can see, project member Mathew Green is added to the code review as a reviewer. Conclusion We’ve shown you an example of how you can use Review Assistant in the agile code review process. [Start using](https://www.devart.com/review-assistant/download.html) our peer code review tool for free today. 
Tags [code review](https://blog.devart.com/tag/code-review) [review assistant](https://blog.devart.com/tag/review-assistant) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [ALM Team](https://blog.devart.com/author/alm) "} {"url": "https://blog.devart.com/alias-for-columns-in-sql-query.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) SQL Aliases: Improving Query Efficiency and Clarity By [Elena 
Zemliakova](https://blog.devart.com/author/helena-alexander) August 1, 2023 SQL column aliases are the secret to clearer code and more user-friendly query results. Assigning custom names to database objects in SQL scripts improves readability and yields valuable insights into data. In this article, we’ll explore the concept of SQL aliases, their practical applications, and the benefits they bring to database management and reporting, and learn how to use them effectively. Whether you’re a seasoned SQL expert or just beginning your journey, understanding SQL aliases will prove an invaluable skill in optimizing your database interactions. Let’s dive in. Contents Understanding SQL Aliases Definition of SQL aliases Basic syntax The importance and uses of SQL aliases Cases where SQL aliases are necessary Alias management with dbForge Studio for SQL Server Conclusion Understanding SQL Aliases Understanding and mastering aliases in SQL is of paramount importance, as they not only improve the readability of queries but also facilitate code analysis and understanding. Properly used aliases allow database professionals to enhance the overall query experience, streamline database interactions, and gain a comprehensive grasp of the data being retrieved. Column aliases are used to create more meaningful and descriptive names for columns in the result set. By providing custom names, you can present data in a more user-friendly format, making it easier to understand the meaning of each column. Table aliases , on the other hand, are employed when a query involves multiple tables or when the same table needs to be referenced more than once. 
Using table aliases not only reduces typing effort but also helps to distinguish between different instances of the same table, avoiding naming conflicts and making the query more concise and readable. Definition of SQL aliases In SQL, an alias is a user-defined alternative name assigned to a table, column, or any other database object in a SQL script or query result. It provides a way to rename the tables or columns temporarily, making the output more readable and intuitive. Aliases play a crucial role in simplifying complex queries, especially when dealing with database objects that have long or convoluted names. Basic syntax Column alias: To create a column alias, use the AS keyword followed by the desired name. Alternatively, you can omit the AS keyword and simply provide the alias name after the column name. Syntax with the AS keyword: SELECT column_name AS alias_name\nFROM table_name; Syntax without the AS keyword: SELECT column_name alias_name\nFROM table_name; Table alias: To create a table alias, specify an alternative name after the table name, followed by the AS keyword (optional). SELECT column_name\nFROM table_name [AS] alias_name; It’s important to note that while the AS keyword is optional for column aliases, it is typically considered good practice to use it for consistency and readability. String literals as column aliases In SQL, a string literal is a sequence of characters enclosed in single quotes (‘ ‘) or double quotes (” “). It represents a constant value that can be used in SQL statements, such as SELECT, WHERE, and other clauses. A string literal can be used for various purposes in SQL queries, and one of its common uses is as column aliases. To use a string literal as a column alias in a SQL query, you can use the syntax below. However, it is considered [deprecated](https://learn.microsoft.com/en-us/previous-versions/sql/sql-server-2012/bb510662(v=sql.110)?redirectedfrom=MSDN) and not recommended for modern SQL practices. 
SELECT 'alias_name' = column_name FROM table_name; The importance and uses of SQL aliases SQL aliases hold significant importance and offer several practical uses in database querying and reporting. Here are some key points highlighting their relevance: Improved readability: By utilizing aliases, developers have the ability to assign more descriptive and intuitive names to database objects within scripts and query results. This greatly improves the readability of the code, making it easier for users to comprehend it without referring to lengthy or cryptic original names. Aggregated functions: Aliases are often used in conjunction with aggregated functions like SUM, COUNT, AVG, etc. The custom names provided by aliases give context to the computed results, making it clear what each value represents. Self Joins and multiple table references: When a query involves multiple instances of the same table or Self Joins, table aliases become essential to differentiate between the various references, ensuring accurate data retrieval. Simplification of complex queries: In complex queries with multiple tables and joins, aliases streamline the process by reducing typing effort and enhancing query clarity. They make the SQL statement more concise and manageable. Customized output: By assigning aliases to columns, you can create a tailored output with meaningful headers. This customization improves the user experience when viewing query results or generating reports. Adaptable reports: Aliases allow for dynamic and customizable report generation. With aliases, you can standardize column names across different queries or modify them as needed, making reports consistent and adaptable. Renaming calculated columns: When performing calculations or data transformations, column aliases enable you to rename the resulting columns with names that reflect their purpose, fostering better documentation and understanding. 
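The renaming behavior described above is easy to verify outside SQL Server as well. Here is a minimal, hypothetical sketch in Python with SQLite (the Employees table and its data are invented for illustration): the AS aliases become the column headers of the result set instead of the original column names.

```python
import sqlite3

# Invented sample table for demonstrating column and table aliases.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Employees (employee_id INTEGER, name TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO Employees VALUES (?, ?, ?)",
    [(1, "Ann", 50000.0), (2, "Bob", 62000.0)],
)

# Column aliases rename the result-set headers; the table alias 'e'
# shortens repeated references to Employees.
cur = conn.execute(
    """
    SELECT e.name AS employee_name,
           e.salary AS annual_salary
    FROM Employees AS e
    ORDER BY e.employee_id
    """
)
headers = [col[0] for col in cur.description]
rows = cur.fetchall()
print(headers)  # ['employee_name', 'annual_salary'] -- the aliases, not name/salary
print(rows)     # [('Ann', 50000.0), ('Bob', 62000.0)]
```

The same queries run unchanged against SQL Server; only the connection setup differs.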
Cases where SQL aliases are necessary Let us ask ChatGPT – an AI language model – whether there are situations where using aliases in SQL becomes obligatory. Let’s delve deeper into the scenarios mentioned by ChatGPT for a more comprehensive understanding. Using aliases in Self Joins Suppose you have a table called Employees that contains information about employees and their managers. The table has two columns: employee_id and manager_id , where manager_id refers to the employee_id of the manager. Here’s how you can use aliases in a self-join to retrieve the names of employees along with their respective manager names: SELECT emp.name AS employee_name, mgr.name AS manager_name\nFROM Employees AS emp\nJOIN Employees AS mgr ON emp.manager_id = mgr.employee_id; In this example, we are using aliases emp and mgr for the Employees table to distinguish between the employee and manager instances. The first alias, emp , is for the employees’ records, and the second alias, mgr , is for the corresponding manager records. The Self Join is established by joining the Employees table with itself using different aliases for each instance. By doing so, we can compare the manager_id from the emp alias with the employee_id from the mgr alias to get the manager’s name for each employee. Using aliases in this Self Join allows us to differentiate between the two occurrences of the Employees table and retrieve the desired information correctly. Using aliases to avoid column name ambiguity when a query involves multiple tables Using aliases to avoid column name ambiguity is a common and beneficial practice in SQL, especially when a query involves multiple tables. When querying data from multiple tables, it’s possible that some tables might have columns with the same name. In such cases, using aliases becomes necessary to differentiate between these identically named columns and ensure the query works correctly. 
Here’s an example scenario to illustrate the use of aliases for column name disambiguation: Consider two tables: Customers and Orders . Both tables have a column named id , but they represent different entities. To retrieve customer information along with their order details, you could use aliases to distinguish between the two id columns: SELECT c.id AS customer_id, c.name AS customer_name, o.id AS order_id, o.order_date\nFROM Customers AS c\nJOIN Orders AS o ON c.id = o.customer_id; In this query, aliases c and o are used for the Customers and Orders tables, respectively. The SELECT clause uses these aliases to rename the id columns to customer_id and order_id , respectively. By doing so, we eliminate any ambiguity and ensure that the results returned are accurate. Using aliases not only prevents naming conflicts but also makes the SQL code more readable and maintainable. Using aliases in correlated subqueries Using aliases in correlated subqueries is both necessary and powerful. Correlated subqueries are subqueries that reference values from the outer query, and aliases play a crucial role in distinguishing the tables involved in the subquery from those in the main query. Let’s look at an example to illustrate the use of aliases in correlated subqueries: Suppose we have two tables: Employees and Salaries . The Salaries table contains employee salaries, and the Employees table contains information about the employees, including their ID, name, and department. Now, let’s say we want to find employees whose salary is above the average salary in their department. We can achieve this using a correlated subquery with aliases: SELECT emp.ID, emp.Name, emp.Department, emp.Salary\nFROM Employees AS emp\nWHERE emp.Salary > (SELECT AVG(Salary) FROM Salaries AS s WHERE s.Department = emp.Department); In this query, we use the alias emp for the Employees table and s for the Salaries table within the correlated subquery. 
The emp.Department in the subquery refers to the department of the employee from the outer query, creating a connection between the subquery and the main query. Without aliases, the correlated subquery would not work correctly, as it wouldn’t know which table to reference when comparing the department values. Using aliases in correlated subqueries allows us to establish this link between the outer and inner queries, making it possible to perform data retrieval based on conditions from the outer query. This technique is powerful for advanced data analysis and filtering in SQL. Wondering how to optimize queries to increase execution speed? Check these [SQL query plans performance tuning](https://www.devart.com/dbforge/sql/studio/sql-server-query-optimization.html) tips. Alias management with dbForge Studio for SQL Server [dbForge Studio for SQL Server](https://www.devart.com/dbforge/oracle/studio/) is an all-in-one IDE crafted to simplify SQL Server development, management, and reporting. It comes both as a standalone application and as part of the dbForge Edge bundle, which offers a comprehensive set of powerful solutions for popular relational DBMSs, including MySQL, MariaDB, SQL Server, Oracle, and PostgreSQL. dbForge Studio for SQL Server is equipped with advanced capabilities for easy and convenient work with SQL aliases. [SQL aliases in dbForge Studio for SQL Server](https://docs.devart.com/studio-for-sql-server/writing-and-executing-sql-statements/work-with-aliases.html) can be generated automatically. The tool takes care of assigning aliases for tables, views, table-valued functions, and synonyms referenced in SQL statements. However, users have the flexibility to customize the alias generation behavior according to their preferences. Here’s how to do it: In the Tools menu, select Options. In the Options dialog, go to Text Editor > Code Completion > Aliases. 
Automatic alias assignment When the Generate alias on commit checkbox is selected, the Studio operates as follows: Adds aliases to tables, views, table-valued functions, and synonyms: As the user selects a database object from the suggestion list, dbForge Studio automatically assigns an alias to it. This eliminates the need for manual alias assignment, saving time and reducing the chances of errors. Adds table alias to column names: When the user chooses to select all columns using the asterisk (*) [wildcard in SQL](https://www.devart.com/dbforge/sql/sqlcomplete/sql-wildcard-search.html) or individually picks specific columns from the suggestion list, dbForge Studio goes the extra mile by adding a table alias to each column name. This enhances query clarity and avoids potential ambiguity in the output. Customization of alias names dbForge Studio for SQL Server allows users to personalize the alternative names assigned to auto-generated aliases. With this feature, users can tailor the alias names according to their coding preferences and readability standards, enhancing the clarity and organization of SQL scripts. To customize the names of auto-generated aliases: In the Tools menu, select Options . In the Options dialog, navigate to Text Editor > Code Completion > Alias . On the Alias page in the Condition column, specify the database object name (you may use a mask) to which you want to assign an alias. In the Action column, specify the custom alias you prefer. Click OK . Customization of alias names is beneficial as it allows developers to personalize the naming of database objects in SQL queries, aligning them with their coding preferences and readability standards. Alias refactoring in dbForge Studio for SQL Server dbForge Studio offers a convenient alias refactoring feature that allows you to rename aliases in queries with ease. 
Follow these steps to rename an alias: Right-click the desired alias in the SQL Editor window, then select the Rename command from the shortcut menu. Alternatively, you can select the alias and press the F2 key, which will highlight the alias for editing. Enter the new name for the alias directly in the SQL Editor window. As you type, a helpful tooltip will appear, guiding you to press F2 to preview the changes or use Enter/Tab to apply them. To preview the code changes before applying them, press F2 to open the Preview Changes – Rename dialog. Review the changes, and when satisfied, click Apply to implement the alias rename. Alternatively, if you’re confident in the changes you’ve made, you can immediately apply them in the code by pressing Enter or Tab . This alias refactoring feature streamlines the process of renaming aliases in your queries, enhancing code maintainability and reducing the chances of errors. Conclusion Aliases play a crucial role in enhancing the readability, maintainability, and efficiency of SQL scripts. By providing a concise and intuitive way to reference tables, views, and columns, aliases simplify complex queries and make them easier to comprehend. dbForge Studio for SQL Server is an outstanding tool for effectively managing aliases in SQL scripts, even those of considerable size. Its extensive range of features, such as automatic alias assignment, customizable alias generation, and alias refactoring capabilities, simplifies the handling of aliases. Take advantage of the Studio’s comprehensive features and seamless interface to boost productivity and efficiency. [Download a 30-day free trial of dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) now and unleash the full potential of alias management in your database development. 
Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [query optimization](https://blog.devart.com/tag/query-optimization) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tools](https://blog.devart.com/tag/sql-server-tools) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company’s blog and website. "} {"url": 
"https://blog.devart.com/alternatives-to-limit-clause-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Using Alternatives to the LIMIT Clause in SQL Server By [Oleksandra Pokhvalna](https://blog.devart.com/author/olivia-brooks) November 1, 2024 [0](https://blog.devart.com/alternatives-to-limit-clause-in-sql-server.html#respond) 873 In database management, efficient limiting of query results is crucial for optimizing performance and ensuring the retrieval of relevant data. Many SQL database systems, such as MySQL and PostgreSQL, utilize the LIMIT clause to specify the number of records a query returns. However, SQL Server doesn’t support the LIMIT clause, opting instead for alternatives like TOP , OFFSET-FETCH , and SET ROWCOUNT . This design choice reflects SQL Server’s focus on flexibility and performance, offering various methods to achieve similar functionality while catering to different use cases and scenarios. Let’s take a closer look at the LIMIT alternatives in SQL Server, highlighting their unique features and limitations. Contents Using the SELECT TOP clause Implementing pagination with OFFSET-FETCH Using the SET ROWCOUNT command Try it yourself with dbForge Studio Comparing the alternatives Further learning Using the SELECT TOP clause In SQL Server, the SELECT TOP clause acts as an alternative to the LIMIT clause. Likewise, it’s used to limit the number of rows returned by a query. It’s especially useful when you’re dealing with large datasets and want to retrieve only a subset of records. The basic syntax is: SELECT TOP (number | percent) column_names\nFROM table_name; Here, number stands for the exact number of rows to return, and percent is the percentage of rows to return from the total result set. Use one of these arguments depending on your needs. 
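To see the row-limiting effect of this clause in a runnable form, here is a hypothetical sketch in Python with SQLite. Note the hedge: SQLite has no TOP clause, so LIMIT stands in for the role SELECT TOP plays in SQL Server, and the Employee table and its rows are invented for illustration.

```python
import sqlite3

# Invented ten-row table standing in for a larger dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Employee (id INTEGER, hire_date TEXT)")
conn.executemany(
    "INSERT INTO Employee VALUES (?, ?)",
    [(i, f"2020-01-{i:02d}") for i in range(1, 11)],
)

# Analogous in spirit to: SELECT TOP 5 * FROM Employee ORDER BY hire_date;
rows = conn.execute(
    "SELECT * FROM Employee ORDER BY hire_date LIMIT 5"
).fetchall()
print(len(rows))   # 5 -- only the first five employees by hire date
print(rows[0])     # (1, '2020-01-01')
```

The percentage form of TOP has no LIMIT equivalent; in engines without TOP you would compute the row count from COUNT(*) first.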
You can refine the results further by adding other clauses—like [WHERE](https://www.devart.com/dbforge/sql/studio/sql-where-clause-for-beginners.html) or ORDER BY . If you’d like to know more about clauses used with SELECT , check out the post about the [SQL SELECT statement](https://blog.devart.com/sql-select-statement.html) . For example, the following query returns the first five employees ordered by the date they were hired (here and below, we’ll be using the AdventureWorks2022 sample database in our examples): USE AdventureWorks2022;\n\nSELECT TOP 5 *\nFROM HumanResources.Employee\nORDER BY HireDate; Or, this query retrieves national IDs and job titles of the top 10% of employees who have more than 20 hours of vacation: USE AdventureWorks2022;\n\nSELECT TOP 10 PERCENT NationalIDNumber, JobTitle\nFROM HumanResources.Employee\nWHERE VacationHours > 20; Using the SELECT TOP clause brings a number of advantages. First, performance optimization—it limits the size of the result set, reducing memory and processing load when only a portion of the data is needed. Next, it can be used to create efficient pagination for large result sets by retrieving only the rows required for the current page. Also, it’s helpful when testing queries on large tables by limiting the number of returned rows. Note that SELECT TOP doesn’t provide random rows. To achieve randomness, you can combine it with ORDER BY NEWID() , but this can be inefficient for large datasets. On the other hand, without specifying an ORDER BY clause, the results can be unpredictable since SQL Server doesn’t guarantee the order of returned rows. Implementing pagination with OFFSET-FETCH Speaking of pagination, another clause— OFFSET-FETCH —can be used in SQL Server to implement pagination, allowing you to retrieve a specific subset of records by skipping a number of rows and then fetching a defined number of rows. 
This clause has the following syntax: SELECT column_names\nFROM table_name\nORDER BY column_name\nOFFSET number_of_rows_to_skip ROWS\nFETCH NEXT number_of_rows_to_return ROWS ONLY; The OFFSET clause lets you specify how many rows need to be skipped before returning rows, and FETCH NEXT defines how many rows to return after the skipped ones. To illustrate, imagine you need to skip the first ten records sorted by BusinessEntityID and return the next ten, effectively fetching page 2 in a paginated result. Your query will look like this: USE AdventureWorks2022;\n\nSELECT *\nFROM HumanResources.Employee\nORDER BY BusinessEntityID\nOFFSET 10 ROWS\nFETCH NEXT 10 ROWS ONLY; You can also use OFFSET-FETCH with a dynamic page size and page number. For example, this dynamic query fetches records for page 3, assuming a page size of ten rows per page: USE AdventureWorks2022;\n\nDECLARE @PageSize INT = 10;\nDECLARE @PageNumber INT = 3;\n\nSELECT BusinessEntityID, JobTitle, HireDate\nFROM HumanResources.Employee\nORDER BY BusinessEntityID\nOFFSET (@PageSize * (@PageNumber - 1)) ROWS\nFETCH NEXT @PageSize ROWS ONLY; This combination of clauses is great because it gives you precise control over paging—you get a clean and efficient way to handle pagination, especially for web applications. Moreover, OFFSET-FETCH adheres to the SQL standard, making it portable and easy to understand for developers coming from other RDBMS. On top of that, unlike other methods (such as using ROW_NUMBER() with subqueries), OFFSET-FETCH directly skips and fetches rows without the need for complex workarounds. It’s worth noting, though, that—for large datasets—the further you go in the pagination (say, page 1000), the slower the query might become because SQL Server has to skip more rows. 
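The page arithmetic in the dynamic example above (rows skipped = page size × (page number − 1)) is engine-independent. Below is a minimal, hypothetical sketch in Python with SQLite, whose LIMIT ... OFFSET corresponds to SQL Server's OFFSET ... FETCH NEXT; the fetch_page helper and the thirty-row Employee table are invented for illustration.

```python
import sqlite3

def fetch_page(conn, page_size, page_number):
    """Return one page of ids, skipping page_size * (page_number - 1) rows."""
    offset = page_size * (page_number - 1)
    return conn.execute(
        # SQLite's LIMIT ? OFFSET ? mirrors SQL Server's
        # OFFSET ? ROWS FETCH NEXT ? ROWS ONLY.
        "SELECT id FROM Employee ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Employee (id INTEGER)")
conn.executemany("INSERT INTO Employee VALUES (?)", [(i,) for i in range(1, 31)])

page3 = fetch_page(conn, 10, 3)
print(page3)  # page 3 of size 10: ids 21 through 30
```

As with OFFSET-FETCH, the ORDER BY is essential: without a deterministic sort, pages can overlap or skip rows between requests.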
Another point to consider is that OFFSET-FETCH doesn’t return the total number of rows, so if you need to display pagination metadata (like the total number of pages), an additional query— COUNT(*) —is required to fetch the total row count. And don’t forget that an ORDER BY clause is mandatory when using OFFSET-FETCH ; otherwise, the results are unpredictable. Using the SET ROWCOUNT command You can use the SET ROWCOUNT command in SQL Server to limit the number of rows returned by a SELECT statement or affected by UPDATE or [DELETE](https://blog.devart.com/sql-delete-statement-remove-one-or-multiple-rows-from-a-table.html) . The command’s syntax is as follows: SET ROWCOUNT { number | 0 } Instead of number you specify the number of rows to return or process, and 0 resets the row count. If you use SET ROWCOUNT and SELECT together with other commands such as ORDER BY and WHERE , their interaction can be quite powerful. In this combination, the WHERE clause filters the rows first, the ORDER BY clause sorts the filtered rows, and then SET ROWCOUNT limits the number of rows returned from the sorted result set. Here’s an example where we want to retrieve only the top five records of alphabetically sorted (by their job title) employees with over 50 hours of vacation and then reset the row count limit so that future queries return all matching rows: USE AdventureWorks2022;\n\nSET ROWCOUNT 5;\nSELECT *\nFROM HumanResources.Employee\nWHERE VacationHours > 50\nORDER BY JobTitle;\nSET ROWCOUNT 0; SET ROWCOUNT with data modification commands such as UPDATE or DELETE works similarly. Let’s consider an example of updating data. 
Running this script will change the job title to Chief Stocker for only the first of the employees with the Stocker job title, then reset the row count limit, and display the result: USE AdventureWorks2022;\n\nSET ROWCOUNT 1;\nUPDATE HumanResources.Employee\nSET JobTitle = 'Chief Stocker'\nWHERE JobTitle = 'Stocker';\nSET ROWCOUNT 0;\n\n-- To see the result of the update\nSELECT *\nFROM HumanResources.Employee\nWHERE JobTitle LIKE ('%Stocker%')\nORDER BY JobTitle; As you can see, SET ROWCOUNT provides a simple method for limiting results without complex syntax. Unlike TOP , which sometimes requires subqueries for more complex logic, SET ROWCOUNT can be used directly with a SELECT statement. However, the SET ROWCOUNT command is considered deprecated for limiting result sets in favor of the TOP clause, which offers clearer semantics. As a downside, the effect of SET ROWCOUNT is session-specific, meaning it must be set in each session where it’s needed. In addition, using SET ROWCOUNT in more complex queries can lead to unintended results if not carefully applied. Note that SET ROWCOUNT is less and less commonly used today. SQL standards have evolved, and modern SQL practices favor more explicit control over result sets and data manipulation. So Microsoft recommends using the TOP clause instead, as SET ROWCOUNT won’t affect DELETE and UPDATE statements in future releases following the SQL Server 2022 version. Try it yourself with dbForge Studio We’ll try one of the above use cases in dbForge Studio for SQL Server. Let’s first check how many employees in the HumanResources.Employee table of the AdventureWorks2022 database hold the Marketing Specialist position. We do this as follows: SELECT\n COUNT(*) AS Count\nFROM HumanResources.Employee\nWHERE JobTitle = 'Marketing Specialist'; As we can see, there are five of them. Now we want to find the three specialists with the largest number of unused vacation hours. 
We use the following query:

```sql
SET ROWCOUNT 3;
SELECT *
FROM HumanResources.Employee
WHERE JobTitle = 'Marketing Specialist'
ORDER BY VacationHours DESC;
SET ROWCOUNT 0;
```

And dbForge Studio returns the result (we've changed the order of the columns just for this demonstration).

**Why [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/)?**

We believe that dbForge Studio, with its enhanced user interface and advanced features, outshines SQL Server Management Studio (SSMS). While SSMS covers the basics, dbForge Studio offers sophisticated tools such as visual query building, database comparison and synchronization, data aggregation and analysis, automated unit testing, and integration with version control systems. This focus on user experience and powerful capabilities makes dbForge Studio a compelling [alternative to SSMS](https://www.devart.com/dbforge/sql/studio/alternative-to-ssms.html) for SQL Server management. To learn more, watch this video: [SSMS vs dbForge Studio for SQL Server – Features Comparison](https://www.youtube.com/watch?v=UiVxy83826Y).

Interested? You can try dbForge Studio for SQL Server by [downloading it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html). The [installation guide](https://docs.devart.com/studio-for-sql-server/getting-started/installing.html) will help you get it up and running.

**Comparing the alternatives**

To sum up, let's briefly compare the SQL Server alternatives to the LIMIT clause discussed above:

- SELECT TOP: Best for quick retrieval of a specified number of rows from a result set, especially when you need a fixed limit. Ideal for reports or dashboards where only the top N records are required.
- OFFSET-FETCH: Most appropriate for pagination in larger datasets. It lets you skip a specified number of rows and return a defined set, making it great for displaying results across multiple pages.
- SET ROWCOUNT: Useful for limiting rows in older versions of SQL Server, or for updating or deleting a specific number of records. However, its use is declining in favor of TOP and OFFSET-FETCH.

So, each method has its unique strengths depending on the use case.

**Further learning**

To learn more about dbForge Studio for SQL Server, what it can do, and how to use it, see the following resources:

- [SQL Server Tutorials](https://blog.devart.com/sql-server-tutorial)
- [dbForge Studio Documentation](https://docs.devart.com/studio-for-sql-server/)
- [dbForge Studio Video Tutorials](https://www.youtube.com/playlist?list=PLpO6-HKL9JxXSZgO3L0MxOTt3QxpFbJNt)
- [Devart Academy](https://www.devart.com/academy/sql-server-studio/)

By [Oleksandra Pokhvalna](https://blog.devart.com/author/olivia-brooks), Technical Writer at Devart.
How to Develop Android Database Applications in RAD Studio XE5
By [DAC Team](https://blog.devart.com/author/dac), October 8, 2013

Not so long ago, RAD Studio XE5 introduced support for Android
application development. A lot of information about Android application development in Delphi XE5 can be found on the Internet, but when it comes to developing database applications, things get much more difficult. The point is that the components and drivers included in the RAD Studio XE5 distribution (for example, the FireDAC and dbExpress drivers) have severe limitations: on the Android and iOS platforms, they support just two databases, SQLite and InterBase. So what do you do when you develop a serious business application that requires a remote connection to Oracle, MySQL, or PostgreSQL from Android? We have come across such questions on various forums dedicated to database application development for Android and decided to cover the topic in as much detail as possible.

**Components**

Let's start with a description of the components we will use:

- [UniDAC](https://www.devart.com/unidac/) – universal data access components that let you connect to SQLite, Oracle, MySQL, PostgreSQL, and InterBase from Android (you can also use UniDAC to connect from Windows to almost all known databases, but that is beyond the scope of this post).

If you want to work with one particular database rather than all of them, you can use more specific components:

- [LiteDAC](https://www.devart.com/litedac/) – SQLite Data Access Components
- [ODAC](https://www.devart.com/odac/) – Oracle Data Access Components
- [MyDAC](https://www.devart.com/mydac/) – MySQL Data Access Components
- [PgDAC](https://www.devart.com/pgdac/) – PostgreSQL Data Access Components
- [IBDAC](https://www.devart.com/ibdac/) – InterBase Data Access Components

All these components support Android as a target platform in RAD Studio XE5.

**Connection to database**

Working with databases on Android is in general no different from working with them on Windows, but there are some nuances when setting up the connection and deploying files to a mobile device if you work with a local DB.
We will consider how to establish a connection to each supported database.

**SQLite**

If you do not deploy the database file to a mobile device, you should set:

```pascal
ForceCreateDatabase := True;
```

In this case, a database file will be created automatically on the first application launch.

LiteDAC sample:

```pascal
var
  Connection: TLiteConnection;
begin
  Connection := TLiteConnection.Create(nil);
  try
    Connection.Options.ForceCreateDatabase := True;
    Connection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.sqlite3';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

UniDAC sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'SQLite';
    Connection.SpecificOptions.Values['ForceCreateDatabase'] := 'True';
    Connection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.sqlite3';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

**Oracle**

It is impossible to install an Oracle client on a mobile device, because an Oracle client for Android simply doesn't exist. Therefore, the only way to establish a connection to Oracle from a mobile device is to connect directly via TCP/IP. For this, the Direct option should be set to True:

```pascal
Direct := True;
```

In addition, the server name must be composed correctly: since there is no client, there is no tnsnames.ora file with the server list either. To establish a connection from Android, we need to know the server's Host and Port, as well as its SID or Service Name. To connect via the SID, the server should be set in the following way:

```pascal
Server := 'Host:Port:sid=SID';
```

or, in a simplified form:

```pascal
Server := 'Host:Port:SID';
```

To connect via the Service Name:

```pascal
Server := 'Host:Port:sn=SID';
```

In other words, the 'sid=' prefix of the third parameter indicates that the connection is established via the SID, and the 'sn=' prefix indicates that it is established via the Service Name.
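The 'Host:Port:prefix' format above is easy to get wrong by hand, so here is a small Python sketch that assembles it. The function name and the 1-argument defaults are our own illustration, not part of ODAC or UniDAC:

```python
def oracle_direct_server(host, port, name, by_service_name=False):
    """Build the Server value for a direct TCP/IP Oracle connection.

    The 'sid=' prefix selects connection by SID, 'sn=' by Service Name,
    matching the format described above. Hypothetical helper for
    illustration only."""
    prefix = "sn=" if by_service_name else "sid="
    return f"{host}:{port}:{prefix}{name}"

print(oracle_direct_server("server", 1521, "orcl"))
# -> server:1521:sid=orcl
print(oracle_direct_server("server", 1521, "orcl", by_service_name=True))
# -> server:1521:sn=orcl
```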
If no prefix is specified, the connection is established via the SID by default. The majority of Oracle servers have the same SID and Service Name, so most likely you won't have to go into these nuances; you can learn more about them in the Oracle documentation.

ODAC sample:

```pascal
var
  Session: TOraSession;
begin
  Session := TOraSession.Create(nil);
  try
    Session.Options.Direct := True;
    Session.Server := 'server:1521:orcl';
    Session.Username := 'user_name';
    Session.Password := 'password';
    Session.Connect;
  finally
    Session.Free;
  end;
end;
```

UniDAC sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'Oracle';
    Connection.SpecificOptions.Values['Direct'] := 'True';
    Connection.Server := 'server:1521:orcl';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

**MySQL**

MySQL client software for Android doesn't exist either, so the connection to a MySQL server is also established directly via TCP/IP.
For this, let's set the corresponding option:

```pascal
Direct := True;
```

MyDAC sample:

```pascal
var
  Connection: TMyConnection;
begin
  Connection := TMyConnection.Create(nil);
  try
    Connection.Options.Direct := True;
    Connection.Server := 'server';
    Connection.Port := 3306;
    Connection.Database := 'database_name';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

UniDAC sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'MySQL';
    Connection.SpecificOptions.Values['Direct'] := 'True';
    Connection.Server := 'server';
    Connection.Port := 3306;
    Connection.Database := 'database_name';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

**PostgreSQL**

With PostgreSQL, everything is simpler. Since PgDAC and UniDAC only allow establishing a direct connection via TCP/IP, it is enough to specify the Server and Port and call Connect.

PgDAC sample:

```pascal
var
  Connection: TPgConnection;
begin
  Connection := TPgConnection.Create(nil);
  try
    Connection.Server := 'server';
    Connection.Port := 5432;
    Connection.Database := 'database_name';
    Connection.Schema := 'schema_name';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

UniDAC sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'PostgreSQL';
    Connection.Server := 'server';
    Connection.Port := 5432;
    Connection.Database := 'database_name';
    Connection.SpecificOptions.Values['Schema'] := 'schema_name';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

**InterBase**

Using InterBase ToGo, you can connect to both local and remote databases.
To connect to a local database, just set the path to the local DB on the device:

```pascal
Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.gdb';
```

If you need to connect to a remote server, specify not only the database but also the server:

```pascal
UniConnection.Server := 'server';
UniConnection.Database := 'C:\db.gdb';
```

Please note that the IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) prefix should be specified when connecting to a local DB and is not needed for a connection to a remote database.

IBDAC local database sample:

```pascal
var
  Connection: TIBCConnection;
begin
  Connection := TIBCConnection.Create(nil);
  try
    Connection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.gdb';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

IBDAC remote database sample:

```pascal
var
  Connection: TIBCConnection;
begin
  Connection := TIBCConnection.Create(nil);
  try
    Connection.Server := 'server';
    Connection.Database := 'C:\db.gdb';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

UniDAC local database sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'InterBase';
    Connection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.gdb';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

UniDAC remote database sample:

```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'InterBase';
    Connection.Server := 'server';
    Connection.Database := 'C:\db.gdb';
    Connection.Username := 'user_name';
    Connection.Password := 'password';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```
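The local-versus-remote rule for the Database value above can be summed up in a few lines of Python. The documents path is a placeholder standing in for TPath.GetDocumentsPath, and the helper itself is a hypothetical illustration, not part of IBDAC or UniDAC:

```python
from pathlib import PurePosixPath

# Stand-in for IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) on the device
DOCUMENTS_PATH = PurePosixPath("/data/data/com.example.app/files")

def interbase_database(db_name, server=None):
    """Return (server, database) for an InterBase ToGo connection.

    A remote connection keeps the server-side path as given, while a
    local one is prefixed with the device documents folder, as
    described above. Hypothetical helper for illustration only."""
    if server is not None:
        return server, db_name                  # remote: no device prefix
    return None, str(DOCUMENTS_PATH / db_name)  # local: file on the device

print(interbase_database("db.gdb"))
# -> (None, '/data/data/com.example.app/files/db.gdb')
print(interbase_database("C:\\db.gdb", "server"))
# -> ('server', 'C:\\db.gdb')
```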
**Deployment to mobile device**

For our application to be able to work with local SQLite and InterBase ToGo databases, we must make sure these databases are deployed to the Android device. There is nothing difficult about this, since the deployment process is similar in Delphi XE4 and Delphi XE5. First, open the Project > Deployment menu. Then add the SQLite and InterBase database files to the list of files that must be deployed to the Android device together with your application. Please note that the deployment path is different for Android and iOS. If you want to deploy your application to both platforms, make sure the deployment paths are specified correctly for both of them.

NOTE: Don't forget to change the Remote Path default value `.` to one of the values in the table below.

| Deployment Path | Remote Path | Destination on Device |
| --- | --- | --- |
| TPath.GetDocumentsPath | .\assets\internal | /data/data/com.embarcadero.MyProjects/files |
| TPath.GetSharedDocumentsPath | .\assets | /mnt/sdcard/Android/data/com.embarcadero.MyProjects/files |

**Debug**

We won't focus on the debug process in this article, since it was described in detail in a previous article: [Remote Debug of Android Applications in RAD Studio XE5 via Wi-Fi](https://blog.devart.com/remote-debug-of-android-application-in-rad-studio-xe5-via-wifi.html).

**Conclusion**

As of the RAD Studio XE5 release, Devart Data Access Components allow you to connect to SQLite, Oracle, MySQL, PostgreSQL, and InterBase from both mobile platforms, Android and iOS. But we are not going to stop there. Many users ask us to support connections to SQL Server from Android and iOS, and we will make every effort to give them that capability.

**Update**

We are glad to inform you that we have added support for connections to SQL Server and SQL Azure from iOS and Android.
12 COMMENTS

Data access components from Devart – new capabilities for Android development | Delphi 2010 ru, October 27, 2013 at 2:07 am:
[…] a how-to post, "How to Develop Android Database Applications in RAD Studio XE5", was published on the company's blog,
which clearly shows how, with the help of […]

Prashant Bhosale, October 29, 2013 at 1:56 pm:
Nice article.

Marco Antonio Santin Torres, March 13, 2014 at 6:08 pm:
And you cannot connect to a SQL Server database?

DAC Team, May 15, 2014 at 11:53 am:
We are now working on this.

D.Wy, May 18, 2014 at 3:19 pm:
Interesting! How about Firebird?

DAC Team, May 19, 2014 at 11:24 am:
As soon as a Firebird client library for Android is released, we will immediately start implementing support for it.

Ramil Rustamov, September 3, 2014 at 11:29 pm:
Are you serious? Is it impossible to connect to MS SQL Server? The only reason I downloaded your components was to connect to MS SQL Server easily; otherwise your components are useless!

DAC Team, September 4, 2014 at 10:29 am:
Ramil, currently SDAC and UniDAC don't support connections to MS SQL Server. This information can be seen on the compatibility pages of these products: [https://www.devart.com/unidac/compatibility.html](https://www.devart.com/unidac/compatibility.html) and [https://www.devart.com/sdac/compatibility.html](https://www.devart.com/sdac/compatibility.html). We plan to add the capability to connect to MS SQL Server on mobile platforms by the end of the year. Follow the news on our website.

untung, December 4, 2014 at 8:24 am:
With DataSnap + UniDAC, you can communicate with MS SQL Server. UniDAC makes it simple.

mehmet akif kabakçı, April 24, 2016 at 9:00 pm:
Where should this code be pasted?
```pascal
var
  Connection: TUniConnection;
begin
  Connection := TUniConnection.Create(nil);
  try
    Connection.ProviderName := 'SQLite';
    Connection.SpecificOptions.Values['ForceCreateDatabase'] := 'True';
    Connection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.sqlite3';
    Connection.Connect;
  finally
    Connection.Free;
  end;
end;
```

And which units need to be added to the "uses" clause? I still get errors on my device when I try. Please send an example project. Thank you.

DAC Team, July 1, 2016 at 12:48 pm:
Please specify the errors you are getting on your device, or provide screenshots of them, so that we can help you resolve the issues.

Kelly Bamson, November 29, 2016 at 7:49 am:
Nice post. Thank you for sharing with us.

Comments are closed.

Announcing Devart's New dotConnect Providers for Zoho Desk & Zoho Books
By [Sofiia Fomitska](https://blog.devart.com/author/sophie-rawlings), May 31, 2024

We are extremely happy to herald the release of brand-new dotConnect data providers for [Zoho Books](https://www.devart.com/dotconnect/zohobooks/) and [Zoho Desk](https://www.devart.com/dotconnect/zohodesk/). In today's business world, efficiency and accuracy in managing finances and customer service matter more than ever. That is why we are extending our dotConnect product line to Zoho applications, providing our users with helpful tools to optimize their workflows and enhance productivity.
**What are the advantages of our dotConnect providers for Zoho Books & Zoho Desk?**

**Direct connection from dotConnect to Zoho apps**
Through our most recent add-ins, users can seamlessly link dotConnect directly to Zoho Books and Zoho Desk. With this integration in place, they can easily retrieve and analyze financial records, customer service inquiries, and other important data directly within the dotConnect interface.

**Seamless integration**
Devart's new dotConnect providers ensure easy data synchronization between Zoho Books or Zoho Desk and external databases, securing real-time access to critical information and optimized workflows.

**Enhanced performance**
With our dotConnect providers for Zoho Books and Zoho Desk, you will experience improved performance and reliability thanks to optimized data retrieval and improved connectivity.

**Advanced security**
The new add-ins protect sensitive data with advanced security features, including encryption protocols and authentication mechanisms, to shield against unauthorized access and data breaches.

**Customizable workflows**
dotConnect can be easily tailored to suit specific business requirements with customizable workflows, automation capabilities, and integration options.

**Robust functionality**
Finally, users can take advantage of a wide range of cutting-edge features, such as reporting tools, data analytics, and dashboarding capabilities, to gain valuable insights and facilitate informed decision-making.

**Conclusion**
We believe that the addition of [ADO.NET](https://www.devart.com/dotconnect/what-is-ado-net.html) providers for Zoho Books and Zoho Desk to our product line will not only optimize workflows but also elevate the overall user experience, enabling our customers to stay ahead in today's competitive market. Looking to enhance your data experience?
Download the brand-new dotConnect for [Zoho Books](https://www.devart.com/dotconnect/zohobooks/) and [Zoho Desk](https://www.devart.com/dotconnect/zohodesk/) and start your data journey right away!
Another Badge on the Wall: Why G2 Users Love dbForge SQL Complete
By [dbForge Team](https://blog.devart.com/author/dbforge), February 10, 2022

More good news coming our way! [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), your favorite add-in for SSMS and Visual Studio, packing a punch with its SQL code completion, formatting, and refactoring capabilities, has earned a new badge on G2, the world's largest tech marketplace that helps businesses discover and review products. We are happy to know that companies, individual SQL developers, and DBAs alike choose [our products](https://www.devart.com/dbforge/sql/); it is their high satisfaction rating and consistently positive feedback on G2 that helped SQL Complete earn the Users Love Us badge. So why do users love SQL Complete? Let's answer this question with a few quotes from the feedback on G2:

"SQL Complete is THE solution for autocompletion/IntelliSense for SQL.
I honestly could not do my job without this tool, and it has allowed me to code at speeds that let me solve complex problems in time-critical situations."

"The features of SQL Complete completely changed my workflow. The formatting, snippets, and color coding save me hours every week."

"Thoughtfully designed and easy to use, yet highly sophisticated in its capabilities."

"This is a must-have tool for SQL Server developers. It covers all the bases, from automatically suggesting objects to code suggestions, making it easier to code your linked servers. I particularly like the ability to see object info at a glance."

"It works with Azure SQL Databases and also works with the Azure Synapse Analytics SQL pool, which is really important for me."

"The tool is so well thought out and highly engineered, it's a pleasure to use, and it improves your productivity by a significant margin."

To read the full reviews, feel free to [visit the SQL Complete page on G2](https://www.g2.com/products/dbforge-sql-complete/reviews).

**A few words about dbForge SQL Complete**

For those who are not using SQL Complete yet: it's an advanced add-in for SSMS that gives a big boost to your SQL coding, helps you beautify and refactor your code, eliminates repetitive coding, and delivers productivity enhancements and tools for data analysis. We gladly invite you to try it. [Get your FREE 14-day trial of SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and see it in action!
Delphi XE2 FireMonkey Applications Deployment on Mac OS X
By [DAC Team](https://blog.devart.com/author/dac), February 2, 2012

RAD Studio XE2 allows creating applications for Mac OS X. The problem is that RAD Studio XE2 (both Delphi XE2 and C++Builder XE2) can be installed and run only under MS Windows, so applications for Mac OS X have to be developed under MS Windows. That is why many people may face problems deploying applications on Mac OS X. With this in mind, we decided to write this article to help our users deal with deployment and get their applications onto Mac OS X successfully. In this article, we describe two ways of deploying applications on Mac OS X and the peculiarities that must be taken into account when deploying applications that use Devart [Data Access Components](https://www.devart.com/dac.html) and Devart [dbExpress drivers](https://www.devart.com/dbx/).

**Using PAServer**

The easiest way to deploy applications to Mac OS X is to deploy them using PAServer. When PAServer is used to debug applications on a remote computer with Mac OS X, it copies executable Mac OS X application packages to the PAServer installation directory (by default, "/Users/$USER/Applications/Embarcadero/PAServer"). In the scratch-dir directory, PAServer creates directories whose names consist of the computer name and the remote profile name. For example, if the computer where RAD Studio XE2 runs is named "MyPC" and the remote profile name for the Mac OS X platform is "Mac OS X", PAServer will create the "PAServer_Installation_Directory/scratch-dir/MyPC-Mac OS X" directory.
In this directory, PAServer creates an application package named after the project plus the “.app” extension. For example, if your application is named “MyProject”, PAServer creates the “MyProject.app” application package. You can use PAServer both for debugging your application and for deploying it: choose the Release build configuration and run without debugging (the Run Without Debugging command in the Run menu). Before the final deployment, also clean the application package folder (for example, “MyProject.app”) of any old files left over from previous runs and deployments.

Note: You can choose which files to deploy in an application package by selecting “Deployment” from the Project menu in RAD Studio XE2.

Manual

An application package is internally represented by a folder with a fixed structure (folder structure diagrams: for Delphi; for C++Builder).

Note: To view the contents of an application package in Mac OS X, right-click the package and select “Show Package Contents” from the shortcut menu. All files needed to create the Mac OS X application package are supplied with RAD Studio XE2.
To create the Mac OS X application package manually, perform the following steps:

• create a FireMonkey Delphi or C++Builder application;
• add the OS X platform to Target Platforms and make it active;
• build the application using the Release build configuration;
• create a folder whose name consists of the project name plus the “.app” extension, for example, “MyProject.app”;
• create the Contents folder in the MyProject.app folder;
• copy the “MyProject_Directory\OSX32\Release\MyProject.info.plist” file to the Contents folder and rename it to info.plist;
• create the MacOS folder in the Contents folder;
• for Delphi applications: copy “RAD_Studio_XE2_Install_Directory\Redist\osx32\libcgunwind.1.0.dylib” to the MacOS folder;
• for C++Builder applications: copy “RAD_Studio_XE2_Install_Directory\Redist\osx32\libcgunwind.1.0.dylib”, “RAD_Studio_XE2_Install_Directory\Redist\osx32\libcgcrtl.dylib”, and “RAD_Studio_XE2_Install_Directory\Redist\osx32\libcgstl.dylib” to the MacOS folder;
• copy the “MyProject_Directory\OSX32\Release\MyProject” executable to the MacOS folder;
• create the Resources folder in the Contents folder;
• copy the “MyProject_Directory\OSX32\Release\MyProject.icns” file to the Resources folder.

After performing these steps, you can run the MyProject.app application package on Mac OS X.

Specificity of deploying applications that use Devart Data Access Components

Applications that use some of the Devart Data Access Components products require database client libraries to function on Mac OS X. Client libraries can be located in a public libraries directory (for example, /usr/lib) or in the MacOS folder of the application package.
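The manual bundle-assembly steps above can be sketched as a script. This is an illustration only, using Python's standard library in place of manual copying; the `build_app_bundle` function and its parameters are our own names, and the `project_dir`/`rad_redist` paths stand in for your RAD Studio XE2 output and Redist directories.

```python
import shutil
from pathlib import Path

def build_app_bundle(project_name: str, project_dir: Path, rad_redist: Path,
                     dest: Path, cpp_builder: bool = False) -> Path:
    """Assemble <project_name>.app with the Contents/MacOS/Resources layout."""
    bundle = dest / f"{project_name}.app"
    contents = bundle / "Contents"
    macos = contents / "MacOS"
    resources = contents / "Resources"
    for folder in (contents, macos, resources):
        folder.mkdir(parents=True, exist_ok=True)

    release = project_dir / "OSX32" / "Release"
    # info.plist goes to Contents (renamed from <project>.info.plist)
    shutil.copy(release / f"{project_name}.info.plist", contents / "info.plist")
    # runtime libraries and the executable go to Contents/MacOS
    libs = ["libcgunwind.1.0.dylib"]
    if cpp_builder:
        libs += ["libcgcrtl.dylib", "libcgstl.dylib"]
    for lib in libs:
        shutil.copy(rad_redist / "osx32" / lib, macos / lib)
    shutil.copy(release / project_name, macos / project_name)
    # the icon goes to Contents/Resources
    shutil.copy(release / f"{project_name}.icns", resources / f"{project_name}.icns")
    return bundle
```

The resulting folder mirrors the MyProject.app layout described in the steps, so the package can be run on Mac OS X once the files are real build outputs.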
The table below shows the required client libraries for each Devart Data Access Components product:

| DAC | Direct mode | Client mode |
| --- | --- | --- |
| [ODAC](https://www.devart.com/odac/) | Not required | Oracle client |
| [MyDAC](https://www.devart.com/mydac/) | Not required | libmysql.dylib |
| [IBDAC](https://www.devart.com/ibdac/) | for InterBase: libibtogo.dylib; for Firebird: libfbclient.dylib | for InterBase: libibtogo.dylib; for Firebird: libfbclient.dylib |
| [PgDAC](https://www.devart.com/pgdac/) | Not required | Not required |
| [SQLite (by UniDAC)](https://www.devart.com/unidac/) | libsqlite3.dylib | libsqlite3.dylib |
| [UniDAC](https://www.devart.com/unidac/) | corresponding libraries for the data provider used | corresponding libraries for the data provider used |

Specificity of deploying applications that use Devart dbExpress drivers

Applications that use some of the Devart dbExpress drivers require the database client libraries plus the libmidas.dylib library and the corresponding dbExpress driver library to function on Mac OS X. All required libraries can be located in a public libraries directory (for example, /usr/lib) or in the MacOS folder of the application package. The table below shows all required libraries for each dbExpress driver:

| dbExpress driver | Direct mode | Client mode |
| --- | --- | --- |
| [dbExpress driver for Oracle](https://www.devart.com/dbx/oracle/) | libmidas.dylib, libdbexpoda40.dylib | libmidas.dylib, libdbexpoda40.dylib, Oracle client |
| [dbExpress driver for MySQL](https://www.devart.com/dbx/mysql/) | libmidas.dylib, libdbexpmda40.dylib | libmidas.dylib, libdbexpmda40.dylib, libmysql.dylib |
| [dbExpress driver for InterBase & Firebird](https://www.devart.com/dbx/interbase/) | libmidas.dylib, libdbexpida40.dylib; for InterBase: libibtogo.dylib; for Firebird: libfbclient.dylib | libmidas.dylib, libdbexpida40.dylib; for InterBase: libibtogo.dylib; for Firebird: libfbclient.dylib |
| [dbExpress driver for PostgreSQL](https://www.devart.com/dbx/postgresql/) | libmidas.dylib, libdbexppgsql40.dylib | libmidas.dylib, libdbexppgsql40.dylib |
| [dbExpress driver for SQLite](https://www.devart.com/dbx/sqlite/) | libmidas.dylib, libdbexpsqlite40.dylib, libsqlite3.dylib | libmidas.dylib, libdbexpsqlite40.dylib, libsqlite3.dylib |

Tags [c++builder](https://blog.devart.com/tag/cbuilder) [delphi](https://blog.devart.com/tag/delphi) [macos development](https://blog.devart.com/tag/macos-development) [rad studio](https://blog.devart.com/tag/rad-studio)

[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) Apply Integrity Constraint to Your MySQL Database With the Help of dbForge Studio!
By [dbForge Team](https://blog.devart.com/author/dbforge) July 30, 2019

Our database tools team is pleased to announce the release of our dbForge MySQL products with CHECK constraint support. We look forward to seeing our customers benefit from it in their everyday work. The fact that MySQL did not support CHECK constraint syntax caused a lot of inconvenience for developers and DBAs: that deviation from the SQL standard complicated maintaining data integrity and assuring data quality. The CHECK constraint, introduced in version 8.0.16, was one of the most requested and long-awaited features for MySQL. At Devart, we aim to bring the latest innovations to our customers and keep them up to date with new technologies.

What is a CHECK constraint?

A CHECK constraint is a type of integrity constraint in SQL that lets users specify a condition on each row of a table. It is used to limit the range of values that can be placed in a column. The constraint must be a predicate and can refer to a single column or multiple columns of a table. Depending on the presence of NULLs, the result of the predicate can be:

• True
• False
• Unknown

If the predicate evaluates to UNKNOWN, the constraint is not violated, and the row can be inserted or updated in the table. In versions prior to MySQL 8.0.16, [CREATE TABLE](https://blog.devart.com/mysql-create-table-query.html) allowed only a limited version of the table CHECK constraint syntax, which was parsed and ignored:

CHECK (expr)

In MySQL 8.0.16, CREATE TABLE permits the core features of table and column CHECK constraints, for all storage engines.
CREATE TABLE permits the following CHECK constraint syntax, for both table constraints and column constraints:

[CONSTRAINT [symbol]] CHECK (expr) [[NOT] ENFORCED]

Where:

• symbol is an optional name for the constraint. If omitted, MySQL generates a name from the table name, a literal _chk_, and an ordinal number (1, 2, 3, …). Note that constraint names have a maximum length of 64 characters and are case-sensitive, but not accent-sensitive.
• expr specifies the constraint condition as a Boolean expression that must evaluate to TRUE or UNKNOWN (for NULL values) for each row of the table. If the condition evaluates to FALSE, it fails, and a constraint violation occurs.
• ENFORCED is an optional clause indicating whether the constraint is enforced. If omitted or specified as ENFORCED, the constraint is created and enforced. If specified as NOT ENFORCED, the constraint is created but not enforced.

A CHECK constraint is specified as either a table constraint or a column constraint:

• A table constraint does not appear within a column definition and can refer to any table column or columns. Forward references to columns appearing later in the table definition are permitted.
• A column constraint appears within a column definition and can refer only to that column.

Creating a CHECK constraint

The SQL-standard syntax for creating a CHECK constraint is supported in the column and table definitions of CREATE TABLE and ALTER TABLE statements.

mysql> CREATE TABLE t1 (c1 INTEGER CONSTRAINT c1_chk CHECK (c1 > 0),
    -> c2 INTEGER,
    -> CONSTRAINT c2_chk CHECK (c2 > 0),
    -> CONSTRAINT c1_c2_chk CHECK (c1 + c2 < 9999));
Query OK, 0 rows affected (0.05 sec)

mysql> SHOW CREATE TABLE t1\G
*************************** 1. row ***************************
       Table: t1
Create Table: CREATE TABLE `t1` (
  `c1` int(11) DEFAULT NULL,
  `c2` int(11) DEFAULT NULL,
  CONSTRAINT `c1_c2_chk` CHECK (((`c1` + `c2`) < 9999)),
  CONSTRAINT `c1_chk` CHECK ((`c1` > 0)),
  CONSTRAINT `c2_chk` CHECK ((`c2` > 0))
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
1 row in set (0.01 sec)

As mentioned earlier, MySQL generates a name for any CHECK constraint created without one. To see the generated names for a table definition, use the SHOW CREATE TABLE statement:

mysql> CREATE TABLE t1 (c1 INTEGER CONSTRAINT c1_chk CHECK (c1 > 0),
    -> c2 INTEGER CHECK (c2 > 0),
    -> c3 INTEGER,
    -> c4 INTEGER,
    -> CONSTRAINT c3_chk CHECK (c3 > 0),
    -> CHECK (c4 > 0),
    -> CONSTRAINT chk_all CHECK (c1 + c2 + c3 + c4 < 9999),
    -> CHECK (c1 + c3 < 5000));
Query OK, 0 rows affected (0.06 sec)

mysql> SHOW CREATE TABLE t1\G
*************************** 1. row ***************************
       Table: t1
Create Table: CREATE TABLE `t1` (
  `c1` int(11) DEFAULT NULL,
  `c2` int(11) DEFAULT NULL,
  `c3` int(11) DEFAULT NULL,
  `c4` int(11) DEFAULT NULL,
  CONSTRAINT `c1_chk` CHECK ((`c1` > 0)),
  CONSTRAINT `c3_chk` CHECK ((`c3` > 0)),
  CONSTRAINT `chk_all` CHECK (((((`c1` + `c2`) + `c3`) + `c4`) < 9999)),
  CONSTRAINT `t1_chk_1` CHECK ((`c2` > 0)),
  CONSTRAINT `t1_chk_2` CHECK ((`c4` > 0)),
  CONSTRAINT `t1_chk_3` CHECK (((`c1` + `c3`) < 5000))
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
1 row in set (0.02 sec)

As you can see, t1_chk_1, t1_chk_2, and t1_chk_3 are the names generated for the CHECK constraints. The SQL standard implies that all types of constraints (primary key, unique index, foreign key, check) belong to the same namespace. In MySQL, each constraint type has its own namespace per schema (database). Therefore, CHECK constraint names must be unique per schema; no two tables within the same schema can have the same CHECK constraint name.
Generating constraint names from table names helps ensure schema uniqueness, since table names must also be unique within the schema. Note that all CHECK constraints are enforced by default. If you want to create a CHECK constraint but not enforce it, use the NOT ENFORCED clause:

mysql> CREATE TABLE t1 (c1 INTEGER CHECK (c1 > 0),
    -> c2 INTEGER CHECK (c2 > 0) NOT ENFORCED);
Query OK, 0 rows affected (0.04 sec)

mysql> SHOW CREATE TABLE t1\G
*************************** 1. row ***************************
       Table: t1
Create Table: CREATE TABLE `t1` (
  `c1` int(11) DEFAULT NULL,
  `c2` int(11) DEFAULT NULL,
  CONSTRAINT `t1_chk_1` CHECK ((`c1` > 0)),
  CONSTRAINT `t1_chk_2` CHECK ((`c2` > 0)) /*!80016 NOT ENFORCED */
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
1 row in set (0.01 sec)

CHECK constraints support in dbForge Studio for MySQL and mini tools

dbForge Studio for MySQL 8.2, empowered with CHECK constraints support, has just been rolled out. We keep expanding functionality to please even the most demanding customers. CHECK constraints are now supported in the Visual Table Editor, Visual Database Diagram, Generate Script As, Database Explorer, Database Backup, Database Refactoring, Schema Compare, Data Compare, and in Code Completion for ALTER TABLE … ADD CONSTRAINT, CREATE TABLE, and ALTER TABLE.

Assuring data integrity is a prime task when working with databases, so CHECK constraint support in the dbForge products for MySQL helps avoid a number of problems developers faced before. On the whole, it results in lower error rates, time and effort savings, and higher data quality.
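The NULL-handling rule described above (UNKNOWN does not violate the constraint) follows SQL's three-valued logic. As an illustration only, not MySQL code, the acceptance rule for a predicate like CHECK (c1 > 0) can be sketched in Python, with `None` standing in for SQL NULL:

```python
def check_c1_positive(c1):
    """Mimics CHECK (c1 > 0): None stands in for SQL NULL, giving UNKNOWN."""
    if c1 is None:
        return None          # UNKNOWN
    return c1 > 0            # TRUE or FALSE

def row_accepted(predicate_result):
    """A row is rejected only when the predicate is definitely FALSE."""
    return predicate_result is not False

print(row_accepted(check_c1_positive(5)))     # True  (predicate TRUE)
print(row_accepted(check_c1_positive(None)))  # True  (predicate UNKNOWN)
print(row_accepted(check_c1_positive(-1)))    # False (predicate FALSE)
```

This is why a row whose checked column is NULL can still be inserted: only a predicate that evaluates to FALSE raises a constraint violation.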
Tell Us What You Think

We welcome you to [try the new version](https://www.devart.com/dbforge/mysql/studio/download.html) of dbForge Studio for MySQL and [share your thoughts](https://www.devart.com/dbforge/mysql/studio/feedback.html) about the release with us. We’re always looking for ways to improve. Share your experience with our team and help us keep you satisfied.

Tags [check constraint](https://blog.devart.com/tag/check-constraint) [MySQL](https://blog.devart.com/tag/mysql)

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) ASP.NET Core Blazor Best Practices — Architecture and Performance Optimization By [Victoria Shyrokova](https://blog.devart.com/author/victorias) November 7, 2024

Adopting Blazor best practices is the only way to get your ASP.NET Core applications on the right track. Rather than juggling C# for the backend logic and JavaScript for the frontend interactions, you can use Blazor to build interactive web apps using only C#. It makes your development process easier, reduces context switching, and lets you create a more cohesive codebase. That said, working with Blazor can sometimes throw you a curveball. If you’re building a large-scale app, you may run into slow load times and clunky rendering. You may also struggle with state management, especially when trying to keep data in sync across multiple components. Plus, connecting a server-side Blazor app to a database can be a nightmare without a solid data access layer, and if you are using Blazor WebAssembly, things get worse. In this article, we’ll help you navigate these common pitfalls. We’ll share key Blazor best practices to optimize your architecture and enhance performance, ensuring your applications are efficient and user-friendly.
Table of contents

• ASP.NET Core Blazor architecture best practices
• Performance optimization best practices
• Reliability and security best practices
• Benefits of using ASP.NET Core Blazor and dotConnect
• Conclusion

ASP.NET Core Blazor architecture best practices

Building complex applications is never easy. However, following Blazor architecture best practices when designing component interactions and managing state can make quite a difference. It comes down to choosing the right architecture for your project, keeping your components modular and reusable, and leveraging the power of C# and .NET.

Choose the right hosting model

Depending on your app’s requirements and personal preference, you have two primary options: Blazor WebAssembly (WASM) or Blazor Server. Here’s how they stack up against each other:

| Feature | Blazor WebAssembly | Blazor Server |
| --- | --- | --- |
| Architecture | Decoupled architecture that requires a Web API backend. | Connects directly to the database using dependency injection. |
| Frontend flexibility | API operates independently; easy to switch frontends (e.g., React, Vue.js). | Tightly coupled with server-side processing. |
| Client-side processing | Runs C# code directly in the browser. | All processing happens on the server. |
| Performance | Near-native, since code executes right in the browser. | Can suffer from latency, because every action involves a round trip to the server. |
| User experience | Responsive; suitable for users with slow or intermittent connections. | May not provide fast response times for public websites. |
| Real-time communication | Not inherently supported; can be implemented using SignalR with additional setup. | Built-in support via SignalR. |
| Offline capabilities | Can work offline after the initial load. | Doesn’t support offline functionality. |
| Security | Code runs in the client browser, which may expose sensitive logic to users. | Server-side execution keeps critical data from exposure. |
| Use cases | Ideal for single-page applications, progressive web apps, and offline apps (e.g., inventory management, CRM tools). | Best for enterprise applications in finance, healthcare, and government services. |
| Scalability | Limited by client resources; relies on the user’s device capabilities. | Scales well with server resources; can handle many concurrent users. |

Not sure which model to use? You can have it both ways with ASP.NET Core Blazor Hybrid, which combines client-side and server-side functionality. At the same time, you can manage high-security operations, like sensitive data transactions, through a WebSocket connection to your backend API.

Build efficient and streamlined components

Blazor’s component-based architecture is all about creating small, focused components that encapsulate both UI and logic. This minimizes complexity and makes it easier to scale your project as it grows. A few tips for working with Blazor components:

• Keep presentation components simple. They should only display the information they’re given and trigger events when needed. For tasks like checking whether a username is already taken, it’s fine to use injected services, but avoid using services to fetch data directly within your presentation components.
• Centralize your business logic. Keep business functions within services that interact with the back end. Your presentation and container components should only rely on these services to access and manipulate data.
• Favor composition over inheritance. Build components from smaller, reusable pieces rather than creating complex hierarchies; this keeps your code flexible and easier to manage.
• Use container components for coordination. Implement container components (like pages) to coordinate multiple presentation components. They handle the heavy lifting of fetching data from services and then distribute it to the presentation components.
• Organize your project structure.
Group shared components into a Shared project with clear folders. For larger features, create separate folders, or even separate projects, for components and services that are common across multiple pages within that feature.

Maximize C# and .NET integration

Whether you’re building a new web app from scratch or connecting it to existing services, use C#’s asynchronous programming, LINQ, and generics to write concise, efficient, and type-safe code. Another good practice is to write your business logic once in C# and reuse it in both your client-side and server-side projects. For example, if you have a utility class that performs calculations or data transformations, you can reference it in both your Blazor WASM and Blazor Server apps. This prevents duplication and keeps the codebase consistent. In addition, since Blazor is built on ASP.NET, you’re free to reach for a ton of pre-built tools that save you time and effort in the long run. Use .NET libraries like [Newtonsoft.Json](https://github.com/JamesNK/Newtonsoft.Json) for JSON serialization, [NLog](https://nlog-project.org/) and [Serilog](https://serilog.net/) for logging, and [IdentityServer](https://github.com/identityserver) for authentication. Also, focus on building reusable UI components using Razor syntax. It allows you to create a consistent look and feel across your application, which simplifies updates and maintenance. For database interactions, use [Entity Framework Core](https://www.devart.com/dotconnect/what-is-entity-framework-core.html). This ORM abstracts away the complex SQL queries behind the scenes, so you don’t have to write them yourself. You do the CRUD operations directly in your C# code, which makes working with your database a whole lot easier and keeps things type-safe. Interested in how to efficiently connect different databases with a Blazor app?
Check out these tutorials:

• [How to use SQLite and Entity Framework Core in Blazor >](https://www.devart.com/dotconnect/sqlite/sqlite-efcore-blazor.html) Explore how to use the Code-First approach to integrate SQLite with a Blazor application and design and manage a database directly from C# code.
• [How to use MySQL and Entity Framework Core in Blazor >](https://www.devart.com/dotconnect/mysql/mysql-efcore-blazor.html) Learn the Code-First method to connect MySQL to a Blazor application, giving you full control over database design and management with C# code.
• [How to use PostgreSQL and Entity Framework Core in Blazor >](https://www.devart.com/dotconnect/postgresql/postgresql-efcore-blazor.html) Discover best practices for integrating PostgreSQL with your Blazor application, enabling database design and management directly from C# code.

Performance optimization best practices

.NET developers have been talking about Blazor’s performance for a while now, with some expressing frustration about slow load times and overall sluggishness. While there’s a bit of truth to this, with the proper approach you can build fast, responsive apps that meet your users’ expectations.

Master component lifecycle management

Blazor’s component lifecycle is similar to that of other frameworks, like React and Angular. You can hook into each stage of the process with methods like “OnInitialized”, “OnParametersSet”, and “OnAfterRender” to add custom behavior. The more calls to these lifecycle methods Blazor has to make, the greater the overhead and the impact on performance. So the key lies in building components that are lightweight and optimized. How? Avoid creating thousands of component instances for repetitive UI elements. When it makes sense, inline child components in their parent components rather than making an entire separate tree for them.
For example, instead of having a separate “Notification” component for each alert in a list, simply render them directly within a loop in the parent “NotificationList” component.

Optimize rendering speed

First, focus on reducing automatic or unnecessary re-renders. Where possible, set the parameters of child components using immutable types like “string”, “float”, and “bool”; this allows Blazor to skip re-rendering when values have not changed. For complex parameters, override the “ShouldRender” method to control when a component needs to re-render. For example, if your app contains a component showing a grid of products, pass it an immutable list of product IDs as a parameter, something like “new List<int> { 101, 102, 103 }”. Blazor won’t waste time re-rendering the UI until you add new products or remove existing ones.

Also try the agnostic render mode in .NET 8. It lets you decide whether to use server-side or client-side rendering, depending on what works best for your application. Say you have a “UserProfile” component displaying information about users. You can start with server-side rendering to get a quick initial load, then switch to client-side rendering for faster updates when users edit their profiles. This way, they won’t have to wait around, and everything will feel much smoother.

Implement efficient event handling

When an event is triggered, such as a button click, it can cause the component to re-render. If your event handlers are designed efficiently, you can minimize unnecessary renders and keep your UI snappy even when there’s a lot going on. One option is the “IHandleEvent” interface, which lets you handle events without forcing “StateHasChanged()” after each event, as happens when you handle events directly in your component. Here’s an example:

TaskList.razor

@page "/tasklist"
@implements IHandleEvent

<h3>Task List</h3>

<ul>
    @foreach (var task in tasks)
    {
        <li>
            <input type="checkbox" @onclick="() => ToggleTaskCompletion(task)" checked="@task.IsCompleted" />
            @task.Name
        </li>
    }
</ul>

@code {
    private List<TaskItem> tasks = new();
    private string newTask;

    Task IHandleEvent.HandleEventAsync(EventCallbackWorkItem callback, object? arg) =>
        callback.InvokeAsync(arg);
}

In this example, if you use lambda expressions for each checkbox’s onclick event, Blazor might get sluggish when rendering a large number of tasks. To speed things up, create a collection of task objects and assign each task’s “@onclick” delegate to an Action, which prevents Blazor from rebuilding all the task delegates on each render. The result is a noticeable performance boost, especially when dealing with many interactive elements.

Use virtualization for large data sets

If your application loads a lot of data at once, your users are in for quite a wait before they can use it. And if we are talking about thousands of entries or more, the wait becomes unbearable. An easy way to avoid this is the “Virtualize” component. It renders only the elements currently in view, improving responsiveness and reducing the load on the UI. Here is an example of how you can implement it:

Orders.razor

@page "/orders"

<h3>Order List</h3>

<ul>
    <Virtualize Items="orders" Context="order">
        <li>
            @order.Id: @order.Description
        </li>
    </Virtualize>
</ul>
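The windowing arithmetic behind a virtualized list is language-agnostic: from the scroll offset, the viewport height, and a fixed item height, only a small slice of items needs to be rendered. A sketch of that arithmetic (the function and parameter names are ours, not Blazor's, and the overscan value is an assumed buffer):

```python
import math

def visible_window(scroll_top: float, viewport_height: float,
                   item_height: float, item_count: int, overscan: int = 3):
    """Return (first_index, count) of the items worth rendering."""
    # first item whose top edge is near the viewport, minus an overscan buffer
    first = max(0, int(scroll_top // item_height) - overscan)
    # enough items to fill the viewport, plus buffers above and below
    visible = math.ceil(viewport_height / item_height) + 2 * overscan
    count = min(visible, item_count - first)
    return first, count

# 10,000 orders, 40px rows, 600px viewport, scrolled 4,000px down:
first, count = visible_window(4000, 600, 40, 10_000)
print(first, count)  # → 97 21: about 21 rows rendered instead of 10,000
```

However the framework implements it internally, this is why the “Virtualize” component keeps the DOM small regardless of how many rows the data source holds.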
Lazy load your components

Much like virtualization, lazy loading means fetching only the data or components your users need at any given moment. However, lazy loading specifically delays the loading of resources until they’re actually needed, which can improve initial load times even more. The simplest way to do this is to use the @page directive with a RouteView component to load specific components based on the current route. Following the example above, you can set up your routing so that the “OrderList” component is loaded when navigating to the “/orders” route, like this:

Orders.razor

@page "/orders"

<h3>Order List</h3>

<OrderList />

OrderList.razor

@page "/orders/list"

<h3>Order List</h3>

<ul>
    <Virtualize Items="orders" Context="order">
        <li>
            @order.Id: @order.Description
        </li>
    </Virtualize>
</ul>
Optimize JavaScript interop

Communication between .NET and JavaScript can be slow because it involves a few extra steps. First, calls are asynchronous, and each call can introduce latency. Second, data needs to be translated into a format both sides understand (JSON), which takes time. A few things you can do to improve JavaScript interop speed:

• Reduce the number of calls. Instead of calling a JavaScript function separately for each item in a list, pass the entire list as a single parameter and handle it in one call.
• Use the “[JSImport]” and “[JSExport]” attributes. These let .NET code and JavaScript talk directly without going through the traditional interop API, which can be slower and less stable.
• Cache results. If a JavaScript function returns results that don’t change frequently, consider caching them to avoid unnecessary calls into JavaScript.
• Try synchronous calls. This is only available in Blazor WASM apps (ASP.NET Core 5.0 or later), but it’s a way to get immediate replies. Use it sparingly, since overuse can slow things down. To call a .NET method from JavaScript, use “DotNet.invokeMethod” for a synchronous call or “DotNet.invokeMethodAsync” for an asynchronous one.

Minimize app size with AOT compilation and IL trimming

In Blazor development, size matters. A large app takes longer to download, making users wait and potentially abandon your app before it even loads. This is especially true for Blazor WebAssembly (WASM) applications, because the entire app and its dependencies need to be downloaded to the client’s browser. To optimize your app’s size, you can use two main techniques: ahead-of-time (AOT) compilation and Intermediate Language (IL) trimming. With traditional compilation, you build all the pieces separately and assemble them on site (in the browser). AOT compilation, on the other hand, pre-assembles everything into a single, ready-to-run package.
This pre-assembly makes your app run faster, especially for tasks that require a lot of processing power. The downside is that the final package is larger and takes longer to download. That’s why you should couple AOT compilation with runtime relinking, which strips unused parts of the runtime from the download. IL trimming, in contrast, analyzes your app’s code at build time and removes any redundant parts from the final build.

Reliability and security best practices

Keeping users happy means delivering a reliable and safe experience. Security is also crucial to protect the integrity of your application, especially if you’re working on internal enterprise apps that handle sensitive data.

Set up proper error handling and logging

If your Blazor application throws an exception and doesn’t have proper error handling in place, users get a cryptic error message that leaves them confused and frustrated. Not exactly the experience you’re trying to give them, right? To prevent this, implement a solid error-handling solution that displays friendly messages while logging the actual errors for your review. This way, you can diagnose issues without exposing sensitive information. You can use the built-in ASP.NET Core logging framework, which lets you record information, warnings, and errors in different places such as consoles and files. You can also integrate third-party libraries like Serilog or NLog: Serilog gives you structured logging, which makes analyzing complex data much easier, while NLog makes it simple to route your logs to various targets. When logging, always be extra careful about sensitive data. Log messages should not directly expose personal data; use placeholders instead. For example, log “User {username} attempted to log in” rather than including details that may be too sensitive.

Enforce ASP.NET Core authentication and authorization

Use ASP.NET Core Identity to manage user accounts and roles smoothly.
With the [Authorize] attribute, you can restrict access to specific controllers or Blazor pages based on user roles (like Admin or User). To get started, configure authentication in your Startup.cs file (or Program.cs in apps using the minimal hosting model). This involves adding services such as "AddAuthentication" and "AddAuthorization" and defining the access policies that dictate who can access what based on roles or claims. For example, to ensure that only users with an Admin role can access certain Blazor pages, create a policy for that and apply it with the [Authorize(Policy = "RequireAdminRole")] attribute on your Blazor components.

Note: ASP.NET Core supports both role-based and claims-based authorization models. In a nutshell, role-based authorization controls access based on user roles, while claims-based authorization evaluates individual user attributes (claims), giving you much finer-grained rules. This lets you apply complex authorization logic tailored to your application's needs, enhancing security and adaptability as your user base grows.

Stick to HTTPS

If your Blazor ASP.NET app handles user data, HTTPS is non-negotiable. It encrypts everything transmitted between the client and server, which helps prevent eavesdropping and man-in-the-middle attacks. Head over to your Startup.cs file and make sure you include the UseHttpsRedirection middleware; this simple step redirects all HTTP requests to HTTPS.

Tackle OWASP security risks

Following the above Blazor best practices can help you fight off some of the [top 10 OWASP risks](https://owasp.org/www-project-top-ten/), like broken access control and cryptographic failures. However, you still need to keep an eye out for several other potential threats. Here are some additional OWASP issues to watch out for, along with how to address them:

Server-side request forgery (SSRF). Validate and sanitize all user-supplied URLs to restrict requests to trusted domains only.

Security misconfiguration.
Regularly review and harden server configurations; it's a good idea to use automated tools like Azure Security Center for ongoing security assessments.

Unrestricted resource consumption. Use rate limiting and request throttling to control API usage and prevent denial-of-service attacks.

Cross-site scripting (XSS). Validate user input and properly encode output to stop malicious scripts from running in the browser.

Benefits of using ASP.NET Core Blazor and dotConnect

Managing data connections and ensuring smooth data flow in your Blazor apps can be a real headache. You might face complex data access patterns, performance bottlenecks, or compatibility issues with different data sources. [dotConnect](https://www.devart.com/dotconnect/) helps streamline these challenges. Built on the ADO.NET architecture, its data providers work with ASP.NET Core Blazor and other .NET frameworks like ASP.NET MVC and .NET Core.

Fast data access. Experience fast component loading and optimized data handling.

Direct connectivity. Connect directly to most major databases and data sources using [Entity Framework](https://www.devart.com/dotconnect/entityframework.html) or Entity Framework Core.

Enhanced performance. Benefit from batch updates and LINQ support to improve performance and reduce database round trips.

Visual data modeling. Quickly build and manage your data models visually with dotConnect's [Entity Developer](https://www.devart.com/entitydeveloper/).

Conclusion

Implementing Blazor best practices goes a long way toward enhancing your application's responsiveness, but a robust connectivity solution is essential for optimizing data flow and component rendering. Try [dotConnect](https://www.devart.com/dotconnect/) for fast, efficient data handling and loading. It integrates seamlessly across ASP.NET Core Blazor apps and keeps performance optimized even if you decide to branch out into other .NET frameworks.
Tags [ASP.NET](https://blog.devart.com/tag/asp-net) [Blazor](https://blog.devart.com/tag/blazor) By [Victoria Shyrokova](https://blog.devart.com/author/victorias)
"} {"url": "https://blog.devart.com/audit-and-rollback-transactions-live-with-dbforge-transaction-log-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What's New](https://blog.devart.com/category/whats-new) Audit and Rollback Transactions Live with dbForge Transaction Log for SQL Server By [dbForge Team](https://blog.devart.com/author/dbforge) October 20, 2020

Today we'd love to unveil a big update of [dbForge Transaction Log for SQL Server](https://www.devart.com/dbforge/sql/transaction-log/). This tool was designed to provide detailed information on all changes in your SQL Server database, recover data, and easily revert unwanted transactions. The newly released version 2.0 adds a new feature, live audit of changes and rollback of transactions, which is invaluable for databases that must be kept up and running 24/7. This means you no longer have to detach your databases from the server to view transaction logs and take action. You now have three viewing options to choose from:

Online NTFS (read transaction logs directly)
Online VSS (read transaction logs via Shadow Copy)
Offline (for detached databases)

Currently, live audit and rollback are available only if your SQL Server instance and dbForge Transaction Log are deployed on the same machine.
Here we'd like to reveal a little secret: our next goal is to remove this constraint and give you even more freedom. Please feel free to download the full-featured [dbForge Transaction Log for SQL Server](https://www.devart.com/dbforge/sql/transaction-log/download.html) for a free 30-day trial. We would also appreciate your feedback; it helps us keep our [database tools](https://www.devart.com/dbforge/) in top shape for you. Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbforge transaction log](https://blog.devart.com/tag/dbforge-transaction-log) [sql server transaction log](https://blog.devart.com/tag/sql-server-transaction-log) [sql server transactions](https://blog.devart.com/tag/sql-server-transactions) [transaction log](https://blog.devart.com/tag/transaction-log) By [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/automated-database-deployment-and-releases-with-jenkins-and-dbforge.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Automated Database Deployment and Releases with Jenkins and dbForge By [dbForge Team](https://blog.devart.com/author/dbforge) February 12, 2021

Introduction to the CI process

The modern software development process is impossible without constant improvement. Each change produces a new build, and only a build that has been tested in its entirety can be pushed to production. Continuous integration (CI) is the process that verifies the creation and thorough testing of all those builds. This way, developers ensure that each new change works as planned and that all other modules and functions remain correct. The process can't be manual: it would be too long, tiresome, and complicated. Automating continuous integration, in turn, offers all kinds of benefits:

Developers have more time, as they don't have to hand-craft database scripts and deploy them manually.
Any issues or bugs are easy to detect for each build.
Database deployment becomes a simple process that you perform with a few mouse clicks.

The biggest advantage is that automation removes the risk of errors for both application and database releases. The latter is crucial, as databases are the core of applications: a bug that slips into a database causes far more problems.

The Specificity of the Jenkins Continuous Integration Server

The key element is the continuous integration server. It is the driving force that manages builds, tests and deploys them, reports the results, and documents all the details for developers and analysts. Among the leaders in this field is Jenkins. The Jenkins CI server is an extremely popular open-source solution with wide automation options. It also allows integrating other applications via API and third-party build tools, which makes it even more powerful. Many specialists consider Jenkins the de facto CI server standard. Because it runs in a Java environment, it is compatible with Windows, Unix, and Linux. Due to its open-source nature, it is the default choice for many smaller companies that can download and use it for free. An impressive number of libraries and plugins lets users adapt the CI server to all their needs and to operations of any complexity. Additionally, watch these videos to discover how dbForge products can boost database development.
[How dbForge SQL Complete is involved in the Database DevOps process](https://youtu.be/RNgxe_8InU0)
[How to import data to SQL Server database with dbForge Data Pump during the DevOps process](https://youtu.be/R7nq351mlHo)
[Creating database documentation during the Continuous Integration workflow](https://youtu.be/S4W0ybixQII)
[How to automate database schema changes for the CI process during database deployment](https://youtu.be/hllTzoXvoO8)
[dbForge Source Control in the DevOps pipeline](https://youtu.be/reU4ALv2ctg)
[Test data generation in the Continuous Integration and Deployment processes](https://youtu.be/G3GNo0i03bk)
[Unit Testing for SQL Server Database in DevOps process](https://youtu.be/3A5JEs3Nz0I)

DevOps Automation on Jenkins with the Devart dbForge Plugin

[Devart dbForge DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) provides Jenkins CI server support. There is a dedicated [plugin](https://plugins.jenkins.io/dbforge-devops-automation-for-sqlserver/) for establishing and configuring all continuous integration phases on Jenkins:

The Build stage: the solution deploys the database on LocalDB or another specified SQL Server and generates a NuGet package from the version control repository.
The Test stage: the plugin launches the tSQLt unit tests and generates test data.
The Synchronization stage: it deploys the generated NuGet package and syncs it with the working database.
The Publish stage: the tool publishes the generated NuGet package to the NuGet feed for deployment.

dbForge DevOps Automation for SQL Server lets users reduce database release costs, improve update quality and the overall workflow, and minimize the risk of deployment errors. Below, we'll examine the usage of this plugin in a practical scenario. If you have never worked with Devart products, you need to install them on the machine serving as the build agent.
Choose one of the options below (fully functional free trials are available for each of the mentioned tools):

The [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle includes all the tools necessary for setting up DevOps on the Jenkins CI server.
[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is a multi-featured, multi-purpose solution that includes DevOps automation functionality, among other things.
Finally, you can use the separate dbForge tools designed for the DevOps tasks: [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/), [dbForge Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/), [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/), and [dbForge Unit Test for SQL Server](https://www.devart.com/dbforge/sql/unit-test/).

Besides, you need [dbForge DevOps Automation PowerShell for SQL Server](https://www.powershellgallery.com/packages/Devart.DbForge.DevOpsAutomation.SqlServer/1.0.147); get it from the PowerShell Gallery and install it.

Install the Plugin

The dbForge DevOps Automation for SQL Server plugin is present in the standard Jenkins collection, so you install it the same way as any other plugin.

1. On the Jenkins home page, navigate to Manage Jenkins > Manage Plugins.
2. On the Available tab, select the dbForge DevOps Automation for SQL Server plugin (you can use the Search option to find it faster).
3. Install the plugin using the default Jenkins options.

Create a new SQL CI job

After the plugin installation, we create a continuous integration job.

1. Navigate to the Jenkins home page > New Item.
2. Give the project a name, specify the project type, and click OK.

Configure the CI job

1.
Enter the project description and specify the path to the working directory.

Note: Jenkins assumes that the version control system is already linked for pulling changes from the script folders. If not, you can arrange to pull the changes from your VCS as a separate Jenkins job.

2. Configure Build Triggers. For instance, you can set builds to run on a schedule.

Build the database package

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server.

At this stage, you build the database package and deploy it from the Script Folder on the server, using the installed dbForge DevOps Automation for SQL Server plugin.

1. Click Add build step > Build a database package (note the plugin name defined for the step).
2. In the Build window, provide the following details:

The Subfolder location, relative to the Script Folder. Ensure that the path set at the previous step is correct.
The Package ID for the NuGet package that the system will generate. This ID serves as the identifier in the further steps.
The Temporary database server name, where you will deploy the database from the source folder.
The Temporary database name, which defines the database used for deployment.

Test the database using tSQLt

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server, and dbForge Unit Test for SQL Server.

Unit tests validate the SQL scripts deployed on the server at the Build stage. To configure the process:

1. Click Add build step > Test a database using tSQLt.
2. In the configuration window, specify the Package ID and the server and database names that were set at the previous stage.

Publish the database package

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server.

At this stage, we pack the Script Folder and publish the NuGet package on the specified server.

1. Click Add build step > Publish a database package.
2.
In the configuration window, define the Package ID and specify the upload path for the package.

Run the project

Run the process manually from the Jenkins home page: select the necessary project and click the icon next to it. You can view the execution results in the Console Output. It provides both the general information and the data for each previously described step (unit test results or NuGet package publishing results), including the results of creating the database from the Script Folder.

Using the dbForge DevOps Automation plugin for Jenkins is a method favored by many professionals. The plugin steps include all the necessary commands and put them in the correct sequence, so there is no need to enter them manually during job configuration; the dbForge tools take care of them. However, there is another method that is also applicable for automating CI processes: using the Jenkins command-line interface.

Automate the database releases on Jenkins with the dbForge tools and Command-Line

For these jobs, you will need dbForge Studio for SQL Server, which has all the necessary functionality for working with the command line. Alternatively, you can use the separate tools listed at the beginning of this article.

Select the method

To automate the CI jobs using the command line, navigate to Add build step and select Run with timeout from the drop-down menu. Jenkins will open a window to configure the step. Choosing this approach has an additional advantage: you can limit how long the operation is allowed to run, which is convenient for avoiding unpredicted hangs. You can also check the ExitCode of the executed command in the Advanced options.

Automate the database releases through Command-Line

We'll use a simple scenario with a test example database called AdventureWorks2019. Currently, this database is located in a Git repository. Hence, we need to perform the following steps:

1.
Download the script folder into the temporary directory of our machine. In our case, the directory is D:\Temp\DevOps\. Execute the following command in Jenkins:

git clone https://github.com/svetlanafet/AdventureWorks2019.git D:\Temp\DevOps

2. Create the AdventureWorks2019 test database with its objects by executing the following scripts from the command line.

The [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) script:

cd "C:\Program Files\Devart\dbForge Studio for SQL server"
dbforgesql.com /execute /connection:"%user connection%" /inputfile "D:\Temp\DevOps\Create_AdventureWorks2019.sql"
dbforgesql.com /schemacompare /compfile:"D:\Temp\DevOps\AdventureWorks2019.scomp" /sync

The [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) script:

cd "C:\Program Files\Devart\dbForge Schema Compare for SQL Server"
schemacompare.com /execute /connection:"%user connection%" /inputfile "D:\Temp\DevOps\Create_AdventureWorks2019.sql"
schemacompare.com /schemacompare /compfile:"D:\Temp\DevOps\AdventureWorks2019.scomp" /sync

3. Now we can deploy the data to the database. All the listed dbForge tools work via the command line and participate in automation.

Note: First, we must configure the (.scomp) template file in dbForge Schema Compare for SQL Server. It is necessary for syncing our script folder with the server.

Generate test data

dbForge also provides functionality for data generation at the Test stage, before running the tSQLt unit tests. This comes in handy when users need to deploy large data volumes but can't or don't want to store all that data on the drives. Here we need to configure the (.dgen) project file with all the settings and rules for test data generation (the applicable tool is dbForge Data Generator for SQL Server). Then, place this file in the VCS so that the system can resolve the correct path in the checkout directory.
For [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), use the following command to deploy the test data into the table:

dbforgesql.com /generatedata /projectfile:"D:\Temp\DevOps\Addressr.dgen"

For [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/), use the command below:

datagenerator.com /generatedata /projectfile:"D:\Temp\DevOps\Addressr.dgen"

Whether you prefer the dedicated plugin for Jenkins or automating the database releases through the command line, you can apply the dbForge tools for SQL Server. They all serve automation purposes well and help you remove tiresome routines.

Tags [ci jenkins](https://blog.devart.com/tag/ci-jenkins) [dbforge](https://blog.devart.com/tag/dbforge) [jenkins continuous deployment](https://blog.devart.com/tag/jenkins-continuous-deployment) [jenkins continuous integration](https://blog.devart.com/tag/jenkins-continuous-integration) [jenkins devops](https://blog.devart.com/tag/jenkins-devops) By [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/automatic-schema-comparison-log-delivery-via-email.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Automatic Schema Comparison Log Delivery via Email By [dbForge Team](https://blog.devart.com/author/dbforge) March 19, 2019

In this article, we will show how to set up automatic email delivery of a log file when a comparison of multiple SQL schemas fails. The task will be completed by means of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). Specifically, we will use its command-line functionality, which allows performing routine tasks in both the command prompt and PowerShell. In this article, we will focus on the command prompt.

Script Creation

To complete the task, we will need the following set of files:

TXT files with the Source and Target connections.
A PS1 file with mail settings.
A BAT file to run both the comparison and the mail delivery.

Setting Source and Target

First, we need to create two TXT files containing the Source and Target connections.
For this, we need to do the following:

Open a new Notepad document and specify the source servers and databases, separated by a comma. In our case, we specify the following two databases:

DBMSSQLx64\MSSQL2016, dev_database
DBMSSQLx64\MSSQL2014, test_database

Save the new TXT file. Do the same for Target. In our case, we specify the following:

DBMSSQLx64\MSSQL2016, test_database
DBMSSQLx64\MSSQL2014, dev_database

Note that to illustrate the failed comparison scenario in this article, we have intentionally included databases that do not exist (test_database and dev_database on DBMSSQLx64\MSSQL2014).

Configuring email settings

Next, we need to configure email settings for both the sender and the addressee. For this, we create a Windows PowerShell cmdlet file (.ps1): a script that contains a series of lines written in the PowerShell scripting language. To create a PS1 file with mail settings:

Open a plain text editor such as Notepad and put in the following code:

$emailFrom = "email_from@test.com"
$emailTo = "email_to@test.com"
$subj = "email_subject"
$body = ""
$file = "path_to_file"
$smtpServer = ""

$att = new-object Net.Mail.Attachment($file)
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg = new-object Net.Mail.MailMessage

$msg.From = $emailFrom
$msg.To.Add($emailTo)
$msg.Subject = $subj
$msg.Body = $body
$msg.Attachments.Add($att)

$smtp.Send($msg)
$att.Dispose()

Replace the quoted values of the following parameters with your data:

$emailFrom = "email_from@test.com" – the sender email address.
$emailTo = "email_to@test.com" – the addressee email address.
$subj = "email_subject" – the mail subject.
$body = "" – any text for the mail body, if required.
$file = "path_to_file" – the path to the log file.
$smtpServer = "" – the SMTP server of your mail service.

Save the file with the .ps1 extension.
Bat File

Finally, we can put everything together and create an executable BAT file to run the whole task via the command-line interface. To create a BAT file:

Open a plain text editor such as Notepad and put the following code into the file:

Set Compare="C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"
Set Sender= powershell.exe

FOR /F "eol=; tokens=1,2* delims=, " %%e in (Source_Servers_and_DBs.txt) do (
FOR /F "eol=; tokens=1,2* delims=, " %%g in (Target_Servers_and_DBs.txt) do (
%compare% /schemacompare /source connection:"Data Source=%%e;Initial Catalog=%%f;Integrated Security=False;User ID=sa" /target connection:"Data Source=%%g;Initial Catalog=%%h;Integrated Security=False;User ID=sa" /log:Compare_result.log
(
if %ERRORLEVEL%==0 %Sender% -File D:\temp\sync_to_mail\PowerShell\send_email_script.ps1
cd.>Compare_result.log
)
)
)

pause

Where:

Source_Servers_and_DBs.txt is the name of the file containing the source connections.
Target_Servers_and_DBs.txt is the name of the file containing the target connections.
D:\temp\sync_to_mail\PowerShell\send_email_script.ps1 is the location and name of the script with the mail settings.

For more information on the command-line syntax, refer to the product documentation. Save the file with the .bat extension.

Script Execution

Now we have everything we need. Let's run the created BAT file and see what happens.

Comparison #1

The first comparison, between the databases located on SQL Server 2016, succeeded: differences were found, and the log file was emailed. As for the remaining three comparisons, each of them involves a non-existent database in Source or Target, so they all resulted in errors. In each of these cases, the corresponding log file was generated and sent to the specified email address.

Comparison #2
Comparison #3
Comparison #4

Now let's check the inbox.
As expected, we received 4 emails with the log attached: Comparison log #1 Comparison log #2 Comparison log #3 Comparison log #4 Conclusion In this article, we provided a solution for automatic delivery of schema comparison log files to an email when comparing schemas of multiple SQL Server databases. To complete this task, we created a bat file that copes with the task in a single click. Further, we can create synchronization task in Windows Scheduler, and the process will become a 100-percent automatic. The command-line functionality of dbForge Studio for SQL Server provides a bunch of options and possibilities for customizing schema comparison to your specific needs. Try dbForge Studio for SQL Server and check how it can help you in your DB tasks. Tags [command line](https://blog.devart.com/tag/command-line) [Schema Compare](https://blog.devart.com/tag/schema-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [studio for sql server](https://blog.devart.com/tag/studio-for-sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fautomatic-schema-comparison-log-delivery-via-email.html) [Twitter](https://twitter.com/intent/tweet?text=Automatic+Schema+Comparison+Log+Delivery+via+Email&url=https%3A%2F%2Fblog.devart.com%2Fautomatic-schema-comparison-log-delivery-via-email.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/automatic-schema-comparison-log-delivery-via-email.html&title=Automatic+Schema+Comparison+Log+Delivery+via+Email) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/automatic-schema-comparison-log-delivery-via-email.html&title=Automatic+Schema+Comparison+Log+Delivery+via+Email) [Copy URL](https://blog.devart.com/automatic-schema-comparison-log-delivery-via-email.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use 
Mastering the Average Function With dbForge Studio for SQL Server

By [Nataly Smith](https://blog.devart.com/author/nataly-smith), April 29, 2024

In data analysis, the AVG() function stands out as a useful tool for extracting trends, patterns, and key metrics from datasets. In this article, we focus on the practical applications, advanced techniques, and hands-on examples that will equip you with the skills needed to unlock the full potential of the AVG() function. Additionally, we will delve into its arguments, ALL and DISTINCT. To illustrate these concepts effectively, we will use [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) as our primary instrument, providing clear examples of how these functionalities operate in real-world scenarios.
Contents

- Understanding the AVG() function
- Practical uses of the AVG() function
- Try it yourself with dbForge Studio
- Data Generator for SQL Server
- Hands-on examples
- Advanced techniques with AVG()
- Further learning
- Conclusion

Understanding the AVG() function

If you are a frequent guest on our blog, you already know that we always start with the basics and work our way up to more complex topics. Today is no exception, so let us move on to understanding the AVG() function. It is crucial for proficient data analysis, as it provides a streamlined approach to computing averages within datasets. The basic syntax of the function looks like this:

```sql
AVG([ALL|DISTINCT] expression)
    [OVER([partition_by_clause] order_by_clause)];
```

The function calculates the average value of the specified expression; arithmetically, AVG() divides the sum of the values in a set by the number of values in that set. Now, let us see what those arguments are about:

- ALL (default) – applies the function to all values.
- DISTINCT – operates on only one unique instance of each value, regardless of how many times that value occurs.
- expression – an expression of the exact numeric or approximate numeric data type categories, except for the bit data type. It cannot be an aggregate function or a subquery.
- OVER:
  - partition_by_clause – divides the result set of the FROM clause into partitions to which the function is applied.
  - order_by_clause – determines the logical order in which the operation is performed.

Practical uses of the AVG() function

Finding an average within an array of data is one of the most widely used operations in many fields of human activity, including:

- Statistical analysis: One of the primary uses of AVG() is in statistical analysis, since it allows you to calculate average values for numerical data, providing insights into central tendencies within datasets.
- Performance metrics: In business and finance, this function can be used to compute average performance metrics: sales, revenue per customer, transaction value, etc. These metrics help in evaluating overall business performance over time.
- Quality control: In manufacturing and production environments, AVG() can be employed to monitor and maintain quality control standards. By calculating average measurements or defect counts, organizations can identify trends and deviations from desired norms, facilitating timely corrective actions.
- Resource planning: This function can be useful in resource planning. For example, workforce managers can compute the average employee workload or average response times in customer service, aiding staffing decisions and capacity planning.
- Market analysis: In marketing and market research, AVG() helps analyze consumer behavior and market trends. It can calculate average customer spending, order value, or website visit duration, providing valuable insights for strategic decision-making and marketing campaigns.
- Educational assessment: In educational institutions, the function computes average grades, scores, or test results, aiding teachers and administrators in assessing student performance and identifying areas for improvement.

Try it yourself with dbForge Studio

Even though SQL Server Management Studio (SSMS) is the most popular and familiar tool for working with SQL Server databases, it is not the only one. Moreover, in the continuously evolving world of database development, administration, and management, new GUIs keep appearing like mushrooms after the rain. How do you choose the tool that is perfect for you among this variety?
Let us compare dbForge Studio for SQL Server with SSMS so that you can make an informed decision on which solution best aligns with your daily requirements:

| Feature | dbForge Studio for SQL Server | SQL Server Management Studio |
| --- | --- | --- |
| User-friendly interface | Boasts an intuitive and user-friendly interface, providing a smooth user experience for both beginners and experienced developers. | While powerful, SSMS can have a steeper learning curve, particularly for those new to SQL Server tasks. |
| Advanced functionality | Offers a wide range of advanced features, including a visual query builder, data and schema comparison tools, and advanced SQL editing capabilities. | Provides essential functionalities but may lack some of the advanced features available in dbForge Studio. |
| Integrated tools | Comes with integrated tools for schema and data comparison, enabling seamless data synchronization and database management out of the box. | While offering basic tools, SSMS may require additional add-ons for certain advanced functionalities. |
| Data generation | Provides a powerful data generation tool that enables the creation of realistic test data with customizable parameters, offering flexibility in data generation for specific tables and columns. | Incorporates fundamental data generation features but may necessitate additional scripts or tools for advanced and specific data generation requirements. |
| Cross-platform support | Supports Windows, macOS, and Linux, providing flexibility for users on different operating systems. | Is primarily designed for Windows, limiting its accessibility for macOS users. |

Take advantage of dbForge Studio for SQL Server by [downloading a free fully-functional 30-day trial version](https://www.devart.com/dbforge/sql/studio/download.html) and [installing it on your computer](https://docs.devart.com/studio-for-sql-server/getting-started/installing.html).
With a huge pack of advanced features and an intuitive GUI, this all-in-one MSSQL tool can maximize productivity and make SQL Server database development, administration, and management more efficient. The Studio is also of use for today's topic, from generating test data to performing advanced average calculations. For a more visual comparison of the two solutions, watch the [SSMS vs. dbForge Studio for SQL Server – Features Comparison](https://www.youtube.com/watch?v=UiVxy83826Y) video on the Devart YouTube channel.

Data Generator for SQL Server

To properly demonstrate the behavior of the AVG() function, we need a test database. Therefore, we are going to create one and fill it with realistic data using Data Generator for SQL Server. You can use this tool as part of dbForge Studio or [download it as a separate solution](https://www.devart.com/dbforge/sql/data-generator/download.html). Moreover, Data Generator comes with [a free add-in for SQL Server Management Studio](https://docs.devart.com/data-generator-for-sql-server/generating-data/setting-databases-in-ssms.html) that allows you to quickly populate your databases with meaningful test data right from the Management Studio Object Explorer.

Let us say we have created an empty BicycleStore database. The screenshot below demonstrates the structure of the database, including tables, columns, relationships, data types, foreign keys, etc. Now, it is time to populate the database with test data:

1. In the Tools menu, click New Data Generation. The Data Generator wizard will open.
2. Specify the connection and select the BicycleStore database.
3. Click Next. The Options page will appear. Set the required options here.
4. Click Open. After processing, you will be presented with the data generation result. You can specify the tables that you want to populate by selecting the check box next to each table name.
Further, you can define how you want the data to be generated: click the table name in the tree view and specify the details in the settings pane. All changes are displayed in real time.

5. On the Data Generator toolbar, click Populate data to the target database. The Data Population Wizard will open.
6. On the Output page, select how to manage the data population script:
   - Open the data population script in the internal editor.
   - Save the script to a file.
   - Execute the data population script against the database.
   Select the required option and click Next.
7. On the Options page, configure the synchronization options. Click Next.
8. On the Additional Scripts page, type or select the scripts to be executed before and/or after the data population. Click Next.
9. The Summary page allows you to see the details of any error or warning. When you are setting up the tables and columns to populate, dbForge Studio displays warning and error messages to inform you of possible problems with the data generation.
10. Click Generate to finish the process.

Hands-on examples

Now that we have covered the theoretical part of today's agenda and prepared the testing ground for our experiments, we can finally get our hands on the AVG() function.

Example 1

For starters, we chose the simplest example:

```sql
SELECT
  AVG(TotalAmount) AS AverageTotal
FROM OrderDetails;
```

Here, the query calculates the average total that customers spent on an order.

Example 2

The second example not only showcases the application of the AVG() function but also adds the complexity introduced by the GROUP BY clause:

```sql
SELECT
  CustomerID
 ,AVG(UnitPrice) AS AveragePrice
FROM Orders
GROUP BY CustomerID;
```

This query calculates the average price of the individual products within each customer's orders, grouped by the unique CustomerID.
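If you want to experiment with these semantics without a SQL Server instance at hand, the basics of AVG() can be reproduced with Python's built-in sqlite3 module. The sketch below uses a made-up Orders table (our assumption, not the article's BicycleStore data) to show how ALL (the default) and DISTINCT differ, and how GROUP BY produces one average per group:

```python
import sqlite3

# In-memory toy table standing in for the article's Orders table
# (the data below is made up purely for illustration).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Orders (CustomerID INTEGER, UnitPrice REAL)")
con.executemany(
    "INSERT INTO Orders VALUES (?, ?)",
    [(1, 10.0), (1, 10.0), (1, 40.0), (2, 20.0)],
)

# ALL (the default) keeps duplicates; DISTINCT collapses them first.
avg_all = con.execute(
    "SELECT AVG(UnitPrice) FROM Orders WHERE CustomerID = 1"
).fetchone()[0]
avg_distinct = con.execute(
    "SELECT AVG(DISTINCT UnitPrice) FROM Orders WHERE CustomerID = 1"
).fetchone()[0]
print(avg_all)       # (10 + 10 + 40) / 3 = 20.0
print(avg_distinct)  # (10 + 40) / 2 = 25.0

# The GROUP BY query from Example 2: one average per customer.
rows = sorted(con.execute(
    "SELECT CustomerID, AVG(UnitPrice) FROM Orders GROUP BY CustomerID"
).fetchall())
print(rows)  # [(1, 20.0), (2, 20.0)]
```

For these simple cases, the ALL/DISTINCT and GROUP BY behavior shown here matches what SQL Server's AVG() returns.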
Advanced techniques with AVG()

In this section of the article, let us challenge ourselves with slightly more complex examples of AVG() usage:

- How to incorporate AVG() with subqueries
- How to handle NULL values

Incorporating AVG() with subqueries

Advanced analysis often requires combining AVG() with subqueries to derive insights from complex datasets. Subqueries can be used to filter data, perform calculations, or aggregate results before computing the average. For example, you might use a subquery to calculate the average sales per region, product category, or customer segment. This allows for more granular analysis and facilitates the identification of trends and patterns within specific subsets of data.

```sql
SELECT
  FirstName + ' ' + LastName AS CustomerName
 ,AVG(QuantityOrdered) AS AverageQuantity
FROM Customers c
INNER JOIN BicycleStore.dbo.Orders o
  ON c.CustomerID = o.CustomerID
INNER JOIN BicycleStore.dbo.OrderDetails od
  ON o.OrderID = od.OrderID
GROUP BY c.FirstName
        ,c.LastName;
```

This query retrieves data from the Customers, Orders, and OrderDetails tables to calculate the average quantity of goods ordered by each customer:

- FirstName + ' ' + LastName AS CustomerName: the first and last name of each customer, concatenated to form their full name.
- AVG(QuantityOrdered) AS AverageQuantity: the average quantity ordered by each customer.
- FROM Customers c: the source table, aliased as c.
- INNER JOIN BicycleStore.dbo.Orders o ON c.CustomerID = o.CustomerID: the Customers table, joined with Orders on the CustomerID column.
- INNER JOIN BicycleStore.dbo.OrderDetails od ON o.OrderID = od.OrderID: the Orders table, joined with OrderDetails on the OrderID column.
- GROUP BY: groups the results by each customer's first and last name.
- AVG(): calculates the average quantity ordered for each group of customers with the same first and last name.
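Strictly speaking, the query above combines AVG() with joins; a literal subquery variant averages over a derived table instead. The sketch below (Python's sqlite3 with made-up data, since no real BicycleStore database is at hand) averages per-customer order totals computed by a subquery in the FROM clause; the same pattern works unchanged in SQL Server:

```python
import sqlite3

# Minimal stand-in for the article's OrderDetails table (hypothetical data).
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE OrderDetails (OrderID INTEGER, CustomerID INTEGER, TotalAmount REAL)"
)
con.executemany(
    "INSERT INTO OrderDetails VALUES (?, ?, ?)",
    [(1, 1, 100.0), (2, 1, 300.0), (3, 2, 50.0)],
)

# The derived table (subquery) first aggregates per customer,
# then the outer AVG() averages those per-customer totals.
avg_of_totals = con.execute(
    """
    SELECT AVG(CustomerTotal)
    FROM (
        SELECT CustomerID, SUM(TotalAmount) AS CustomerTotal
        FROM OrderDetails
        GROUP BY CustomerID
    ) AS t
    """
).fetchone()[0]
print(avg_of_totals)  # (400 + 50) / 2 = 225.0
```

Note the two-step logic: a plain AVG(TotalAmount) over all rows would give 150.0 here, while averaging the pre-aggregated per-customer totals gives 225.0.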
Handling NULL values in average calculations

NULL values can cause trouble when calculating averages, as they are excluded from the calculation by default. It is important to handle them appropriately to ensure accurate results. To do so, you can use the COALESCE() function to replace NULL values with a specified default value:

```sql
SELECT
  AVG(COALESCE(UnitPrice, 0)) AS AveragePriceNullHandling
FROM Products;
```

- AVG(COALESCE(UnitPrice, 0)): calculates the average unit price of products. The COALESCE() function handles NULL values: if the UnitPrice column contains NULLs, they are replaced with 0 before the average is calculated.
- AS AveragePriceNullHandling: assigns the AveragePriceNullHandling alias to the result of the AVG() function.
- FROM Products: specifies the source table from which the data is retrieved.

Further learning

- [SQL Server Tutorials](https://blog.devart.com/sql-server-tutorial)
- [dbForge Studio Documentation](https://docs.devart.com/studio-for-sql-server/)
- [dbForge Studio Video Tutorials](https://www.youtube.com/playlist?list=PLpO6-HKL9JxXSZgO3L0MxOTt3QxpFbJNt)
- [Devart Academy](https://www.devart.com/academy/)

Conclusion

The AVG() function serves as a powerful tool in data analysis, allowing users to calculate averages efficiently across diverse datasets. Its practical applications span various industries, from statistical analysis to resource planning and market research. Through hands-on examples and exploration using tools like dbForge Studio, you can gain a deeper understanding of its functionality and potential. Moreover, dbForge Studio for SQL Server proves invaluable for average calculation tasks, offering a user-friendly interface and robust features. Feel free to [download its free 30-day trial version](https://www.devart.com/dbforge/sql/studio/download.html) to experience its capabilities firsthand and further enhance your data analysis skills.
[Nataly Smith](https://blog.devart.com/author/nataly-smith), dbForge Team
Awards and Endorsements Recap: Discover the Best dbForge Products of Q2 and Q3 2024

By [Victoria Shyrokova](https://blog.devart.com/author/victorias), October 3, 2024

Recognition is essential for the Devart team, especially in contests and nominations where the users get to vote. Such endorsements show us that our effort is acknowledged and highly appreciated by the community of developers, database administrators, and everyone involved in data management processes. Throughout the second and third quarters of 2024, the Devart team earned multiple awards for the dbForge product line, winning gold and silver in the DBTA Readers' Choice Awards 2024 nominations, as well as receiving badges from the Crozdesk, SourceForge, G2, and SoftwareSuggest listings. By being part of our user community, you have also contributed to these remarkable achievements, as they represent your votes and appreciation for our products.

DBTA Readers' Choice Awards 2024

Participation in the DBTA Readers' Choice Awards has become a tradition for the dbForge product line, since this contest considers the opinions of database developers and administrators who work daily with all kinds of databases, and the Devart team is open to feedback. This year, our [dbForge product line](https://www.devart.com/dbforge/), the go-to for streamlining development, automating tasks, and ensuring database stability, won the DBTA Best Database Development Solution 2024 nomination.
[dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), [Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), and [Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), along with [dbForge Edge](https://www.devart.com/dbforge/edge/), have also made their way to the top of the DBTA Best DBA Solution nomination, winning Silver for their ability to streamline database administration routines and simplify repetitive tasks. Last but not least was a hard-won Bronze in DBTA's Best Database Performance Solution nomination for [dbForge Studios for SQL Server](https://www.devart.com/dbforge/sql/studio/), [MySQL](https://www.devart.com/dbforge/mysql/studio/), [Oracle](https://www.devart.com/dbforge/oracle/studio/), and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), as well as for [dbForge Edge](https://www.devart.com/dbforge/edge/), which bundles the four Studios for the most popular DBMSs for the ultimate experience.

"Right now, most organizations we speak with are very focused on improving operational efficiency and productivity while laying the groundwork for AI capabilities. This is driving a lot of modernization efforts. To move the needle in these areas, how data is ingested, stored, processed, integrated, analyzed, and secured needs to be improved."

Tom Hogan Jr., Publisher of DBTA

The [dbForge](https://www.devart.com/dbforge/) product line aims to solve most database-related data management tasks, providing a powerful toolset that accelerates workflows and boosts productivity. dbForge products come in different variations, from the dbForge Edge solution to standalone IDEs and tools for specific tasks that speed up coding with IntelliSense-like suggestions, query profiling tools, database design functionality, tools for easy migration, and solutions that streamline CI/CD. Having such an asset as you work with a database can significantly assist you in your routines.
Check out the [Winners' Circle column](https://www.dbta.com/Editorial/Actions/Winners-Circle-Devart-165384.aspx) by Oleksii Honcharov, Head of Engineering at Devart, to learn more about what has inspired the Devart team to build dbForge solutions and what's coming in the future.

Crozdesk Recognition

Products with high user satisfaction, extensive features, and a strong market presence earn badges on Crozdesk to mark their achievements. These badges make it easier for potential customers to find the best solutions for their projects, which is why such recognition is so important. Thanks to the community of users actively voting for dbForge products, the Devart team is proud to announce that [dbForge Edge](https://www.devart.com/dbforge/edge/), as well as [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), [Oracle](https://www.devart.com/dbforge/oracle/studio/), and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), along with [dbForge Compare Bundle for MySQL](https://www.devart.com/dbforge/mysql/compare-bundle/) and [Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/), were highlighted with the Happiest Users, Trusted Vendors, and Quality Choice badges. These endorsements spotlight highly rated products that show extensive market presence and set themselves apart from the competition in many ways.

G2 Achievements

[dbForge products for SQL Server](https://www.devart.com/dbforge/sql/) have gained momentum on the G2 platform, where they were recognized for high performance, momentum leadership, easy implementation, and best estimated ROI.
Among the products that received the most recognition are [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , a powerhouse IDE for SQL Server to boost development, manage databases, analyze data, streamline collaboration, and facilitate DevOps tasks, [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) for accelerated code completion, and [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) for SQL Server development. In the second and third quarters, these tools have earned multiple awards from G2. As a comprehensive toolset for SQL development, database design, data management, and administration, [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) has become a key part of the workflow for MySQL specialists. It’s a pleasure for the Devart team to see it widely recognized and awarded with badges as a high-performer that provides best-in-class support and offers the best experience in setup and usability. Throughout the second and third quarters of 2024, we were honored to receive 18 G2 awards for the [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) . [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) has become a guiding star for many developers, managers, and administrators working with Oracle databases. This IDE significantly increases PL/SQL coding speed and provides versatile data editing tools for managing in-database and external data. It has been highly appreciated by G2 reviewers as well, winning badges as a high performer and badges for being easy to do business with. SourceForge Badge SourceForge is a trusted platform where buyers discover, compare, and review business software and IT services. Top-performing listing items receive badges that highlight their value and set them apart from the competition. 
This summer, many Devart products were recognized by SourceForge and Slashdot Media, which operates Sourceforge listings and boasts a wide community of technology experts. Among the items that received the Customers Love Us badge are the [dbForge Edge](https://www.devart.com/dbforge/edge/) ultimate solution, [dbForge Studios for MySQL](https://www.devart.com/dbforge/mysql/studio/) , [Oracle](https://www.devart.com/dbforge/oracle/studio/) , and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) , and numerous products that boost one’s work with SQL Server, such as [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , and [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) . Moreover, the small SQL Server tools aimed at specific tasks, like [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) , [dbForge Search](https://www.devart.com/dbforge/sql/search/) , [Unit Test](https://www.devart.com/dbforge/sql/unit-test/) , [Data Compare](https://www.devart.com/dbforge/sql/datacompare/) , [Data Generator](https://www.devart.com/dbforge/sql/data-generator/) , [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) , [DevOps Automation tool](https://www.devart.com/dbforge/sql/database-devops/) , [Documenter](https://www.devart.com/dbforge/sql/documenter/) , [Event Profiler](https://www.devart.com/dbforge/sql/event-profiler/) , [Index Manager](https://www.devart.com/dbforge/sql/index-manager/) , [Monitor](https://www.devart.com/dbforge/sql/monitor/) , [Query Builder](https://www.devart.com/dbforge/sql/querybuilder/) , and [SQL Decryptor](https://www.devart.com/dbforge/sql/sqldecryptor/) , also received this badge. SoftwareSuggest Rankings SoftwareSuggest is another popular platform that’s widely used by professionals to choose the best tools for their work. 
The Devart team is excited to see our products highly ranked there, as this eventually helps more people who need our expertise and solutions find us. Overall, our dbForge database tools have scored a total of 26 new badges in the second and third quarters of 2024. Let's take a look at these impressive achievements. Numerous tools and utilities for SQL Server got 13 new badges from SoftwareSuggest. Among them were [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/), [Data Compare](https://www.devart.com/dbforge/sql/datacompare/), [Data Generator](https://www.devart.com/dbforge/sql/data-generator/), [Query Builder](https://www.devart.com/dbforge/sql/querybuilder/), and [Search for SQL Server](https://www.devart.com/dbforge/sql/search/), which are recognized as highly recommended, top-trending, and top-rated tools in the niche. Also, products such as [dbForge Index Manager](https://www.devart.com/dbforge/sql/index-manager/), [Documenter](https://www.devart.com/dbforge/sql/documenter/), [Data Pump](https://www.devart.com/dbforge/sql/data-pump/), [Event Profiler](https://www.devart.com/dbforge/sql/event-profiler/), [Decryptor](https://www.devart.com/dbforge/sql/sqldecryptor/), [Monitor](https://www.devart.com/dbforge/sql/monitor/), and [Transaction Log](https://www.devart.com/dbforge/sql/transaction-log/) for SQL Server have received endorsements for top customer support, highest satisfaction, and easy implementation.
Moreover, this year, our [dbForge Edge](https://www.devart.com/dbforge/edge/) , [Data Compare](https://www.devart.com/dbforge/mysql/datacompare/) , and [Schema Compare](https://www.devart.com/dbforge/mysql/schemacompare/) tools for MySQL, as well as [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) and [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) solutions, received six new endorsements from this listings portal. SoftwareSuggest has marked them for best customer support, leadership, accelerated growth, and top ratings. The [dbForge products for Oracle](https://www.devart.com/dbforge/oracle/) and [PostgreSQL](https://www.devart.com/dbforge/postgresql/) have also received numerous endorsements and have won 8 new badges for high performance, leading position in the category, top usability, and best customer support. Among the products for Oracle that were caught in the spotlight were [dbForge Compare Bundle](https://www.devart.com/dbforge/oracle/compare-bundle/) , [Schema Compare](https://www.devart.com/dbforge/oracle/schemacompare/) , [Data Compare](https://www.devart.com/dbforge/oracle/datacompare/) , and [Data Generator](https://www.devart.com/dbforge/oracle/data-generator/) , as well as the [Documenter](https://www.devart.com/dbforge/oracle/documenter/) tool. [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) , as well as [Schema Compare](https://www.devart.com/dbforge/postgresql/schemacompare/) and [Data Compare](https://www.devart.com/dbforge/postgresql/datacompare/) tools for PostgreSQL DBMS, were also highlighted in the SoftwareSuggest listings. Wrapping Up The support and enthusiasm from our community fuel our drive to keep innovating and adding new tools and key features in future updates, as we know that our work gets so much appreciation from the users. 
Today, we want to thank everyone who uses [dbForge tools](https://www.devart.com/dbforge/) for their tremendous support, and we look forward to your ongoing feedback. If you haven't shared your review on any of the platforms featured above, please feel free to do so. Your ratings not only assist others who face similar challenges but also help us improve our products further.

[Victoria Shyrokova](https://blog.devart.com/author/victorias): I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
Azure Data Studio vs dbForge SQL Complete: Which One’s Best at Code Completion?

By [dbForge Team](https://blog.devart.com/author/dbforge), February 19, 2023

With the right SQL code completion tools by your side, you can become approximately 2 to 4 times more productive in your daily coding routine, as evidenced by our clients in [success stories](https://www.devart.com/success-story/). Empowered with context-sensitive suggestions, advanced code formatting, and productivity boosters, you can streamline monotonous operations and work every day with a far sharper focus on the things that require your attention most. After all, you want to work smart and make it a pleasure, don't you? That is why we are here. Today we'll be comparing the code completion features of two major solutions for SQL developers—Devart's dbForge SQL Complete and Microsoft's very own Azure Data Studio—and see which one gains the upper hand when it comes to accelerating database development and making the user more effective. Let's start with the general information and capabilities of both tools.

A quick overview of SQL Complete

[dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is a high-end add-in that seamlessly integrates into SQL Server Management Studio and Visual Studio; both rank among the top Microsoft IDEs, and the latter, sure enough, ventures far beyond SQL. However functional these IDEs may be, SQL Complete substantially enhances them with IntelliSense-style suggestions, instant statement expansion, rich formatting capabilities, predefined and custom code snippets, as well as safe refactoring with auto-correction of references to the objects you rename.
Among other things, we should mention a set of built-in data aggregation and manipulation tools and a T-SQL Debugger for complex queries, stored procedures, triggers, and functions.

A quick overview of Azure Data Studio

Azure Data Studio is a cross-platform database IDE that delivers a solid SQL editor with IntelliSense completion, smart code snippets, version control integration, and a built-in terminal. You also get a few other goodies, such as customizable server and database dashboards, but overall, Azure Data Studio is not about in-depth administration or server configuration. Generally, Microsoft recommends Azure Data Studio if your work is mostly about editing or executing queries, if you need the ability to quickly chart and visualize result sets, and if you are comfortable working with the command line. This is why it makes sense to compare it with SQL Complete: both solutions focus on fast and efficient query writing, and both support the CLI. Now we only have to find out which one does it all better.

Code completion comparison: dbForge SQL Complete vs Azure Data Studio

For your convenience, we have divided all features into three categories: SQL code completion (obviously the biggest one), SQL code formatting, and productivity enhancements.
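To make the formatting category concrete, here is a toy sketch of one thing automated SQL formatting does: normalizing keyword case. This is a deliberately simplified illustration, not the algorithm either product actually uses:

```python
import re

# A handful of keywords to normalize; real formatters cover the full SQL grammar,
# plus indentation, line wrapping, comments, and string literals.
KEYWORDS = {"select", "from", "where", "join", "on", "and", "or", "as", "order", "by"}

def normalize_keywords(sql: str) -> str:
    """Upper-case bare SQL keywords, leaving identifiers untouched.
    (This toy version ignores quoted strings and comments.)"""
    def repl(match):
        word = match.group(0)
        return word.upper() if word.lower() in KEYWORDS else word
    return re.sub(r"[A-Za-z_]+", repl, sql)

print(normalize_keywords("select id, name from users where id = 1"))
# SELECT id, name FROM users WHERE id = 1
```

Real formatting engines also apply configurable profiles (indent style, comma placement, and so on), which is what the formatting-profile entries in the comparison refer to.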
| Feature | dbForge SQL Complete | Azure Data Studio |
|---|---|---|
| **SQL code completion** | | |
| Context-sensitive suggestion of keywords | Yes | Yes, but not context-sensitive |
| Context-sensitive object suggestions | Yes | Yes, but not context-sensitive |
| Context-sensitive object suggestions for CTE | Yes | Yes |
| Context-sensitive object suggestions in the SQLCMD mode | Yes | No |
| Name suggestions for objects on linked servers | Yes | No |
| Sorting of suggested keywords by relevance | Yes | No |
| JOIN clause auto-generation | Yes | No |
| Phrase completion | Yes | No |
| Auto-generation of table aliases | Yes | No |
| Column picker for quick list building | Yes | No |
| Wildcard expansion | Yes | Yes |
| Expansion of INSERT, EXEC, ALTER, and UPDATE statements | Yes | No |
| Exclusion of databases from suggestions | Yes | No |
| Highlighting of identifier occurrences | Yes | Yes |
| Pair highlighting | Yes | No |
| Highlighting of matching columns in INSERT statements | Yes | No |
| Named regions | Yes | No |
| Parameter information for functions | Yes | No |
| Quick object information | Yes | Yes |
| Row count information | Yes | No |
| **SQL code formatting** | | |
| SQL formatting | Yes | Yes |
| Formatting in files and directories | Yes | No |
| Quick selection of formatting profiles | Yes | Yes |
| Automated formatting from the command line | Yes | No |
| **Productivity enhancements** | | |
| SQL snippets | Yes | Yes |
| Semi-transparent suggestion box | Yes | No |
| Current statement execution option | Yes | Yes |
| Semicolon insertion | Yes | No |
| Generation of CREATE/ALTER scripts for server objects | Yes | Yes |
| Copy Data As from the grid to XML, CSV, HTML, JSON, Excel | Yes | Yes |
| Go to Definition for database objects | Yes | Yes |
| Recovery of recently closed documents | Yes | No |
| T-SQL Analyzer | Yes | No |
| Auto-suggesting non-aggregated columns for the GROUP BY clause | Yes | No |
| Find Invalid Objects CLI | Yes | No |
| Command Line Wizard | Yes | No |
| **Releases** | | |
| First release | v1.0 (November 19, 2010) | v1.0 (September 24, 2018) |
| Latest release (at the time of publication) | v7.0 (Sep 5, 2024) | v1.49 (Aug 15, 2024) |
| Total number of releases | 141 | 80 |

As you can see, [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is the clear winner here.
It extends SSMS so substantially that Azure Data Studio is unlikely to catch up in this respect any time soon.

Conclusion

That said, if SSMS works just fine for you and you want a powerful boost to your daily SQL coding, you don't really need Azure Data Studio; the combination of SSMS and SQL Complete is your best bet. What's more, besides the completion, formatting, and productivity features mentioned in the comparison above, SQL Complete delivers quite a few other things that you will most certainly find useful in your daily work:

- [Smart renaming of database objects and variables](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html#database_identifier)
- [Alias refactoring and custom alias mapping](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html#rename_aliases)
- [Search for invalid objects](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html#find_invalid_objects)
- [Versatile data management in the SSMS results grid](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#generate_script_as)
- [Data visualization](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#data_visualizers)
- [Query execution warnings and notifications](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#execution_warnings)
- [Transaction reminders](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#transaction_reminder)
- [Query execution history](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#executed_sql_statements_history)
- [Custom SSMS window title](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#custom_ssms_main_window_title)
- [Tab coloring](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#tabs_coloring)
- [Document Outline window for simplified navigation](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#document_outline_window)

Try it all free of charge: [download SQL Complete for a free 14-day trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and make your daily work with SQL a breeze.
---

Azure Data Studio vs SSMS: Who's the Winner?
By [dbForge Team](https://blog.devart.com/author/dbforge) | September 22, 2020

For 15 years, SSMS has held the title of the top SQL Server database tool. That's no easy feat, especially in the software world, where things can change faster than you can say "blueberry pie." But then a contender emerged, developed and released by the same Microsoft folks in 2018. It was Azure Data Studio: obviously less mature and weaker in terms of database administration, yet very promising and eager to win its audience. Today, we're going to put them face to face and find out which one is better. If the title of this post caught your attention, you are most likely quite familiar with SSMS. But should you abandon it in favor of Azure Data Studio? Well, it depends on how you are going to use it and on the particular set of features you need for your tasks. That said, it's up to you to determine the winner; we'll only help you with a detailed comparison of features based on our recent research.

Scoring the Strengths of Each Opponent

Since the fight (the research in question) has already been completed, we can proceed straight to evaluation.
And while SSMS is still marketed as the primary tool, the neat features introduced in Azure Data Studio (ADS) may look quite compelling. Let's review them.

| Feature | Azure Data Studio v1.17.1 | SSMS v18 | Comments |
|---|---|---|---|
| **General** | | | |
| Azure Sign-In | Yes | Yes | Support for connection via Azure. It works identically in both tools, but ADS has broader connection capabilities, with fields like Attestation Protocol, Persist Security Info, etc. |
| Dashboard | Yes | No | A very illustrative representation of databases, their sizes, and statuses. |
| Extensions | Yes | Yes* | Work with compatible third-party extensions. Beyond them, add-ins can also be used (e.g. [dbForge SSMS tools & add-ins](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/)). |
| Integrated Terminal | Yes | No | A built-in terminal that works with the command line and supports numerous standards, including PowerShell commands. |
| Object Explorer | Yes | Yes | Shares the same principles in both tools, but the number of supported objects is more limited in ADS. |
| Object Scripting | Yes | Yes | Available in ADS as an extension; requires installation. |
| Project System | Yes | No | Available in ADS as an extension; requires installation. |
| Select from Table | Yes | Yes | A context menu command in Object Explorer that executes a selection from the table. |
| Source Code Control | Yes | Yes* | ADS has a built-in Git source control manager, though a separate Git installation is required. The [dbForge Source Control add-in](https://www.devart.com/dbforge/sql/source-control/) can also be installed for work with Git. |
| Task Pane | Yes | No | A panel that displays all performed tasks (event log). |
| Theming | Yes | No | Allows setting a preferred theme in ADS. In SSMS, Dark Mode can be enabled, but it remains unavailable in the general settings of the release version. |
| Dark Mode | Yes | No | Officially unavailable in SSMS and does not work correctly with Object Explorer, though it can be set up manually with a bit of effort. There is still a chance it will soon be officially introduced in SSMS. |
| Azure Resource Explorer | Preview | No | Available as an extension; requires installation. Allows setting up and managing Azure connections. |
| Generate Scripts Wizard | No | Yes | A wizard that helps generate database scripts. |
| Import/Export DACPAC | No | Yes | A set of components for importing/exporting DACPACs. |
| Object Properties | No | Yes | A window with object properties. |
| Table Designer | No | Yes | A typical editor where tables can be created, modified, or deleted. |
| [dbForge Add-Ins](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/) | No | Yes | A host of useful add-ins that eliminate functional gaps and increase development productivity. dbForge add-ins can be integrated with SSMS only. |
| **Query Editor** | | | |
| Chart Viewer | Yes | No | Graphical representation of query results. |
| Export Results to CSV, JSON, XLSX | Yes | No | The results of an executed query can be saved in CSV, JSON, and XLSX formats via the corresponding buttons in the data editor. |
| IntelliSense | Yes | Yes | ADS has far better IntelliSense than SSMS; still, both pale in comparison with SQL Complete. As an alternative to the native IntelliSense, the [dbForge SQL Complete add-in](https://www.devart.com/dbforge/sql/sqlcomplete/) offers smart code suggestions, formatting, and other productivity-enhancing features. |
| Snippets | Yes | Yes | Allows using code snippets; the functionality is similar in both tools. |
| Show Plan | Preview | Yes | Available in ADS as an extension; requires installation. |
| Client Statistics | No | Yes | Displays performance statistics in a table. |
| Live Query Stats | No | Yes | Allows checking the execution plan of an active query in real time. |
| Query Options | No | Yes | Allows setting up query execution options. |
| Results to File | No | Yes | Allows exporting query execution results to a file. |
| Results to Text | No | Yes | Allows exporting query execution results as text. |
| Spatial Viewer | No | Yes | The Spatial Results window in Query Editor provides visual mapping tools for viewing spatial data results (in addition to the grid in the Results window). |
| SQLCMD | No | Yes | Allows executing SQLCMD commands and scripts in the SSMS query editor. |
| Notebooks | Yes | No | Built-in support for Jupyter Notebook, an open-source app for creating and sharing documents containing text, code, images, and query results. |
| Save Query as Snippet | Yes | No | Allows creating a snippet out of code in a SQL document. |
| **Operating system support** | | | |
| Linux | Yes | No | Installation and launch on Linux. |
| macOS | Yes | No | Installation and launch on macOS. |
| Windows | Yes | Yes | Installation and launch on Windows. |
| **Data engineering** | | | |
| Create External Table Wizard | Yes | No | Allows using Oracle databases as sources of data. |
| HDFS Integration | Yes | No | Connection to Big Data Clusters. |
| **Database administration** | | | |
| Backup / Restore | Yes | Yes | Built-in backup and restore features. |
| Big Data Cluster Support | Yes | No | Support for connecting to SQL Server Big Data Clusters, which allow deploying scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes. |
| Flat File Import | Preview | Yes | Allows copying data from a flat file (.csv, .txt) to a new table in a database. |
| SQL Agent | Preview | Yes | A SQL Server service that enables scheduled launch of SQL scripts. |
| SQL Profiler | Preview | Yes | An interface that helps create and manage traces, as well as analyze and replay trace results. |
| Always On | No | Yes | Work with Always On. |
| Always Encrypted | No | Yes | Work with Always Encrypted. |
| Copy Data Wizard | No | Yes | SQL Server data import and export wizard. |
| Database Engine Tuning Advisor | No | Yes | Helps analyze required indexes, statistics, partitioning strategy, and physical design structure for performance improvement. |
| Error Log Viewer | No | Yes | Allows viewing the error log. |
| Maintenance Plans | No | Yes | Provides a wizard for creating maintenance plans. |
| Multi-Server Query | No | Yes | Allows running queries on multiple servers at once. |
| Policy-Based Management | No | Yes | Allows building policies for servers and running regular scheduled checks. |
| PolyBase | No | Yes | PolyBase configuration feature. |
| Query Store | No | Yes | A set of user interfaces for configuring Query Store and consuming collected workload data. |
| Registered Servers | No | Yes | Allows storing server connection information for future connections. |
| Replication | No | Yes | Replication configuration feature. |
| Security Management | No | Yes | User account and permission management. |
| Service Broker | No | Yes | Service Broker configuration feature. |
| SQL Mail | No | Yes | Configuration of SQL Server Agent to send notifications and alerts using Database Mail. |
| Template Explorer | No | Yes | A built-in explorer for code templates. To extend snippet and template capabilities in SSMS, one can install the [dbForge SQL Complete add-in](https://www.devart.com/dbforge/sql/sqlcomplete/). |
| Vulnerability Assessment | No | Yes | A tool that helps discover, track, and remediate potential database vulnerabilities. |
| XEvent Management | No | Yes | Displays a live viewer window with extended events. |
| SQL Assessment API Integration | No | Yes | A mechanism that helps evaluate the configuration of a SQL Server. The API is delivered with a ruleset containing best practices suggested by the SQL Server team. |

Azure Data Studio vs SQL Server Management Studio: The Verdict

Currently, the champion is clearly far stronger than the contender and isn't going to retire anytime soon. Still, Azure Data Studio is not without its key advantages, such as availability on multiple platforms, better IntelliSense, a built-in source control manager, notebooks, and a focus on queries, which may prove vital enough for you to favor it. Here's a brief recap with the crucial points to help you make your choice.

SSMS is your winner if…

- Your work is all about database administration, and you need deep configuration tools.
- Security management is a necessity: you need to configure security features, manage users, and assess vulnerabilities.
- You require access to Registered Servers and control over SQL Server services on Windows.
- Reports for SQL Server Query Store are a must for you.
- You need to import/export DACPACs.
- You use third-party add-ins to cover missing functionality that you require (e.g. [dbForge SSMS tools and add-ins](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/)).

Azure Data Studio is your winner if…

- You need a cross-platform SQL editing solution, running on macOS or Linux, with a more sophisticated IntelliSense.
- You use a number of third-party extensions that you'd always like to have at hand.
- You don't need advanced administration features; most of your administration can be done with an integrated terminal.
- You are mostly focused on writing and executing queries, without much performance tuning, and you would also like customizable visualized result sets.
- The built-in Git source control manager, absent from SSMS, is a must for you.
- You value support for tools that let you conveniently share your queries, e.g. Jupyter Notebook.
- You need to query both SQL Server and PostgreSQL.
- You are fine with using a fast-evolving tool that is not as proven or well-documented as SSMS.

Of course, someday Azure Data Studio may grow strong enough to challenge SSMS on every level. But even today, it is viable as a more lightweight tool that simplifies a host of specific tasks. The rest depends on whether you focus on those tasks. That said, you've most likely already figured out your winner. We hope we made it easier for you.

Afterword

We'd like to end this post with a few words about our own involvement in the SSMS vs Azure Data Studio showdown.
As we mentioned before, our team has created quite a few [SSMS tools and add-ins](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/) that fill its functional gaps most effectively. Check them out to see how they can make your work faster and easier, boosting your development productivity by up to 53%. We also believe in the growth of Azure Data Studio, and we are considering creating a SQL Complete plugin for it. After hearing some people say that the absence of SQL Complete features was the only thing preventing them from using Azure Data Studio, we created a [special petition](https://devart.uservoice.com/forums/87893-sql-complete/suggestions/36603256-create-sql-complete-for-azure-data-studio) to see how many people really want to boost their ADS. If you're interested, please feel free to upvote. If we see enough interest, we'll be happy to build it faster for you.

---

Azure Database for MySQL: How to Connect and Migrate Databases with dbForge Studio for MySQL
By [dbForge Team](https://blog.devart.com/author/dbforge) | November 13, 2020

In this article, you'll learn how to connect to Azure Database for MySQL and discover three common approaches to migrating a database from MySQL to Azure using dbForge Studio for MySQL. The choice of an approach depends on the circumstances and project requirements.
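Before the GUI walkthrough, it helps to know what an Azure Database for MySQL connection expects. Here is a minimal sketch of assembling the connection parameters; the server name, user, and password are placeholder values, and `azure_mysql_params` is a hypothetical helper, not part of dbForge Studio:

```python
def azure_mysql_params(server: str, user: str, password: str, database: str) -> dict:
    """Assemble typical connection settings for an Azure Database for MySQL
    single-server instance. All values here are placeholders."""
    return {
        "host": f"{server}.mysql.database.azure.com",  # Azure MySQL host name pattern
        "user": f"{user}@{server}",  # single-server deployments expect user@servername
        "password": password,
        "database": database,
        "port": 3306,   # default MySQL port
        "ssl": True,    # Azure enforces TLS connections by default
    }

params = azure_mysql_params("mydemoserver", "myadmin", "secret", "sakila")
print(params["host"])  # mydemoserver.mysql.database.azure.com
```

These are the same values you would type into the Studio's New Connection dialog described below.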
Contents:

- Connect to Azure Database for MySQL using dbForge Studio for MySQL
- Migrate a database using the Backup and Restore functionality
- Migrate a database using the Copy Databases functionality
- Migrate a database using Schema and Data Compare tools

Prerequisites

To walk through the steps of this guide, you need to:

- Create an instance in Azure Database for MySQL
- Install [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) on your machine

Connect to Azure Database for MySQL using dbForge Studio for MySQL

To connect to Azure Database for MySQL using dbForge Studio for MySQL:

1. On the Database menu, click New Connection.
2. Provide a host name and login credentials.
3. Click the Test Connection button to check the configuration.

Migrate a database using the Backup and Restore functionality

The Studio allows migrating databases to Azure in a number of ways; the choice depends solely on your needs. If you need to move the entire database, it's best to use the Backup and Restore functionality. In this worked example, we will migrate the sakila database that resides on a MySQL server to Azure Database for MySQL. The idea behind this migration path is to create a backup of the MySQL database and then restore it in Azure Database for MySQL.

Step 1. Back up the database

1. On the Database menu, point to Backup and Restore, and then click Backup Database. The Database Backup Wizard will appear.
2. On the Backup content tab of the Database Backup Wizard, select the database objects you want to back up.
3. On the Options tab, configure the backup process to fit your requirements.
4. Specify error-processing behavior and logging options.
5. Click Backup.

Step 2. Restore the database

1. Connect to Azure Database for MySQL as described above.
2. Right-click in the Database Explorer, point to Backup and Restore, and then click Restore Database.
3. In the Database Restore Wizard that opens, select the file with the database backup.
4. Click Restore.

Migrate a database using the Copy Databases functionality

The Copy Databases functionality is somewhat similar to Backup and Restore, except that it doesn't take two steps to migrate a database. What's more, it allows transferring two or more databases in one go. Note that the Copy Databases functionality is only available in the Enterprise edition of dbForge Studio for MySQL. In this worked example, we will migrate the world_x database from a MySQL server to Azure Database for MySQL.

To migrate a database using the Copy Databases functionality:

1. On the Database menu, click Copy Databases.
2. In the Copy Databases tab that appears, specify the source and target connections and select the database(s) to be migrated. We enter the Azure MySQL connection and select the world_x database. Click the green arrow to initiate the process.
3. Check the result. As a result of our migration efforts, the world_x database has successfully appeared in Azure MySQL.

Migrate a database using Schema and Data Compare tools

dbForge Studio for MySQL incorporates tools for migrating MySQL databases, schemas, and/or data to Azure. The choice of functionality depends on your needs and the requirements of your project. If you need to move a database selectively, i.e. migrate only certain MySQL tables to Azure, it's best to use the Schema and Data Compare functionality. In this worked example, we will migrate the world database that resides on a MySQL server to Azure Database for MySQL. The idea is to create an empty database in Azure Database for MySQL and synchronize it with the required MySQL database, first using the Schema Compare tool and then the Data Compare tool. This way, MySQL schemas and data are accurately moved to Azure.

Step 1. Connect to Azure Database for MySQL and create an empty database

Step 2. Schema synchronization

1. On the Comparison menu, click New Schema Comparison. The New Schema Comparison Wizard appears.
2. Select the Source and the Target, then specify the schema comparison options. Click Compare.
3. In the comparison results grid that appears, select objects for synchronization. Click the green arrow button to open the Schema Synchronization Wizard.
4. Walk through the steps of the wizard to configure synchronization. Click Synchronize to deploy the changes.

Step 3. Data comparison

1. On the Comparison menu, click New Data Comparison. The New Data Comparison Wizard appears.
2. Select the Source and the Target, then specify the data comparison options and change mappings if necessary. Click Compare.
3. In the comparison results grid that appears, select objects for synchronization. Click the green arrow button to open the Data Synchronization Wizard.
4. Walk through the steps of the wizard to configure synchronization. Click Synchronize to deploy the changes.
5. Enjoy the result.

Summary

Nowadays, more and more businesses move their databases to Azure Database for MySQL, as this database service is easy to set up, manage, and scale. The migration doesn't need to be painful: dbForge Studio for MySQL provides migration tools that can significantly facilitate the process. The Studio allows database transfers to be easily configured, saved, edited, automated, and scheduled. More than that, dbForge Studio for MySQL supports [a bunch of database servers](https://www.devart.com/dbforge/mysql/studio/database-connections.html) to move your workloads from and to.
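For completeness, the Backup and Restore flow shown above also has a plain command-line analogue. The sketch below only assembles the equivalent `mysqldump` and `mysql` commands as strings; host names, user names, and file names are placeholders, and nothing is executed:

```python
def backup_command(host: str, user: str, database: str, dump_file: str) -> str:
    # mysqldump writes the database's schema and data as SQL statements to a file
    return f"mysqldump -h {host} -u {user} -p --databases {database} > {dump_file}"

def restore_command(host: str, user: str, database: str, dump_file: str) -> str:
    # feeding the dump back through the mysql client recreates the database
    return f"mysql -h {host} -u {user} -p {database} < {dump_file}"

print(backup_command("localhost", "root", "sakila", "sakila.sql"))
# mysqldump -h localhost -u root -p --databases sakila > sakila.sql
print(restore_command("mydemoserver.mysql.database.azure.com",
                      "myadmin@mydemoserver", "sakila", "sakila.sql"))
```

The GUI wizards add object selection, error handling, and logging on top of this basic dump-and-restore idea.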
Tags [Azure MySQL](https://blog.devart.com/tag/azure-mysql) [copy database](https://blog.devart.com/tag/copy-database) [data compare](https://blog.devart.com/tag/data-compare) [database backup](https://blog.devart.com/tag/database-backup) [database restore](https://blog.devart.com/tag/database-restore) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [migrate from MySQL to Azure](https://blog.devart.com/tag/migrate-from-mysql-to-azure) [Schema Compare](https://blog.devart.com/tag/schema-compare) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fazure-database-for-mysql-how-to-connect-and-migrate-databases-with-dbforge-studio-for-mysql.html) [Twitter](https://twitter.com/intent/tweet?text=Azure+Database+for+MySQL%3A+How+to+Connect+and+Migrate+Databases+with+dbForge+Studio+for+MySQL&url=https%3A%2F%2Fblog.devart.com%2Fazure-database-for-mysql-how-to-connect-and-migrate-databases-with-dbforge-studio-for-mysql.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/azure-database-for-mysql-how-to-connect-and-migrate-databases-with-dbforge-studio-for-mysql.html&title=Azure+Database+for+MySQL%3A+How+to+Connect+and+Migrate+Databases+with+dbForge+Studio+for+MySQL) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/azure-database-for-mysql-how-to-connect-and-migrate-databases-with-dbforge-studio-for-mysql.html&title=Azure+Database+for+MySQL%3A+How+to+Connect+and+Migrate+Databases+with+dbForge+Studio+for+MySQL) [Copy URL](https://blog.devart.com/azure-database-for-mysql-how-to-connect-and-migrate-databases-with-dbforge-studio-for-mysql.html) RELATED ARTICLES [How To](https://blog.devart.com/category/how-to) [Database Protection Guide: Best Practices for Ensuring Database Security](https://blog.devart.com/database-security.html) May 5, 2025 [How To](https://blog.devart.com/category/how-to) 
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) MySQL Backup Strategies: How to Backup All and Multiple Databases By [dbForge Team](https://blog.devart.com/author/dbforge) July 9, 2019

This article outlines two approaches to performing regular database backups: backing up the databases specified in a list, and bulk backing up all databases. Both procedures can be executed on Linux and Windows using [dbForge Studio for MySQL](http://devart.com/dbforge/mysql/studio/), which supports routine tasks through both the command prompt and PowerShell.

Contents
- Backup MySQL databases using mysqldump
  - Dump multiple MySQL databases
  - Dump all MySQL databases
- Backup databases using a GUI tool for MySQL through a PowerShell script
  - Backing up all MySQL databases
  - Backing up multiple databases from a list
  - Backing up databases by a mask
- Creating and using a BAT file in the command-line interface
- Conclusion

When you have only a few databases to manage, regular backups can be done easily and seamlessly, either with a few simple scripts or by configuring a tool that creates backups automatically. But sometimes the situation is more complicated.
When there are, for instance, hundreds of databases, backing up each one manually becomes a time-consuming task. So finding a solution that backs up all databases, or only the required ones, without affecting server performance is important for developers and DBAs.

Backup MySQL databases using mysqldump

Let us begin by figuring out how to use the mysqldump command-line utility. The tool generates SQL statements that reproduce a database's structure and content. It provides a straightforward and efficient means of creating backups, preserving data against unforeseen circumstances such as hardware failures or accidental deletions. Additionally, mysqldump is versatile: it can be customized to include or exclude specific databases or tables, providing flexibility in the backup process. Lastly, since the utility generates SQL scripts, backups are stored in a universally recognized format, making it easy to restore a database on a different MySQL installation.

Dump multiple MySQL databases

To dump multiple MySQL databases with mysqldump, specify each database in a single command. The exact command varies slightly depending on whether authentication is set up:

```bash
# Authentication disabled
mysqldump --databases database1 database2 database3 > my_backups.sql

# Authentication enabled
mysqldump -u username -p --databases database1 database2 database3 > my_backups.sql
```

Here, username should be replaced with your MySQL username; database1, database2, database3, etc., should be replaced with the actual names of the databases you wish to back up; and my_backups.sql is the name of the backup file, which you can change to whatever you prefer.

Dump all MySQL databases

The syntax does not differ much when it comes to backing up all databases.
The only thing you have to change is to use the --all-databases flag instead of listing database names:

```bash
# Authentication disabled
mysqldump --all-databases > my_backups.sql

# Authentication enabled
mysqldump -u username -p --all-databases > my_backups.sql
```

Note: when using the --all-databases flag, keep in mind that the internal mysql database is included as well.

Backup databases using a GUI tool for MySQL through a PowerShell script

In this section, we will automate backups with PowerShell scripts. Discover how you can optimize and simplify your backup creation procedures by harnessing the versatility and automation potential of PowerShell scripting.

Backing up all MySQL databases

To accomplish this task, we need to create a script. Open a plain text editor such as Notepad and type in the following code:

```powershell
Set-Location -Path "C:\Program Files\MySQL\MySQL Server 5.7\bin\" -PassThru

.\mysql.exe --host=localhost --user=root --password=root --skip-column-names --execute="SELECT s.SCHEMA_NAME FROM information_schema.SCHEMATA s WHERE s.SCHEMA_NAME NOT IN ('mysql', 'information_schema', 'sys', 'performance_schema')" | Out-File "D:\backup\all_databases_backup\PowerShell\alldatabases.txt"

foreach($DBname in Get-Content "D:\backup\all_databases_backup\PowerShell\alldatabases.txt")
{
    Write-Host $DBname
    &"C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com" /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:$DBname /outputfile:"D:\backup\all_databases_backup\PowerShell\all_DB_backup\$DBname.sql"
}
```

Where:
- C:\Program Files\MySQL\MySQL Server 5.7\bin\ – MySQL server path.
- D:\backup\all_databases_backup\PowerShell\all_DB_backup – the location on your computer where the output files are stored.
- C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com – dbForge Studio for MySQL path.

Assign your own values to the User Id, Password, Host, and Port parameters. Save the file with the *.ps1 file extension (for example, all_DB_backup.ps1).

Please note that we do not intend to explain all aspects of the command-line syntax within this article. For more information, refer to our [documentation center](https://docs.devart.com/studio-for-mysql/backup-and-restore-of-mysql-databases/backup-arguments-cmd.html).

Script execution: after the backup has completed successfully, a folder all_DB_backup with SQL files will be created.

Backing up multiple databases from a list

To accomplish this task, we need the following set of files:
- A TXT file named DB_list.txt with a list of databases to be backed up.
- A PS1 file with a script.

First, let's specify the databases we want to back up. Open a new Notepad document, make a list of databases, and save the file as DB_list.txt. Please note that the databases should be listed in a column without any delimiters, each database on a new line.

Then we need to create the script. Open a plain text editor such as Notepad and type in the following code:

```powershell
foreach($DBname in Get-Content "D:\backup\all_databases_backup\PowerShell\DB_list.txt")
{
    Write-Host $DBname
    &"C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com" /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:$DBname /outputfile:"D:\backup\DB_databases_backup\PowerShell\DB_list_backup\$DBname.sql"
}
```

Where:
- D:\backup\DB_databases_backup\PowerShell\DB_list_backup – the location on your computer where the output files are stored.
- C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com – dbForge Studio for MySQL path.

Assign your own values to the User Id, Password, Host, and Port parameters.
Save the file with the *.ps1 file extension (for example, DB_list_backup.ps1).

Script execution: after the backup has completed successfully, a folder DB_list_backup with SQL files will be created.

Backing up databases by a mask

To accomplish this task, we need to create a script. Open a plain text editor such as Notepad and type in the following code:

```powershell
Set-Location -Path "C:\Program Files\MySQL\MySQL Server 5.7\bin\" -PassThru

.\mysql.exe --host=localhost --user=root --password=root --skip-column-names --execute="SELECT s.SCHEMA_NAME FROM information_schema.SCHEMATA s WHERE s.SCHEMA_NAME NOT IN ('mysql', 'information_schema', 'sys', 'performance_schema') AND s.SCHEMA_NAME LIKE '%$args%'" | Out-File "D:\backup\all_databases_backup\PowerShell\DB_by_mask.txt"

foreach($DBname in Get-Content "D:\backup\all_databases_backup\PowerShell\DB_by_mask.txt")
{
    Write-Host $DBname
    &"C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com" /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:$DBname /outputfile:"D:\backup\all_databases_backup\PowerShell\DB_by_mask_backup\$DBname.sql"
}
```

Where:
- C:\Program Files\MySQL\MySQL Server 5.7\bin\ – MySQL server path.
- D:\backup\all_databases_backup\PowerShell\DB_by_mask_backup – the location on your computer where the output files are stored.
- C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com – dbForge Studio for MySQL path.

Assign your own values to the User Id, Password, Host, and Port parameters. Save the file with the *.ps1 file extension (for example, DB_by_mask_backup.ps1).

Script execution: run the script with an extra parameter, for example, DB_by_mask_backup.ps1 test_DB_name. After the backup has completed successfully, a folder DB_by_mask_backup with SQL files will be created.
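The three PowerShell scripts above share one pattern: enumerate schema names, filter out the system schemas (optionally narrowing by a mask), and launch one backup per remaining database. As a hedged illustration of that selection logic, here is a minimal Python sketch; the schema list is a hardcoded placeholder standing in for the information_schema query, and the paths and credentials are the demo values from the article, not something a real deployment should reuse.

```python
# Sketch of the select-then-back-up pattern used by the PowerShell scripts above.
# The schema list below stands in for the result of querying
# information_schema.SCHEMATA; against a live server you would fetch it
# with mysql.exe or a MySQL connector.

SYSTEM_SCHEMAS = {"mysql", "information_schema", "sys", "performance_schema"}

def pick_databases(all_schemas, mask=None):
    """Drop system schemas; if a mask is given, keep only names containing it
    (mirrors SCHEMA_NAME LIKE '%mask%')."""
    picked = [s for s in all_schemas if s not in SYSTEM_SCHEMAS]
    if mask:
        picked = [s for s in picked if mask in s]
    return picked

def backup_command(db, out_dir="D:\\backup\\all_DB_backup"):
    """Build the dbForge Studio command line launched for each database."""
    exe = r"C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com"
    conn = "User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8"
    return [exe, "/backup", f"/connection:{conn}",
            f"/database:{db}", f"/outputfile:{out_dir}\\{db}.sql"]

schemas = ["mysql", "sys", "sakila", "sakila_test", "bikestore"]
for db in pick_databases(schemas, mask="sakila"):
    print(backup_command(db)[3])   # prints /database:sakila, /database:sakila_test
```

Running each resulting command with subprocess.run (or letting Windows Task Scheduler launch a wrapper script) reproduces the behavior of the PowerShell loops.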
Creating and using a BAT file in the command-line interface

You can also create an executable BAT file to run the same tasks via the command-line interface. To create a BAT file, open a plain text editor such as Notepad and type in the following code:

```bat
SetLocal EnableExtensions EnableDelayedExpansion

Set Backup="C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com"

goto DB_backup%1

rem backup of all databases
:DB_backup
:DB_backup1

"C:\Program Files\MySQL\MySQL Server 5.7\bin\mysql.exe" --host=localhost --user=root --password=root --skip-column-names --execute="SELECT s.SCHEMA_NAME FROM information_schema.SCHEMATA s WHERE s.SCHEMA_NAME NOT IN ('mysql', 'information_schema', 'sys', 'performance_schema')" > "D:\backup\all_databases_backup\alldatabases.txt"

FOR /F "eol=; tokens=1,2* delims=, " %%e in (alldatabases.txt) do (
%backup% /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:%%e /outputfile:"D:\backup\all_databases_backup\all_DB_backup\%%e.sql"
)
goto finish

rem backup databases from the list
:DB_backup2

FOR /F "eol=; tokens=1,2* delims=, " %%e in (DB_list.txt) do (
%backup% /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:%%e /outputfile:"D:\backup\all_databases_backup\DB_list_backup\%%e.sql"
)
goto finish

rem backup databases by mask
:DB_backup3
set mask='%%%2%%'
"C:\Program Files\MySQL\MySQL Server 5.7\bin\mysql.exe" --host=localhost --user=root --password=root --skip-column-names --execute="SELECT s.SCHEMA_NAME FROM information_schema.SCHEMATA s WHERE s.SCHEMA_NAME NOT IN ('mysql', 'information_schema', 'sys', 'performance_schema') AND s.SCHEMA_NAME LIKE %mask%" > "D:\backup\all_databases_backup\DB_by_mask.txt"

FOR /F "eol=; tokens=1,2* delims=, " %%e in (DB_by_mask.txt) do (
%backup% /backup /connection:"User Id=root;Password=root;Host=localhost;Port=3306;Character Set=utf8" /database:%%e /outputfile:"D:\backup\all_databases_backup\DB_by_mask_backup\%%e.sql"
)
goto finish

:finish
```

Where:
- C:\Program Files\MySQL\MySQL Server 5.7\bin\ – MySQL server path.
- D:\backup\all_databases_backup – the location on your computer where the output files are stored.
- C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com – dbForge Studio for MySQL path.

Assign your own values to the User Id, Password, Host, and Port parameters. Save the file with the *.bat extension.

Script execution: the BAT file accepts the following input parameters:
- 1 or empty — all databases in the connection will be backed up
- 2 — all databases specified in the DB_list.txt file will be backed up
- 3 "mask" — all databases whose names match the mask will be backed up

Examples of execution:
- All databases: all_DB_backup_with_param.bat or all_DB_backup_with_param.bat 1
- The databases from a list: all_DB_backup_with_param.bat 2
- The databases by a mask: all_DB_backup_with_param.bat 3 max (the mask is added to the constraint as '%max%'); all_DB_backup_with_param.bat 3 sakila ('%sakila%'); all_DB_backup_with_param.bat 3 a ('%a%')

After the backup has completed successfully, a corresponding folder with SQL files will be created: all_DB_backup for all databases, DB_list_backup for the databases from a list, and DB_by_mask_backup for the databases chosen by a mask.

There is another way to back up all databases using the command-line interface.
You can create a *.bat file with the following content:

```bat
Set Runtool="C:\Program Files\Devart\dbForge Studio for MySQL\dbforgemysql.com"

set timestamp=%DATE:~6,4%_%DATE:~3,2%_%DATE:~0,2%__%TIME:~0,2%_%TIME:~3,2%_%TIME:~6,2%

FOR /F "eol=; tokens=1,2,3,4,5,6* delims=, " %%e in (db_for_backup.txt) do (

%Runtool% /backup /connection:"User Id=%%e;Host=%%f;Port=%%g;Character Set=%%h" /database:%%i /password:%%j /outputfile:%%i-%timestamp%.sql

)
```

This file also requires a db_for_backup.txt file with the following structure:

```
user_name1, host1, port1, encoding1, db1, password1
user_name2, host2, port2, encoding2, db2, password2
user_name3, host3, port3, encoding3, db3, password3
...
user_nameN, hostN, portN, encodingN, dbN, passwordN
```

For example, when you substitute the placeholders with actual data, the file contents will look somewhat like this:

```
root, dbfmysrv, 3320, utf8, adventureworks, root
root, 192.169.0.2, 3310, utf8, sakila, root
root, localhost, 3306, utf8, bikestore, root
```

By running the BAT file, for example, through Windows Task Scheduler, you can configure a mass backup of databases at a certain time.

Conclusion

[MySQL Server backup](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html) is an important task meant to protect data stored in MySQL databases from critical loss due to hardware failures, network intrusions, human errors, and so on. In this article, we provided a solution for backing up databases specified in a list and for bulk backing up all databases. To complete this task, we created a BAT file that copes with the task in a single click. Three working scripts, for backing up all databases, backing up databases from a list, and backing up databases by a mask, were given as well.
Related article: [How to Restore a MySQL Database with Command Line or Restore Tools](https://blog.devart.com/how-to-restore-mysql-database-from-backup.html)
[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Backup Comparison and Database Versioning Added in Schema Compare for SQL Server By [dbForge Team](https://blog.devart.com/author/dbforge) January 29, 2010

Schema Compare for SQL Server 2.00 compares schema snapshots, database backups, and database changes in version control systems to accelerate the work of advanced database developers. Devart today announced the public availability of [Schema Compare for SQL Server 2.00](https://www.devart.com/dbforge/sql/schemacompare/), which delivers new capabilities that help customers compare databases more effectively and migrate schema changes across different SQL Server versions more quickly. With the new functionality, the tool spans the full spectrum of database comparison and synchronization tasks and allows completing them at a professional level. The highlights of Schema Compare for SQL Server 2.00 include:

- Support of native SQL Server backups. Users can use native SQL Server backups as a metadata source and compare them, adding a new way to compare and update database structures without manual work.
- Database comparison and synchronization via the command line. This is especially useful for users who do repeated schema comparison and synchronization. It is enough to set up a comparison once, and the tool will prepare a file with the required command-line arguments: source and target connection strings, comparison options, and settings. Users can schedule the launch time for the processes specified in the file and avoid any waiting.
- Tracking database changes using version control systems. This prevents several developers from changing database objects simultaneously. It is now easy to revert to earlier revisions and quickly identify who changed the objects and when. Comparison takes less time because schema snapshots are compared. dbForge Schema Compare works with the two most popular version control systems, SVN and TFS.
- Generating comparison and synchronization reports. Publishing differences between databases is now automated and offered as clear, professional-looking reports.
- Synchronization of database properties. A database is synchronized as an object, so you can see what changes have been made to the database itself.
- Character-oriented comparison. Even a single differing character is highlighted in the text compare, saving the user time and effort.
- Standard and Professional product editions available. You can select only the required [SQL tools](https://www.devart.com/dbforge/sql/schemacompare/editions.html) and save money.

Check the benefits yourself: [download dbForge Schema Compare for SQL Server 2.00](https://www.devart.com/dbforge/sql/schemacompare/download.html) now for free. Tell us what you think about the new version on [the product feedback page](https://www.devart.com/dbforge/sql/schemacompare/feedback.html). We are looking forward to your comments and suggestions.
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Data Import and Export: BCP Utility vs dbForge Studio for SQL Server By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) January 26, 2024

Data import and export might as well be the cornerstone of data management, an essential feature for regular use. The preconditions for easy and versatile import and export are rather simple. First, you need support for quite a few data formats, a dozen or so. Second, you need to tailor each operation to your needs and preferences with a selection of flexible settings. Third, you most certainly wouldn't mind automating your recurring import and export tasks, saving the precious time that you'd rather dedicate to more important matters at hand. This is where we'd like to show you how to manage data import and export using two different approaches. The first one is the command line, best illustrated with the well-known bcp utility. The second one involves [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), a multi-featured database IDE that provides a handy wizard to guide you through the entire process and help you configure it with maximum convenience. Without further ado, let's get started!
Contents
- A brief overview of the bcp utility
- Data export and import via the bcp utility
  - How to export and import an entire table
  - How to export and import query results
  - How to create a format file for recurring export and import operations
- A brief overview of dbForge Studio for SQL Server
- Data export and import via dbForge Studio for SQL Server
  - How to export data using Data Export Wizard
  - How to export data from the command line
  - How to import data using Data Import Wizard
  - How to import data from the command line

A brief overview of the bcp utility

Let's get started with the bcp (bulk copy program) utility, a command-line tool that copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. It is more than capable of importing huge numbers of rows into your SQL Server databases, as well as exporting your table data to files. If you have Microsoft Command Line Utilities for SQL Server installed on your machine, you can check your current bcp version by running the bcp -v command. Otherwise, you can [download them from the dedicated page](https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver16).

Data export and import via the bcp utility

Now let's see how it works. We'll start with the basics and show you a few examples: how to export and import entire tables; how to export and import query results; and how to create a file with format information based on table metadata and apply it to recurring operations.

How to export and import an entire table

First off, let's export an entire table with its data. For instance, we have a table HumanResources.Department; for your convenience, we'll show you what it looks like in dbForge Studio. Then we have a second table, HumanResources.Department_New, which has the same structure but is empty. Now let's use the Command Prompt with the bcp utility to export data from a table to a file with the following command.
```
bcp HumanResources.Department out C:\Export\Department.dat -S "demo-mssql\SQLEXPRESS" -U "sa" -P 123 -d "AdventureWorks2022" -n -t,
```

In this command:
- HumanResources.Department is the source table that we will export data from
- C:\Export\Department.dat is the path and name of the output file that the data will be exported to
- -S "demo-mssql\SQLEXPRESS" is the server name
- -U "sa" is the username
- -P 123 is the password
- -d "AdventureWorks2022" is the database name
- -n instructs the utility to use native data types
- -t specifies the field separator; in our case it is ","

As you can see, the export has been successful, and the file is firmly in place. Note that here, DAT is the only format you can deal with, and it can be processed with the bcp utility only. Our next step is to import the data from the exported file into the empty second table, HumanResources.Department_New. To do that, we use the following command:

```
bcp HumanResources.Department_New in C:\Export\Department.dat -S "demo-mssql\SQLEXPRESS" -U "sa" -P 123 -d "AdventureWorks2022" -n -t,
```

The syntax is identical to that of the previous command; the only difference is that we have in instead of out, which means we are importing data instead of exporting it. Let's run it and see what happens. Success! Now let's go back to dbForge Studio and see whether the data has really been imported. And indeed, it has! We've got our data in the new table.

How to export and import query results

Now let's clear HumanResources.Department_New with TRUNCATE TABLE and show you a slightly more complex example: we'll query data from HumanResources.Department and export the result to a file with a single command:

```
bcp "SELECT * FROM HumanResources.Department d WHERE d.DepartmentID <= 8" queryout C:\Export\Department_Query.dat -S "demo-mssql\SQLEXPRESS" -U "sa" -P 123 -d "AdventureWorks2022" -n -t,
```

We run it successfully. Next, we make sure that we've got the file.
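The export and import invocations differ only in the direction keyword (out vs in, plus queryout for queries). As a hedged illustration of that symmetry, here is a small Python sketch that assembles both command lines from shared connection settings; the server, credentials, and file paths are the demo values from this walkthrough, not defaults of the bcp utility itself.

```python
# Assemble bcp command lines for exporting and importing the same table data.
# The direction keyword ("out" vs "in") is the only structural difference.

def bcp_args(table, direction, data_file,
             server=r"demo-mssql\SQLEXPRESS", user="sa",
             password="123", database="AdventureWorks2022"):
    """Build a bcp argument list; direction is 'out', 'in', or 'queryout'."""
    return ["bcp", table, direction, data_file,
            "-S", server, "-U", user, "-P", password,
            "-d", database, "-n", "-t,"]

export_cmd = bcp_args("HumanResources.Department", "out",
                      r"C:\Export\Department.dat")
import_cmd = bcp_args("HumanResources.Department_New", "in",
                      r"C:\Export\Department.dat")

# Apart from the table name, the two commands differ only in the direction:
assert export_cmd[2] == "out" and import_cmd[2] == "in"

# To actually run them (requires bcp installed and a reachable SQL Server):
# import subprocess; subprocess.run(export_cmd, check=True)
```

Generating the argument lists this way makes it easy to script the round trip for many tables at once, which is exactly the kind of recurring task the format-file section below addresses on the bcp side.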
Now let's import it into HumanResources.Department_New. Finally, we go back to dbForge Studio, run a SELECT query against HumanResources.Department_New, and see that we've got just what we expected.

How to create a format file for recurring export and import operations

The bcp utility supports format files, which contain format information for each record in a given table. A format file is based on the table metadata and is used as a predefined format profile for export and import operations. In other words, you configure the formats once, save them to a format file, and reuse that file as many times as you wish. Let's see how it's done. We'll create a format file for the HumanResources.Department table and save it to an FMT file:

```
bcp HumanResources.Department format nul -n -f C:\Export\HumanResources_Department_Format.fmt -S "demo-mssql\SQLEXPRESS" -U "sa" -P 123 -d "AdventureWorks2022"
```

We run it, and then we make sure we've got the file. Next, we import data into HumanResources.Department_New using this file and the format settings contained in it. Here is the command (note that -f replaces the -n and -t switches, since the format file already carries that information):

```
bcp HumanResources.Department_New in C:\Export\Department.dat -f C:\Export\HumanResources_Department_Format.fmt -S "demo-mssql\SQLEXPRESS" -U "sa" -P 123 -d "AdventureWorks2022"
```

And so we run it. Now let's check the imported data in dbForge Studio. Success! That's it! Now that you know the basics of working with the bcp utility, it's time to move on to something more advanced, and, dare we say, far more convenient in everyday use.

A brief overview of dbForge Studio for SQL Server

What if we said you could do all that (and much more, in fact) with ease, using a familiar graphical user interface? Yes, we are talking about dbForge Studio for SQL Server, an integrated environment that provides nearly all the tools you might need to effectively handle database development, management, and administration.
In terms of export and import, dbForge Studio offers the following features:
- Data export and import involving the 14 most popular data formats
- Data migration from third-party databases to SQL Server
- Intuitive wizards with flexible settings for each format
- User templates for recurring scenarios
- Automation of operations from the command line

Now let's see it in action!

Data export and import via dbForge Studio for SQL Server

The first step is, of course, to download dbForge Studio; for your convenience, it comes with a free 30-day trial. The rest is simple: install it, launch it, and connect to the required database instance. After that, you can run both import and export in two ways: using a handy wizard or from the command line. Let us show you both in detail.

How to export data using Data Export Wizard

Now that we're connected, we might as well start with export. We find the required table in Database Explorer, right-click it, and select Export Data from the shortcut menu. The export wizard opens and greets us with the Export format page. And when it comes to formats, the Studio is a clear winner with 14 of them at your service: HTML, TXT, XLS, XLSX, MDB, RTF, PDF, JSON, XML, CSV, ODBC, DBF, SQL, and Google Sheets. This makes the Studio a far more flexible solution than bcp, which exports data only to the DAT format, which in turn can be processed only by bcp itself. That said, let's pick CSV as our file format and click Next.

On the Source page, we can select a server connection, a database and its schema, as well as the tables and views we want to export. It is worth noting that the bcp utility allows exporting only one table at a time, whereas in the Studio you can work with multiple tables simultaneously. Now let's export two tables, HumanResources.Department and HumanResources.Shift, and click Next.

On the Output settings page, we can choose to export data into either a single file or separate files.
We'll go with the latter option. Additionally, this page offers options to append a timestamp to the names of exported files, auto-delete exported files older than a specified number of days, and export files as a ZIP archive.

On the Options page, we can choose whether to use Unicode, show the table header, and force quote strings (as well as specify a character for quoting in the Quote string field). Finally, we can select the required field separator (tab, space, comma, or a custom character) and click Next.

Next comes the Data formats page, where we have two auxiliary tabs. The first one is Columns, where we can further specify columns for export and check their aliases and data types. The second one is Formats, where we can change the default format settings for Date, Time, Date Time, Currency, Float, Integer, Boolean, and Null String, as well as select the required binary encoding from the drop-down list.

On the Exported rows page, we can choose to export all rows, export the rows selected on the Data formats page, or export a specified range of rows. We'll go with the first option.

On the Errors handling page, we specify the error processing behavior (using one of the three available options: Prompt the user for an action, Ignore all errors, or Abort at the first error) and can opt to write reports to a log file at a specified path.

Once we finish configuring our settings, we can save a template with these settings, or a command line, via the Save button in the lower-left corner of the wizard. This allows the automation of recurring export and import operations. Let's save a template by clicking Save > Save Template; we'll need it a bit later. And if we go to Save > Save Command Line, we can quickly get a BAT file; we'll show you how it works shortly. First, let's save it. Now we can click Export.
When the data export is completed, we have several options: we can open the exported file or folder, perform another export operation, view the log file, or simply click Finish. As you can see, we've got all the files we need in our folder: the log file (DataExport.log), the BAT file (ExportCLI.bat), the template with our configured settings (ExportTables.det), and two exported CSV files named after the tables they contain.

How to export data from the command line

Now let's demonstrate how the saved BAT file handles your export. To do that, let's delete the exported files and the log file. Now let's run our BAT file. As you can see, the operation is successful. We've got our files exported again.

Note: You can just as easily perform export operations directly from the results grid; that's how you can export query results. You can filter and select the data you want to export right there, right-click it, and click Export Data from the shortcut menu.

All right, now we can truncate HumanResources.Department and HumanResources.Shift and proceed to import data into them.

How to import data using Data Import Wizard

The import operation is just as simple and flexible. We find the required database table in Database Explorer, right-click it, and select Import Data from the shortcut menu. The import wizard opens. On the Source file page, we select CSV and provide the path to the previously exported file. Now it's our source file for import.

Note: This is also where you can load your templates with settings. Just switch to Templates under Categories, double-click Load Template, and select the required file. But as of now, we don't have any import templates, so we proceed to the Destination page, where we can check our server connection, database, and schema. Then we can select whether the data will be imported into a new table or into an existing table. Let's select HumanResources.Department_New as our destination table and click Next.
On the Options page, we can configure the options for imported data. We can check whether the Encoding is set correctly or select another one from the drop-down list. In Quote string, we can specify the character to be used for string quoting. In Skip lines, we can specify the number of lines to be skipped during import; the lines are counted from the top. We can specify the Header position (the required line number); if we don't, the imported columns will get default names: column1, column2, and so on. We can specify the Field Separator by either keeping the Auto defined checkbox selected, or by clearing it and selecting one of the following options: Tab, Space, Comma, or Custom character. We make sure we've got what we want and click Next. On the Data formats page, we have two auxiliary tabs. The first one is Common Formats, where we can specify the formats for null strings, thousands and decimal separators, boolean variables, and date and time values. There is also the Autodetect Date and Time format checkbox, selected by default. The second tab is Column Settings, where we can configure the format settings for separate columns. We have four options here: Null String, Left Quote, Right Quote, and Date and Time. Note that if a format mask is not set, the application will identify date/time values automatically. Our next page is Mapping, where we can map the source columns to the target ones. If we are importing data into a new table, the application will automatically create and map all the columns. We can see the results in the Preview section. Additionally, we can view column properties, as well as clear and restore the mapping of all columns with the corresponding buttons. On the Modes page, we select an import mode.
There are five available modes:

- Append: add records to the target table
- Update: update a record in the target table with a matching record from the source
- Append/Update: update a record if it exists in the target table; otherwise, add a record
- Delete: delete records in the target table that match records in the source
- Repopulate: delete all records in the target table and repopulate them from the source

Optionally, we can select the checkboxes Use a single transaction and Use bulk insert (the latter of which reduces the number of statements and speeds up the import operation, but can affect the error handling mechanism). On the Output page, we have three options:

- Open the data import script in the internal editor
- Save the data import script to a file with a specified path and name; additionally, we can choose to add a timestamp to the file name and open the script in the internal editor
- Import data directly to the database

On the Errors handling page, we need to specify the error processing behavior (using one of the three available options: Prompt the user for an action, Ignore all errors, or Abort at the first error) and opt to write reports to a log file with a specified path. Finally, we've configured everything, and now we can click Import. The operation is successfully completed. Now we can check whether the data has been imported. Success!

How to import data from the command line

We still have one more table, HumanResources.Shift, and now we'll import data into it using a newly generated BAT file. Let's go back to the Source file page, where we select CSV and the previously exported HumanResources.Shift.csv source file. On the Destination page, we select HumanResources.Shift_New. Now, to keep things short, let's skip everything and proceed right to Save > Save Template to get ourselves a template file in the DIT format. Similarly, we can go to Save > Save Command Line to save a BAT file that we'll further use to import data.
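Under the hood, these import modes correspond to familiar DML. As a rough, hypothetical sketch (this is not the Studio's actual generated script, and the #ImportedShift staging table is made up to stand in for the parsed CSV data), the Append/Update mode for our Shift table boils down to a MERGE statement:

```sql
-- Hypothetical sketch of what the Append/Update import mode amounts to.
-- #ImportedShift stands in for the parsed CSV data; it is not a real
-- object created by the Studio.
MERGE INTO HumanResources.Shift_New AS target
USING #ImportedShift AS source
    ON target.ShiftID = source.ShiftID
WHEN MATCHED THEN
    -- Update: a matching record already exists in the target
    UPDATE SET target.Name      = source.Name,
               target.StartTime = source.StartTime,
               target.EndTime   = source.EndTime
WHEN NOT MATCHED BY TARGET THEN
    -- Append: no matching record, so add one
    INSERT (ShiftID, Name, StartTime, EndTime)
    VALUES (source.ShiftID, source.Name, source.StartTime, source.EndTime);
```

In the same spirit, Append maps to a plain INSERT, Update to an UPDATE joined on the key columns, and Repopulate to a DELETE followed by an INSERT.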
Now that we've got the command line, we run the file. It is instantly executed. And now we can check our table and make sure that the data has really been imported. That's it!

Note: If you are an avid user of SSMS, you can augment it with the exact same export and import functionality by installing an add-in called [dbForge Data Pump](https://www.devart.com/dbforge/sql/data-pump/).

Download dbForge Studio and make your daily work easier today!

Now that you know both approaches in detail, you can decide which one works best for you. And if you'd rather go with the GUI-powered dbForge Studio for SQL Server, we'd love to invite you to [download it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and get some firsthand experience with all of its rich capabilities, including the import and export of your data.

By Valentine Mostsevoy
Be careful while using UNSIGNED data type in the routine body

By dbForge Team, June 26, 2009

Introduction

MySQL Server (starting from v 5.0), as Oracle and SQL
Servers, allows creating [stored procedures and functions](https://dev.mysql.com/doc/refman/8.0/en/stored-routines.html). Stored procedures are a set of SQL commands that can be compiled and stored on the server. Thus, instead of sending a frequently used query each time, we can call the corresponding stored procedure. This provides better performance (the query is analyzed only once) and reduces traffic between the client and the server. While developing the business logic of procedures, we often use a great number of variables to store intermediate results. To assign static values or the values of other variables to a variable, the SET statement is used. [The SET statement in stored procedures](https://dev.mysql.com/doc/refman/8.0/en/stored-programs-defining.html) is an extended version of the usual SET statement: it allows the extended syntax SET a=x, b=y, where different variable types (local variables, global and session server variables) can be mixed.

Problem

When an assigned value exceeds the maximum size of a variable's data type, MySQL Server should show a "Data type size exceeded" error, and further execution of the procedure's code should be aborted. Let us illustrate this case by exploring how MySQL Server reacts when the value of a variable with the [SMALLINT](https://dev.mysql.com/doc/refman/8.0/en/numeric-types.html) data type is assigned to a variable with the [TINYINT](https://dev.mysql.com/doc/refman/8.0/en/numeric-types.html) data type. Here, DECLARE v_TINYINT TINYINT means that any value in the range -128 to 127 can be assigned to the variable. If we assign a value that is out of this range, for example, 250 stored in a SMALLINT variable (whose tolerable range is -32768 to 32767), MySQL Server will show an "[Out of range](https://dev.mysql.com/doc/refman/5.1/en/error-messages-server.html#error_er_warn_data_out_of_range)" error, and the developer has to review the procedure's code.
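The scenario above can be reproduced with a minimal stored procedure (a sketch; the procedure name is illustrative, and whether an error or only a warning is raised depends on the server's sql_mode — strict modes raise the error described here):

```sql
DELIMITER //

CREATE PROCEDURE demo_out_of_range()
BEGIN
    DECLARE v_TINYINT  TINYINT;
    DECLARE v_SMALLINT SMALLINT DEFAULT 250;

    -- 250 fits into SMALLINT but not into TINYINT (-128..127),
    -- so MySQL Server (in strict SQL mode) raises an
    -- "Out of range value" error on this assignment.
    SET v_TINYINT = v_SMALLINT;
END //

DELIMITER ;
```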
If the code contains the UNSIGNED data type, the situation is a bit different. [UNSIGNED](https://dev.mysql.com/doc/refman/8.0/en/numeric-types.html) is a modifier for a numeric data type that roughly doubles the positive limit and removes the negative range. To illustrate this case, let us assign the value of a variable with the TINYINT UNSIGNED data type to a variable with the TINYINT data type. DECLARE v_TINYINTUNSIGNED TINYINT UNSIGNED means that values assigned to the v_TINYINTUNSIGNED variable should be within the 0 to 255 range. We assigned a TINYINT UNSIGNED value of 250, which exceeds the signed TINYINT maximum of 127. After the assignment, MySQL Server stored the value -6 instead of the expected 250. No errors or warnings were shown. You will most likely pay no attention to this, but it can cause unexpected results after executing the procedure's code.

Solution

Here are our recommendations for such cases:

- Watch over the correspondence between the data types of variables during assignments.
- Do not use UNSIGNED at all; use a signed type of a bigger size instead.
- Use [visual tools for debugging stored procedures](https://www.devart.com/dbforge/mysql/studio/debugging.html) to find such errors.

Try dbForge Studio for MySQL: it has a modern integrated [MySQL debugger](https://www.devart.com/dbforge/mysql/studio/code-debugger.html) which allows you to see the value of each variable while executing the code.
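The second, silent case can be sketched the same way (again, the procedure name is illustrative; the exact stored value and whether a warning appears depend on the server version and sql_mode):

```sql
DELIMITER //

CREATE PROCEDURE demo_unsigned_surprise()
BEGIN
    DECLARE v_TINYINT         TINYINT;
    DECLARE v_TINYINTUNSIGNED TINYINT UNSIGNED DEFAULT 250;

    -- 250 is valid for TINYINT UNSIGNED (0..255) but exceeds the signed
    -- TINYINT maximum of 127. In non-strict SQL modes the assignment
    -- succeeds silently, and the stored value is not the expected 250.
    SET v_TINYINT = v_TINYINTUNSIGNED;
    SELECT v_TINYINT;
END //

DELIMITER ;
```

This is exactly the kind of defect that a debugger that shows variable values at each step catches quickly, while it slips past a plain code review.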
Be ready to meet
Devart at PASS Summit 2016

By Andrey Langovoy, October 11, 2016

Where? We are more than happy to announce that our team is going to attend [PASS Summit 2016](https://www.pass.org/). The event will take place on October 25-28 in Seattle, WA, USA.

What? Whether you are a data novice or a professional, PASS Summit is the place to find industry-leading speakers, in-depth training, technical tips and tricks, and connections to take your career to the next level.

Why? In 2015, we enjoyed visiting PASS Summit and communicating with attendees from across the world. This year, we will be glad to meet you again to share opinions and show you the new features we have been working on over the last year. You are welcome to visit the [Devart](https://www.devart.com/) exhibition booth, where you can see a live demo of our [SQL Server tools](https://www.devart.com/dbforge/sql/), ask questions, get product discounts, and take part in the Exhibitor Raffle. The Devart booth will be located at exhibition spot #404. Please do not confuse it with HTTP 404 :)

The Chance to Win a Prize! Yeah, this year we are going to raffle a Microsoft Surface Pro 4 (128 GB / Intel Core i5). Additionally, you have a chance to win a FREE license for our SQL Server tools: [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) and [dbForge Developer Bundle for SQL Server](https://www.devart.com/dbforge/sql/developer-bundle/). All you need to do is fill in a card with your personal info and wait until the raffle begins! See you in two weeks!
BEGIN TRY/BEGIN CATCH vs GOTO in SQL Server

By Elena Zemliakova, September 2, 2024

When
handling errors in SQL Server, SQL developers have several options for resolving issues that arise during query execution. Two commonly used methods are BEGIN TRY/BEGIN CATCH and GOTO. While both serve to manage and respond to errors, they do so in distinct ways. In this article, we'll compare BEGIN TRY/BEGIN CATCH and GOTO, exploring their use cases, advantages, and potential drawbacks to help you choose the best approach for your SQL Server error-handling needs.

Error handling in T-SQL is crucial for ensuring the reliability and stability of database operations. It involves detecting and managing errors that occur during the execution of SQL queries, preventing data corruption and maintaining system integrity. Effective error handling allows for quick recovery from unexpected issues, minimizes downtime, and ensures that critical processes run as expected. In database-dependent application development, data integrity, application performance, and user experience directly depend on timely and effective error handling.

Contents

- Understanding GOTO in T-SQL
- Syntax of GOTO in T-SQL
- When to use GOTO
- When NOT to use GOTO
- Limitations of using GOTO
- GOTO alternatives: TRY…CATCH and TRY…FINALLY
- TRY…CATCH in T-SQL
- TRY…FINALLY in T-SQL
- Advantages of BEGIN TRY/BEGIN CATCH over GOTO
- Scenarios where GOTO is more efficient than BEGIN TRY/BEGIN CATCH
- SQL error handling: Advanced strategy
- What is T-SQL Code Analyzer?
- Conclusion

Understanding GOTO in T-SQL

The GOTO statement in T-SQL is a control-of-flow language element that allows you to redirect the execution of your code to a labeled section within the same procedure, batch, or statement block. While GOTO provides a way to bypass the normal sequence of operations, it is generally considered poor programming practice due to its potential to make code harder to read, maintain, and debug.
Syntax of GOTO in T-SQL

The basic syntax for using GOTO in T-SQL involves two components:

- The GOTO statement: This is where you specify the specific point in the code you want to jump to.
- The label: This is a point in the code that you name, which serves as the destination for the GOTO statement.

Define the label:

Label:

Alter the execution:

GOTO Label

Here's a simple example that demonstrates how GOTO works in T-SQL:

DECLARE @Counter INT = 1;

PRINT 'Starting loop';

StartLoop:
    PRINT @Counter;
    SET @Counter = @Counter + 1;

    IF @Counter <= 5
        GOTO StartLoop;

PRINT 'End of loop';

In this example, the GOTO statement redirects execution back to the StartLoop label, creating a loop that runs until the condition (@Counter <= 5) is no longer true.

When to use GOTO

1. Breaking out of nested loops

If you're working with multiple nested loops or complex logic, exiting from deeply nested structures can be cumbersome. Using GOTO allows you to jump directly out of the loop or bypass certain conditions that would otherwise require multiple IF or BREAK statements.

DECLARE @OuterCounter INT = 1, @InnerCounter INT;

WHILE @OuterCounter <= 3
BEGIN
    SET @InnerCounter = 1;

    WHILE @InnerCounter <= 3
    BEGIN
        IF @OuterCounter = 2 AND @InnerCounter = 2
            GOTO ExitLoop;

        PRINT 'Outer Loop: ' + CAST(@OuterCounter AS VARCHAR) + ', Inner Loop: ' + CAST(@InnerCounter AS VARCHAR);
        SET @InnerCounter = @InnerCounter + 1;
    END;

    SET @OuterCounter = @OuterCounter + 1;
END;

ExitLoop:
PRINT 'Exited loop early';

In this example, GOTO is used to break out of both loops when specific conditions are met, allowing cleaner code than using multiple BREAK or CONTINUE statements.

2. GOTO for retry logic

Similarly, just as GOTO is useful for breaking out of loops when an exit or cleanup is required from different points within the loop, you can also use it for implementing retry logic.
The power of using GOTO for retry logic lies in its flexibility: you can direct the flow of execution to retry from different points in your code. This can simplify the control flow when dealing with operations that might fail at various stages of execution. By using GOTO, you can efficiently manage retries without duplicating code, keeping your retry mechanism centralized.

CREATE PROCEDURE PerformOperationWithRetry
AS
BEGIN
    DECLARE @RetryCount INT = 0;
    DECLARE @MaxRetries INT = 3;
    DECLARE @ErrorCode INT;

RetryLabel:
    IF @RetryCount >= @MaxRetries
    BEGIN
        PRINT 'Max retries reached. Exiting.';
        RETURN;
    END

    BEGIN TRY
        -- First operation
        UPDATE SomeTable SET SomeColumn = SomeValue WHERE SomeCondition = 'Condition';

        -- Second operation
        INSERT INTO AnotherTable VALUES ('Data');

        -- If all operations succeed, exit
        RETURN;
    END TRY
    BEGIN CATCH
        -- Increment retry count
        SET @RetryCount = @RetryCount + 1;

        -- Log the error
        SET @ErrorCode = ERROR_NUMBER();
        PRINT 'Error encountered: ' + CAST(@ErrorCode AS NVARCHAR(10));

        -- Retry the operation from the beginning
        GOTO RetryLabel;
    END CATCH
END;

This code defines a stored procedure PerformOperationWithRetry that attempts to perform two database operations (UPDATE and INSERT). If any of these operations fails, the procedure catches the error, logs it, increments a retry counter, and then retries the operations up to a maximum of three times. If the maximum retry count is reached without success, the procedure exits with a message indicating that the maximum number of retries has been reached.

3. Error handling in legacy code

Before the introduction of the BEGIN TRY/BEGIN CATCH blocks in SQL Server 2005, GOTO was often used for error handling. Even though the TRY...CATCH structure is now the preferred method for handling exceptions, you may encounter legacy codebases where GOTO is still used to jump to an error-handling section when a problem occurs.
For example, older T-SQL error handling might look like this:

DECLARE @Error INT;

INSERT INTO Employees (EmployeeID, Name) VALUES (1, 'John');
SET @Error = @@ERROR;
IF @Error <> 0
    GOTO ErrorHandler;

INSERT INTO Employees (EmployeeID, Name) VALUES (2, 'Jane');
SET @Error = @@ERROR;
IF @Error <> 0
    GOTO ErrorHandler;

PRINT 'Data inserted successfully';
GOTO EndProcess;

ErrorHandler:
PRINT 'An error occurred. Rolling back...';

EndProcess:
PRINT 'Process complete';

In this case, GOTO directs the flow to the error-handling section when a failure is detected. This is an older pattern, but GOTO might still be necessary in legacy systems where TRY...CATCH has not been implemented.

4. Jumping to specific code sections

When certain parts of your code need to be skipped based on a condition, you can use GOTO to jump to a specific point in the code instead of wrapping sections of code in conditional statements. While IF...ELSE blocks are generally preferred for this purpose, GOTO can provide a quick solution for skipping code without adding more indentation or nested logic. For instance:

DECLARE @SkipSection BIT = 1;

IF @SkipSection = 1
    GOTO SkipProcessing;

PRINT 'This will be skipped if SkipSection is 1';

SkipProcessing:
PRINT 'Processing completed';

5. Situations requiring immediate termination of execution

In cases where you need to immediately stop further execution of a process, GOTO can be used to jump to a termination point in the code, ensuring that no further operations are performed. This is especially useful when a critical error occurs, and continuing to process the query could lead to further issues.

When NOT to use GOTO

1. In loops

Avoid using GOTO to break out of or manage loops. Structured loop constructs (e.g., FOR, WHILE, BREAK, CONTINUE) are specifically designed for these purposes and provide clearer and more maintainable code.

2.
For erratic code navigation

Do not use GOTO to arbitrarily jump between sections of code without a clear logical reason. This can make your code difficult to follow, understand, and maintain, leading to "spaghetti code." When the program logic becomes too complex due to multiple GOTO statements, it is better to refactor the code into simpler, more understandable constructs.

3. In place of functions or procedures

GOTO should not be used as a substitute for properly defining and calling functions or procedures. Structured programming encourages breaking code into manageable, reusable pieces, which GOTO does not support.

Limitations of using GOTO

While GOTO can be useful in specific scenarios, such as breaking out of deeply nested loops or error handling in legacy code, it is generally advised to avoid it in favor of more structured programming constructs like BEGIN TRY/BEGIN CATCH, WHILE, or IF...ELSE. These alternatives promote clearer and more maintainable code. Understanding GOTO's drawbacks is crucial for making informed decisions about when (or if) to use GOTO in your SQL code.

Reduced readability

One of the most common drawbacks of GOTO is that it reduces the readability of the code. When the flow of execution jumps abruptly from one part of the code to another, it can be difficult for developers to follow the logic. This is particularly problematic in complex scripts, where GOTO statements can cause the code to become disjointed and hard to understand. The clearer the flow of a program, the easier it is to maintain and debug. In contrast, GOTO can obscure this flow, making the code more challenging to work with.

Increased complexity and maintenance

GOTO can lead to what is often referred to as "spaghetti code," where the control flow is overly complex, with multiple jumps that make it difficult to trace the execution path. This type of code is hard to maintain and modify because changes in one part of the code can have unexpected effects elsewhere.
As the codebase grows and evolves, this complexity can lead to an increase in bugs and a decrease in the overall stability of the application.

Challenges in debugging

Debugging code that makes heavy use of GOTO can be particularly challenging. Since GOTO disrupts the normal sequential flow of a program, it can be difficult to predict where the execution will jump to next, making it harder to track down the source of a bug. Traditional debugging tools and techniques are often less effective in such scenarios, as they rely on a more predictable and structured flow of control.

Risk of infinite loops and unintended consequences

Misusing GOTO can sometimes lead to infinite loops or other unintended consequences. For example, if a GOTO statement causes the program to jump back to a previous point without a clear exit condition, it can create a loop that continues indefinitely. This not only leads to poor performance but can also cause the program to crash or become unresponsive. Additionally, since GOTO can bypass normal control flow, it might skip over important initializations or cleanup operations, leading to further errors.

GOTO alternatives: TRY…CATCH and TRY…FINALLY

While the GOTO statement has historically been used to manage control flow and handle errors, modern T-SQL offers more structured and maintainable alternatives: TRY…CATCH and TRY…FINALLY. These patterns not only improve readability and maintainability but also align with best practices in error handling and control flow management.

TRY…CATCH in T-SQL

The TRY…CATCH language element is one of the most powerful tools in T-SQL for handling errors. It enables developers to encapsulate potentially problematic code within a TRY block, where errors can be anticipated and managed within a CATCH block. This structured approach makes it easier to handle exceptions and ensure that appropriate actions are taken when something goes wrong.
TRY…CATCH syntax example

BEGIN TRY
    -- Code that might generate an error
    INSERT INTO Employees (EmployeeID, Name) VALUES (1, 'John Doe');
END TRY
BEGIN CATCH
    -- Error handling code
    PRINT 'An error occurred: ' + ERROR_MESSAGE();
END CATCH;

In this example, if the INSERT statement within the TRY block fails, control immediately transfers to the CATCH block, where the error can be logged, reported, or handled appropriately. This method provides a clear and organized way to manage errors, making the code more predictable and easier to debug compared to using GOTO.

TRY…FINALLY in T-SQL

While T-SQL does not natively support a TRY…FINALLY construct as seen in other programming languages, a similar pattern can be achieved using a combination of TRY…CATCH and subsequent clean-up code. The idea behind TRY…FINALLY is to ensure that certain critical cleanup operations, like closing resources or resetting states, are performed regardless of whether an error occurs.

Simulating TRY…FINALLY in T-SQL

BEGIN TRY
    -- Code that might generate an error
    INSERT INTO Employees (EmployeeID, Name) VALUES (1, 'John Doe');
END TRY
BEGIN CATCH
    -- Error handling code
    PRINT 'An error occurred: ' + ERROR_MESSAGE();
END CATCH;

-- Final cleanup code
IF @@TRANCOUNT > 0
    COMMIT TRANSACTION;

In this pattern, the code after the END CATCH block acts as the FINALLY section, ensuring that critical operations, such as committing a transaction, are performed regardless of whether an error was caught. This approach ensures that your application's state is consistent and that resources are properly managed.

Advantages of BEGIN TRY/BEGIN CATCH over GOTO

When it comes to error handling and control flow in T-SQL, the TRY…CATCH and TRY…FINALLY patterns offer several advantages over the traditional use of the GOTO statement.

1.
More structured error handling

The TRY…CATCH block provides a structured way to handle errors, encapsulating potentially problematic code within a TRY block and catching errors in a CATCH block. This structure makes it clear where errors are expected and how they are handled, leading to cleaner and more organized code. In contrast, GOTO lacks this structure, leading to fragmented error-handling code that can be difficult to follow.

2. Better readability

With TRY…CATCH and TRY…FINALLY, the flow of control is predictable and linear, which improves the readability of the code. Developers can see at a glance how errors are handled and how resources are managed. GOTO, as we have already mentioned, can lead to "spaghetti code," where the flow of control jumps around unpredictably.

3. Better maintainability

Code that uses TRY…CATCH and TRY…FINALLY is generally easier to maintain than code that relies on GOTO. With structured error handling, you can easily modify or extend the error-handling logic without worrying about disrupting the control flow. In contrast, GOTO can create brittle code where changes in one part of the script can have unintended consequences elsewhere.

4. Reduced risk of logical errors

Using GOTO can introduce logical errors, such as infinite loops or missed cleanup steps, which can be difficult to debug and resolve. TRY…CATCH and TRY…FINALLY reduce the risk of these errors by providing a clear, predictable flow of control. This makes your code less error-prone and easier to debug when issues do arise.

5. Improved resource management

Although T-SQL doesn't natively support a TRY…FINALLY construct, you can simulate it by placing cleanup code after a CATCH block. This approach ensures that critical resources, such as database connections or transactions, are properly managed and closed, even if an error occurs. This is much harder to guarantee with GOTO, where cleanup code might be skipped if the flow of control jumps unexpectedly.
Scenarios where GOTO is more efficient than BEGIN TRY/BEGIN CATCH

While BEGIN TRY/BEGIN CATCH blocks are the more commonly accepted method for handling errors in SQL, there are specific scenarios where GOTO can be more efficient, especially when errors are rare or do not occur at all.

1. Minimal error processing load

In cases where errors are infrequent, the overhead of setting up BEGIN TRY/BEGIN CATCH blocks can be unnecessary. GOTO allows for a more streamlined flow, jumping directly to a retry or cleanup section without the added cost of error-handling mechanisms that may never be triggered.

2. Simple retry logic

When implementing retry logic that needs to redirect the flow of execution to an earlier point in the code, GOTO can be more straightforward. Instead of nesting multiple TRY/CATCH blocks, which can add complexity and potentially slow down execution, GOTO provides a direct path to reattempt the operation, making the code simpler and faster in such scenarios.

3. Handling multiple exit points

If your code has multiple exit points where different sections of the code might need to skip to a common cleanup routine, GOTO can be more efficient. Instead of wrapping each potential exit in a TRY/CATCH block, GOTO can simplify the control flow, allowing for a single, centralized cleanup routine.

4. Performance-critical code paths

In high-performance scenarios where every bit of overhead matters, GOTO can offer a performance edge by avoiding the additional processing involved in setting up and managing TRY/CATCH blocks, particularly when exceptions are rare and the main concern is the speed of the normal execution path.

SQL error handling: Advanced strategy

When it comes to implementing the best error handling strategy in SQL, [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio) with the integrated T-SQL Code Analyzer tool stand out as essential resources.
This tool instantly analyzes your T-SQL code, identifies potential pitfalls, and provides actionable prompts on how to enhance it. By using the insights from the T-SQL Code Analyzer, you can ensure that your SQL code is not only efficient but also follows the best coding practices.

Note: T-SQL Code Analyzer will be a major feature in SQL Complete version 7.0 and dbForge Studio for SQL Server 7.0, which are expected to launch in September 2024.

What is T-SQL Code Analyzer?

The T-SQL Code Analyzer is a powerful tool integrated into SQL Complete and dbForge Studio for SQL Server, designed to streamline the process of writing and optimizing T-SQL code. This feature allows developers to quickly identify and address potential issues in their SQL scripts, ensuring higher code quality and adherence to best practices.

How to work with T-SQL Code Analyzer

1. Open a query document in SSMS and type or insert a piece of T-SQL code that you would like to analyze. Then right-click anywhere in the document and select Analyze Code from the shortcut menu.

2. Wait a few moments while the Analyzer checks the code and returns an Error List with the identified issues. Now you can examine them and change your code accordingly. Additionally, the results of the analysis are displayed in the Output window.

Also note that every issue is assigned a dedicated code, which is displayed in the Error List window. If you click it, you will be taken to the corresponding page in our product documentation, where you can learn more about it.

Code analysis profiles

Before using the Analyzer, you can configure the rules that the analysis will follow. To do this, go to the SQL Complete menu, select Options, and navigate to Code Analysis > Profiles. Here, you can create, modify, and manage analysis profiles, including adding or removing them from your library and setting the active profile.
You start with a predefined Default profile, which you can customize to create your own profiles. Each profile consists of rules grouped by specific goals, such as improving code readability, optimizing query performance, or avoiding deprecated constructs. In the Default profile, all rules are activated by default. To deactivate any rule, simply uncheck the corresponding box and save your changes.

Let's now test SQL Complete on a code sample that includes GOTO statements. As you can see, SQL Complete detected the use of GOTO in the code and issued a warning, recommending that it be avoided.

Conclusion

When it comes to error handling in SQL Server, choosing the right approach is crucial for maintaining the reliability, performance, and readability of your code. The comparison between BEGIN TRY/BEGIN CATCH and GOTO highlights the advantages of structured error handling over traditional control-of-flow mechanisms. While GOTO has its use cases, particularly in legacy systems, modern best practices favor the predictability and clarity offered by BEGIN TRY/BEGIN CATCH blocks. As you develop SQL-based applications, tools like dbForge SQL Complete and dbForge Studio for SQL Server with the integrated [T-SQL Code Analyzer](https://www.devart.com/dbforge/sql/studio/sql-analyzer.html) feature can further strengthen your error-handling strategy, ensuring that your code adheres to best practices and remains robust, maintainable, and efficient.

Video tutorial

To help you get started with T-SQL Code Analyzer most effectively, we have prepared a bonus for you: a detailed tutorial that will help you get acquainted with the feature in just three minutes.

[Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.
"} {"url": "https://blog.devart.com/benefit-from-continuous-integration-in-azure-devops-with-the-new-dbforge-plugin.html", "product_name": "Unknown", "content_type": "Blog", "content": "Benefit from Continuous Integration in Azure DevOps with dbForge Plugin By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) January 16, 2020 The
Devart team is glad to announce the first release of the Azure DevOps Plugin. Extending the range of supported CI systems, we strive to help the users of [dbForge DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) accelerate database development and get the most out of their DevOps environment.

Azure DevOps Integration

Devart users can now benefit from setting up automated processes for SQL Server database continuous integration using the new dbForge DevOps Automation Azure DevOps Plugin. The plugin allows you to configure the CI process quickly and easily by narrowing it down to the predefined steps from the extension in the sequence you need. The steps represent basic elements of the workflow design and use a set of [SQL tools](https://www.devart.com/dbforge/sql/sql-tools/) for the workflow execution. dbForge DevOps Automation Azure DevOps Plugin helps implement the best DevOps practices in every phase of the database development lifecycle. The extension is aimed at maximizing productivity by enabling teams to organize database delivery in a reliable and compliant way.

Tell Us What You Think

[Get](http://marketplace.visualstudio.com/items?itemName=DevartSoftware.dbforge-devOps-Automation-sqlServer-extentions-tools) the newly released dbForge DevOps Automation Azure DevOps Plugin for SQL Server and let us know your opinion of it. Your feedback is highly appreciated and will help us advance our product line further.
Availability

dbForge DevOps Automation Azure DevOps Plugin for SQL Server is a free product that is supplied exclusively as a part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html).

[Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart
"} {"url": "https://blog.devart.com/best-7-dbvisualizer-alternatives.html", "product_name": "Unknown", "content_type": "Blog", "content": "Best 7 DBVisualizer Alternatives By [Victoria Shyrokova](https://blog.devart.com/author/victorias) January 8, 2025 Due to its wide compatibility, user interface, and broad range of functions, DbVisualizer is frequently chosen by database developers, administrators, analysts, and other specialists who work with databases. However, it's not the only option for database management. This article will explore seven other IDEs and compare them by features. Keep reading to learn more about DbVisualizer's popular alternatives, from dbForge Edge to DBeaver and Navicat, and decide which option suits your project.

Table of contents: Tools (dbForge Edge, DBeaver, Navicat Premium, RazorSQL, Aqua Data Studio, Beekeeper Studio, DataGrip), Comparison of features, Conclusion

Tools

Let's review the top seven competitors of DbVisualizer, their compatibility, and key highlights.

dbForge Edge

[Overview >](https://www.devart.com/dbforge/edge/) [Try for free >](https://www.devart.com/dbforge/edge/download.html)

Installation. A Windows-native application. It is also available on Linux and macOS via compatibility solutions.

dbForge Edge is a universal solution that supports numerous relational databases.
It helps users solve multiple database management and administration tasks that range from database design to testing. It comprises four database IDEs with similar functionality and intuitive interfaces: dbForge Studio for MySQL, SQL Server, Oracle, and PostgreSQL.

Key highlights:

- Code completion, syntax highlighting, and error detection
- Robust tools for query and data analysis
- A user-friendly visual editor for columns, indexes, statistics, check constraints, etc.
- Visual query builder for designing queries of any complexity
- Query profiler that visualizes query execution plans

dbForge Edge speeds up the work process for software and database developers, database administrators, DevOps engineers, data analysts, and managers who work with data.

DBeaver

[Overview >](https://dbeaver.com/) [Try for free >](https://dbeaver.io/download/)

Installation. Compatible with Windows, macOS, and Linux.

DBeaver is one of the most popular IDEs that support numerous database management systems, including MySQL and MariaDB. It has a wide range of features for database development and management across its versions (Lite, Enterprise, and Ultimate).

Key highlights:

- A powerful SQL Editor with autocomplete and highlighting functions
- Advanced security features for data protection
- Data import and export from and to various popular formats
- AI smart assistant to generate complex SQL queries
- Comprehensive database administration with Task Scheduler

For database administrators, software developers, data analysts, data scientists, system administrators, and QA engineers, DBeaver brings the unique value of AI assistance in database development. Want to learn more about dbForge's advantages in comparison to its competitors? Go to our [website](https://www.devart.com/dbforge/edge/dbvisualizer-vs-dbeaver-vs-edge.html) and find the best solution for you.
[DBeaver alternatives: A comparison with dbForge Edge](https://www.devart.com/dbforge/edge/dbeaver-alternative.html) Explore the differences and similarities between DBeaver and dbForge Edge database management IDEs, the top-notch solutions for most of your routine tasks.

Navicat Premium

[Overview >](https://navicat.com/) [Try for free >](https://www.navicat.com/en/download/navicat-premium)

Installation. Compatible with Windows, macOS, and Linux.

Navicat Premium is a solution for database development with a worldwide community. It is compatible with multiple databases, including MySQL, PostgreSQL, MongoDB, MariaDB, SQL Server, Oracle, SQLite, and Redis. Navicat Premium has essential tools for data migration, execution of complex queries, and database design.

Key highlights:

- Easy-to-navigate interface for efficient database administration
- Professional object designer that allows you to create, modify, and design database objects without writing scripts
- Support for MongoDB 4 transactions to ensure data integrity
- Graphical view of the schema for better visualization
- A wide range of built-in tools (Code Completion, Query Builder, Code Snippet, etc.) for increased productivity

With Navicat Premium, database administrators, developers, data analysts, and QA engineers can enhance their productivity while working with databases.

[Alternative to Navicat: An in-depth comparison with dbForge Edge](https://www.devart.com/dbforge/edge/navicat-alternative.html) Find out the key differences and similarities between Navicat Premium and dbForge Edge database management solutions to choose the option that fits your project.

RazorSQL

[Overview >](https://razorsql.com/) [Try for free >](https://razorsql.com/download.html)

Installation. Compatible with Windows, macOS, and Linux.

RazorSQL is an IDE tested on more than 40 database management systems.
It offers a wide range of features, from a Database Navigator to PHP Bridges that connect MySQL, PostgreSQL, and SQL Server remotely via PHP-enabled services.

Key highlights:

- Compatibility with a variety of database management systems
- Visual tools that help to create, edit, and execute various database objects
- Command-line support
- EditRocket code editor for SQL scripts, which supports over 20 programming languages

Database administrators, data analysts, database architects, and system administrators can significantly benefit from RazorSQL's features, which will improve their workflow.

[Comparing RazorSQL, DBeaver, and dbForge Edge](https://www.devart.com/dbforge/edge/razorsql-vs-dbeaver-vs-edge.html) Wondering which database management solution would suit you better? Explore the strengths of RazorSQL, DBeaver, and dbForge Edge, and make a fact-based decision on which of them fits you the most.

Aqua Data Studio

[Overview >](https://aquadatastudio.com/) [Try for free >](https://aquadatastudio.com/free-trial/)

Installation. Compatible with Windows, macOS, and Linux.

[Aqua Data Studio](https://www.devart.com/dbforge/edge/aquafold-aqua-data-studio-vs-dbforge-edge.html) is a versatile database management solution for relational, cloud, and NoSQL databases. It supports over 40 DBMSs. This IDE enables users to streamline their technical tasks, simplify visualization of a database, optimize its design, and thus accelerate productivity.

Key highlights:

- Customizable dashboards to effortlessly interpret users' data
- Comprehensive data modeling to streamline workflows
- Seamless database comparison and synchronization
- Automated change deployment
- Object search to quickly locate any database object
- Different authorization types for secure access

For database developers, data and business analysts, data modelers and architects, and database administrators, Aqua Data Studio presents a unique product for seamless collaboration.
[A detailed comparison of AquaFold Aqua Data Studio and dbForge Edge](https://www.devart.com/dbforge/edge/aquafold-aqua-data-studio-vs-dbforge-edge.html) Explore the detailed comparison of dbForge Edge and Aqua Data Studio database management IDEs to learn which would suit your project better in terms of your routine tasks and requirements.

Beekeeper Studio

[Overview >](https://www.beekeeperstudio.io/) [Try for free >](https://www.beekeeperstudio.io/get)

Installation. Compatible with Windows, macOS, and Linux.

Beekeeper Studio is one of the most popular IDEs with a wide range of supported databases, including MySQL, PostgreSQL, SQLite, SQL Server, and Apache Cassandra.

Key highlights:

- Autocompletion and syntax highlighting
- SSL and SSH encryption for data security
- Staging of changes for fast and simple table editing
- No need to write code for table creation
- Quick data export options

Software developers, data analysts, database administrators, and DevOps engineers can benefit from Beekeeper Studio due to its syntax highlighting and quick data export features.

[A detailed comparison of Beekeeper Studio and dbForge Edge](https://www.devart.com/dbforge/edge/beekeeper-studio-alternatives.html) Learn about the pros and cons of the Beekeeper Studio database management solution, and how it stands against dbForge Edge, to make an informed choice for your project.

DataGrip

[Overview >](https://www.jetbrains.com/datagrip/) [Try for free >](https://www.jetbrains.com/datagrip/download/)

Installation. Compatible with Windows, macOS, and Linux.

DataGrip is a powerful cross-platform tool for relational and most non-relational (NoSQL) databases, among them PostgreSQL, MySQL, Oracle, Microsoft SQL Server, and MongoDB. It provides its users with versatile integrated tools to help them work with complex queries.

Key highlights:

- A paid DataGrip AI Assistant to help users write better SQL queries
- A wide range of supported database management systems
- Intelligent query console to keep track of the local query history
- Context-sensitive code completion to write SQL code faster
- Quick bug detection
- Adjustable UI that suits any customer's needs

DataGrip users are database developers, software engineers, backend developers, data analysts, data scientists, and DevOps engineers.

[DataGrip vs DBeaver vs dbForge Edge](https://www.devart.com/dbforge/edge/datagrip-vs-dbeaver-vs-dbforge-edge.html) Learn about the key differences and similarities between DataGrip and dbForge Edge database management IDEs, and choose the solution that fits your project perfectly.

Comparison of features

Now, let's move on to the precise comparison of DbVisualizer's alternatives feature by feature. You need to consider these features before deciding which alternative to DbVisualizer you want to choose.

SQL editing and execution

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Automatic SQL syntax check | – | + | – | – | + | – | – | + |
| Code outlining | – | + | + | + | – | + | – | + |
| Code snippets | – | + | + | + | – | + | – | + |
| Customizable SQL formatting | + | + | + | + | + | + | + | + |
| Execute current statement | + | + | + | + | + | + | – | – |
| Quick access to favorite templates from SQL Editor | – | + | + | + | – | – | – | + |
| SQL Editor with syntax coloring | + | + | + | + | + | + | + | + |
| Bookmarks | – | + | – | – | + | – | – | + |
| Text searching | + | + | + | + | + | + | – | + |
| Document Outline window | – | + | – | + | + | – | – | + |
| One-step access to the object editor | – | + | + | – | – | – | – | + |
| SQL history window for the document | + | + | + | + | + | + | + | + |
| Execution warnings | + | + | – | – | + | + | – | + |
| GUI transaction support | + | + | + | + | + | – | – | + |
| Data masking | – | – | – | – | – | + | – | – |

As you can see in the table, some IDEs have more functions when it comes to SQL editing and execution. For example, unlike DbVisualizer, dbForge Edge and DataGrip offer you automatic syntax checks, code outlining, code snippets, quick access to favorite templates, and one-step access to the object editor.
Code completion

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Code snippets and Snippet Manager | + | + | + | + | – | + | – | + |
| Context-sensitive code completion | + | + | + | + | + | – | + | + |
| One-click access to definitions of schema objects | – | + | + | – | – | + | – | + |
| On-the-fly renaming of database objects | – | + | – | – | – | – | – | + |
| Extended options for code formatting | + | + | + | – | + | + | – | + |
| Auto-generation of table aliases | – | + | + | – | – | – | – | + |
| Quick information about database objects | – | + | – | – | – | – | – | + |
| Parameter information for stored routines | – | + | – | – | + | – | – | + |
| JOIN clause auto generation | – | + | – | – | – | – | – | + |

When it comes to the Code Completion feature, dbForge Edge, DBeaver, and DataGrip offer a broader range of features. For instance, dbForge has parameter information for stored routines and JOIN clause auto-generation, unlike DbVisualizer.

Visual query builder

dbForge Edge supports Visual Query Builder (except for PostgreSQL). Compared to dbForge Edge, the query builder functionality of Aqua Data Studio, DBeaver, RazorSQL, and DbVisualizer is quite limited, while Beekeeper Studio and DataGrip do not have a Visual Query Builder at all.

Database design

DbVisualizer, RazorSQL, and Beekeeper Studio do not have any functionality for Visual Database Design. dbForge Edge (except for PostgreSQL), on the other hand, has a full set of functions (from visualization of tables to zooming in and out) for database design.

Table designer

In Navicat Premium and dbForge Edge, users can find a full set of functions for Table Designer (flat table editor, partitioning support, etc.). DbVisualizer, on the other hand, does not have partitioning support or a definition of the data types for new columns.
Object editor

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Check constraint | + | + | + | + | + | – | – | + |
| Foreign key | + | + | + | + | – | + | – | + |
| Index | + | + | + | + | – | + | – | + |
| Stored function | + | + | + | + | + | + | – | – |
| Stored procedure | + | + | + | + | + | + | – | + |
| Table | + | + | + | + | + | + | – | + |
| Trigger | + | + | + | + | – | + | – | – |
| View | + | + | + | + | + | + | – | + |
| Linked Server | + | – | + | + | – | + | – | + |
| Undo option for the object editor | + | + | + | + | + | + | – | + |

The table above shows that dbForge, DBeaver, Navicat Premium, and Aqua Data contain the most extensive functionality for Object Editor, compared to other IDEs.

Debugger

DbVisualizer, RazorSQL, Beekeeper Studio, and DBeaver do not offer debugging functionality. However, alternatives like Aqua Data Studio, dbForge Edge (for the majority of RDBMSs), DataGrip, and Navicat Premium have the Debugger feature.

Database explorer

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Multiple database connections allowed | + | + | + | + | + | + | – | + |
| Browse and navigate through objects | + | + | + | + | – | + | – | + |
| Detailed object properties and data browsing in the Object Viewer window | – | + | + | + | – | + | – | + |
| Dependency tree browsing for each object | – | + | – | – | – | + | – | – |
| Quick template script generation | + | + | + | – | + | + | – | + |
| Send To command | – | + | – | + | – | + | – | + |
| Refactoring of database objects | – | – | – | + | + | – | – | + |
| Quick filter | + | – | + | + | + | – | – | + |

Data editor

dbForge Edge offers broad functionality for data editing. Other IDEs (Aqua Data Studio, DataGrip, Navicat Premium, DbVisualizer, and RazorSQL) have the Data Editor feature but with limited functionality. Beekeeper Studio does not provide it.

Schema and data comparison and synchronization

Schema and Data Comparison functionality is fully implemented in dbForge Edge only. Other alternatives (e.g., DataGrip or RazorSQL) only come with a comparison of database objects or a command-line interface without other useful functions.
Data analysis

As for the Data Analysis feature, DbVisualizer only has a chart building wizard, and RazorSQL only has a data search wizard for a live database, without other crucial functions. dbForge Edge, on the other hand, has all the necessary functions for comprehensive data analysis (data report designer, master-detail browser, generation wizard, pivot table designer, etc.).

Performance tuning

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Visual SQL Explain plan | + | + | + | + | – | + | – | + |
| Session statistics displayed in a UI | – | + | – | + | – | + | – | + |
| Plan of the query displayed in a tree view | + | + | + | + | + | + | – | + |
| Profiling history that can be saved for further analysis | – | + | + | – | – | – | – | – |
| Compare profiling results with differences highlighted | – | + | – | – | – | – | – | – |
| Printing of profiling results | – | + | + | – | – | + | – | – |

Regarding the Performance Tuning feature, dbForge Edge seems to be the only IDE with all the functions necessary for effective performance tuning.

Test data generation

Only dbForge Edge has a full range of functions for test data generation. DBeaver and Navicat only have a few functions for this feature. Other IDEs (Aqua Data Studio, DataGrip, RazorSQL, DbVisualizer, and Beekeeper Studio) lack this feature.

Database documentation

Unlike the other IDEs discussed in this article, only dbForge Edge has a Database Documenter (with the exception of PostgreSQL).

Database administration

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Copy databases | – | + | – | – | – | – | – | – |
| Restore databases | – | + | + | + | – | + | – | – |
| Database snapshot | – | + | – | – | – | – | – | – |
| Database backup | + | + | + | + | + | + | – | – |
| Database backup to SQL and ZIP | + | + | – | + | – | – | – | – |
| Database backup as a scripts folder | – | + | – | – | – | – | – | – |
| CLI wizard for Database Backup | – | + | – | – | + | – | – | – |

The Administration feature is the most fully implemented in dbForge Edge.
Other solutions (DbVisualizer, DBeaver, Navicat, RazorSQL, and Aqua Data Studio) offer only a few database administration functions. In DataGrip and Beekeeper, there is no database administration functionality.

User interface

| Features list | DbVisualizer | dbForge Edge | DBeaver | Navicat Premium | RazorSQL | Aqua Data | Beekeeper Studio | DataGrip |
|---|---|---|---|---|---|---|---|---|
| Start page with easy access to main product features | – | + | – | – | – | – | – | + |
| Rich user settings | + | + | + | + | + | + | – | + |
| UI skins | – | + | – | + | + | – | + | + |
| Customizable window layout | – | + | + | + | – | + | – | + |
| Tool windows | – | + | – | – | – | – | – | + |
| Multiple shortcut schemes | + | + | + | – | + | – | – | + |
| Syntax highlight customization | – | + | + | + | + | + | – | + |
| Tabbed groups for documents | – | + | – | – | – | – | – | + |
| Toolbar customization | – | + | + | – | + | + | – | + |
| Wizard for sharing common code standards and templates | – | + | – | – | + | – | – | + |

As you can see from the table above, dbForge Edge and DataGrip are the only IDEs that offer a rich set of functions for an intuitive user interface.

Conclusion

DbVisualizer remains a solid choice for database management, but there are several powerful alternatives. dbForge Edge, DBeaver, DataGrip, Aqua Data Studio, Beekeeper, RazorSQL, and Navicat Premium each offer unique features for enhancing users' database management experience. However, dbForge stands out for those looking for such powerful tools as a visual query builder, data analysis, and performance tuning. Try dbForge today and take your database management to the next level.

[Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
"} {"url": "https://blog.devart.com/best-data-integration-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "Best Data Integration Tools for 2025: Features, Pricing, and Use Cases By [Victoria Shyrokova](https://blog.devart.com/author/victorias) February 27,
2025

Dealing with different data sources without efficient data integration tools turns valuable information into a liability. These solutions give you a unified view of your data, eliminating silos and ensuring data consistency, accuracy, and accessibility. But where do you start? Navigating the 2025 data integration market, with the shift towards scalable cloud platforms, self-service integrations, and AI automation, can be tough. In this article, we’ll break down the best data integration tools in 2025 and give you practical tips to help you find the perfect fit for your business.

Table of contents
- Top data integration tools for enterprises in 2025
- Best data integration tools for small and medium-sized businesses (SMBs)
- Leading no-code and low-code data integration tools
- Open-source and developer-focused data integration tools
- Specialized data integration solutions for business intelligence (BI) and reporting
- How to choose the right data integration tool for your business?
- Conclusion

Top data integration tools for enterprises in 2025
If you’re in a large organization, you need robust, scalable, and secure tools for data integration. The list below includes solutions that offer comprehensive automation, cloud integrations, and API support, with a focus on scalability and security.

Microsoft Azure Data Factory
[Azure Data Factory](https://azure.microsoft.com/en-us/products/data-factory) (ADF) is a cloud-native service from Microsoft designed to create and manage ETL and ELT processes for enterprise data workflows. It’s strong for basic orchestration and data movement, and scales dynamically to handle varying data loads. Plus, since it’s built right into Azure, you get seamless integration with other Microsoft services and the benefit of Azure’s strong security.
Pricing: Consumption-based; charges for pipeline runs, data flows, and runtime.
Use cases:
- Orchestrate data pipelines for Azure SQL and Cosmos DB data warehousing.
- Automate real-time CDC data integration.
- Ingest and prepare IoT data streams for downstream analytics (with Stream Analytics).

Cleo Integration Cloud
[Cleo Integration Cloud](https://www.cleo.com/cleo-integration-cloud) (CIC) is a better fit for manufacturing, logistics, and retail organizations, or those with complex supply chain and EDI needs. With its API-first design, CIC lets you create custom integrations, automate complex B2B workflows, and streamline data exchange with trading partners.
Pricing: Subscription-based; tailored to organizational size and needs.
Use cases:
- Automate EDI-based order processing and real-time inventory updates with suppliers.
- Track shipments in real time.
- Synchronize product catalogs and pricing with distributors via automated API integrations.

Devart ODBC Drivers
[Devart ODBC Drivers](https://www.devart.com/odbc/) offer a streamlined and secure way to connect your applications to over 25 popular databases like SQL Server, Oracle, and MySQL — no need to install extra software. You can also connect directly to 60+ cloud data sources using standard HTTP or a proxy server. In addition, they work on any system and integrate with a wide range of third-party data tools.
Pricing: Flexible licensing; tailored to desktop, server, or enterprise needs. Offers a 30-day free trial.
Use cases:
- Integrate real-time data from CRMs into business intelligence tools for reporting.
- Migrate data easily between on-premise and cloud databases.
- Create custom data connections for internal apps.

Best data integration tools for small and medium-sized businesses (SMBs)
SMBs need practical, accessible data integration solutions that are easy to set up and use, even without a big IT team. Here, we’ve selected tools that deliver value quickly, focusing on user-friendliness, budget-friendly pricing, and accessible support.
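The ODBC approach described above is language-agnostic: once a driver and a DSN are configured in the ODBC Data Source Administrator, any ODBC-capable client can query the source. Below is a minimal, hypothetical sketch using Python's third-party pyodbc package; the DSN name `CRM_DSN`, the user name, and the `Contacts` table are invented for illustration.

```python
# Sketch: querying a data source through an ODBC driver via pyodbc.
# "CRM_DSN", "report_user", and the Contacts table are assumptions --
# substitute the DSN and credentials you actually configured.

def build_conn_str(dsn: str, uid: str = "", pwd: str = "") -> str:
    """Assemble an ODBC connection string from keyword=value pairs."""
    parts = [f"DSN={dsn}"]
    if uid:
        parts.append(f"UID={uid}")
    if pwd:
        parts.append(f"PWD={pwd}")
    return ";".join(parts)

def fetch_rows(conn_str: str, sql: str):
    """Run a query through pyodbc; requires a live, configured DSN."""
    import pyodbc  # third-party: pip install pyodbc
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(sql).fetchall()

if __name__ == "__main__":
    conn_str = build_conn_str("CRM_DSN", uid="report_user")
    print(conn_str)  # DSN=CRM_DSN;UID=report_user
    # fetch_rows(conn_str, "SELECT Id, Name FROM Contacts")  # needs a live DSN
```

The same connection string works from any ODBC consumer (Excel, BI tools, ETL pipelines), which is what makes the driver layer a convenient integration point.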
Hevo Data
If fast, no-code pipeline deployment is your priority, [Hevo Data](https://hevodata.com/) is the best option. Just point and click to automate data flows, including real-time replication from over 150 pre-built sources, like Firebase Analytics, NetSuite, and BigCommerce. Plus, you don’t have to deal with any ETL infrastructure, and it comes with a library of pre-built integrations to speed up data onboarding.
Pricing: Subscription-based; charges based on the number of records processed. Offers a 14-day free trial.
Use cases:
- Automate customer data sync (CRM to marketing).
- Replicate e-commerce sales data for real-time reporting.
- Streamline marketing analytics data flows.

Celigo
[Celigo](https://www.celigo.com/) is a strong iPaaS for SMBs automating workflows between cloud apps like NetSuite, Salesforce, and Shopify. Unlike Hevo Data, it focuses on streamlining business processes using a low-code UI with pre-built templates for ERP, CRM, and finance systems. As an added bonus, it offers AI-assisted integration design and mapping.
Pricing: Tiered subscriptions based on features and usage. Offers a 30-day free trial with limited endpoints.
Use cases:
- Automate order-to-cash workflows between ERP and CRM.
- Synchronize customer data between marketing automation and CRM.
- Automate inventory management across supply chain applications.

Devart Excel Add-ins
Need to perform a quick analysis directly within Excel? [Devart Excel Add-ins](https://www.devart.com/excel-addins/) let you connect your databases and cloud applications without leaving your spreadsheet. You can use SQL or its simple drag-and-drop query builder to get exactly what you need, refresh it instantly, and make edits to the source data without switching applications.
Pricing: Flexible one-time purchase; per add-in, Database/Cloud pack, or Universal pack. 30-day free trial available.
Use cases:
- Build real-time financial reports and automate data refreshes.
- Analyze sales performance across multiple cloud platforms without exporting CSVs.
- Create quick data visualizations by querying and updating database tables from Excel.

Leading no-code and low-code data integration tools
No-code and low-code tools for data integration allow anyone to automate workflows with visual tools, drag-and-drop interfaces, and pre-built components. Among these solutions, the best are:

Jitterbit
[Jitterbit](https://www.jitterbit.com/)’s Harmony platform simplifies API integration with a drag-and-drop interface to create standards-based HTTP/REST connectors and custom APIs. Additionally, it comes with over 200 pre-built connectors for popular SaaS apps and on-prem systems, so it’s well suited to connecting legacy SQL applications to new cloud systems.
Pricing: Subscription-based; tiered by features and usage volume. Offers a 14-day free trial.
Use cases:
- Automate order fulfillment between online stores and ERP systems.
- Sync patient data between healthcare applications.
- Connect financial systems for real-time reporting.

Devart dotConnect
If you want to build a custom integration and need fine-grained control, [dotConnect](https://www.devart.com/dotconnect) is a strong option. Built specifically for C# and .NET, it provides high-performance access to major databases through ADO.NET data providers. Plus, it simplifies data manipulation with a visual ORM designer, robust LINQ, EF Core, and Entity Framework support, and database-specific optimizations.
Pricing: Subscription-based; varies based on your chosen ADO.NET data providers, license type, and edition. 30-day free trial available.
Use cases:
- Build custom data migration tools.
- Develop data-driven web services.
- Create internal applications that connect to diverse database systems.

Open-source and developer-focused data integration tools
Developer-focused tools are the smartest choice for in-house development teams building custom data pipelines and workflows.
Here are some top options for extensive customization and direct data control:

Apache NiFi
[Apache NiFi](https://nifi.apache.org/) is a popular open-source ETL tool. Its visual, flow-based interface makes building real-time data pipelines straightforward, even at scale. In addition, its unique features (e.g., flow files, provenance events, and Groovy processors) give you plenty of flexibility. This makes it a go-to for teams managing high-volume data ingestion and pre-processing.
Pricing: Free to use.
Use cases:
- Real-time data ingestion from IoT devices.
- Building complex data routing and transformation pipelines.
- Managing data flow in event-driven architectures.

Devart DAC (Data Access Components)
Unlike generic data access components, [Devart DAC](https://www.devart.com/dac.html) gives Delphi, C++Builder, and Lazarus developers fast, native database access. You can work seamlessly with the most popular databases and build cross-platform, database-heavy apps without complex configurations.
Pricing: Subscription-based; varies based on your chosen data access component, license type, and edition. 30-day free trial available.
Use cases:
- Developing high-performance desktop and server applications.
- Building data-centric applications with optimized database interactions.
- Implementing complex data access logic within Delphi and .NET environments.

Specialized data integration solutions for business intelligence (BI) and reporting
There are also many data integration tools with simplified real-time reporting, advanced analytics, easy data visualizations, and direct integrations with leading BI platforms. Our top two include:

Adverity
Want to get a clear picture of your marketing and sales performance? You can use [Adverity](https://www.adverity.com/)’s library of 600+ pre-built connectors to pull data from a wide range of sources.
It automates data pipelines for real-time performance reporting, using AI to handle complex data transformation, mapping, and continuous monitoring.
Pricing: Custom pricing; based on organization needs and requirements.
Use cases:
- Analyze real-time campaign performance across multiple ad networks to optimize bidding and targeting.
- Track the customer journey across all sales channels to identify conversion bottlenecks and improve ROI.
- Create unified dashboards to monitor cross-channel campaign effectiveness and inform marketing strategy.

Devart SSIS Components
[Devart SSIS Components](https://www.devart.com/ssis/) optimize the ETL processes inside your SSIS packages. Expect support for a wide range of databases, warehouses, and cloud applications, along with quick bulk data transfers and streamlined lookup transformations. This combination lets you build efficient data pipelines much faster.
Pricing: Flexible licensing; tailored to desktop, server, or enterprise needs. Offers a 30-day free trial.
Use cases:
- Build robust data warehouses with complex ETL processes.
- Integrate data from cloud applications like Salesforce and Dynamics 365 into SSIS.

How to choose the right data integration tool for your business?
When choosing data integration tools, focus on your specific needs. Here are a few tips:
- For growth, ensure the tool scales with your data volume and meets compliance standards.
- Match the tool’s complexity to your team’s skills for rapid deployment.
- Cloud-based tools often suit SMBs, while complex enterprise integrations may require on-premise solutions.
- Factor in budget and existing infrastructure when deciding.

Conclusion
We’ve explored diverse data integration solutions, each tailored for specific needs and challenges. As AI automation, self-service, and cloud-based features become standard, it’s key to pick a tool that can remain flexible and scale alongside your organization.
Devart has a wide suite of data integration solutions with free trials, so it’s a good place to start. If you need help choosing the right fit, you can contact the sales team for more information.
Tags [dac](https://blog.devart.com/tag/dac) [excel add-ins](https://blog.devart.com/tag/excel-add-ins) [odbc](https://blog.devart.com/tag/odbc) [SSIS](https://blog.devart.com/tag/ssis) [Victoria Shyrokova](https://blog.devart.com/author/victorias)
"} {"url": "https://blog.devart.com/best-database-diagram-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "Best Database Diagram Tools – Free and Paid By [Rosemary Asufi](https://blog.devart.com/author/rosemarya) May 5, 2025

Scalable systems thrive on schema clarity—without it, you pay in rework. That’s why [96%](https://icepanel.medium.com/state-of-software-architecture-report-2024-31eab5fe2c88) of engineers now use visual tools, particularly database diagram tools, to map and manage their databases. Diagramming tools bring structure to complexity. They surface design flaws early, improve collaboration across teams, and turn undocumented systems into shared infrastructure. These advantages are fueling widespread adoption. But not every tool is built to support the way modern teams design, scale, and deliver. This guide profiles the 10 best database diagram tools of 2025, free and paid, selected for teams building with speed, scale, and clarity in mind. Let’s dive in!

Table of contents
What is a database diagram tool?
How the top tools stack up
How we selected the top 10 database diagram tools
1. dbForge Studio for SQL Server
2. dbdiagram.io
3. Lucidchart
4. QuickDBD
5. ERD Plus
6. DrawSQL
7. Miro
8. Creately
9. DbSchema
10.
SqlDBM
Other notable database diagram tools
How to choose a database diagram tool
Built-in database diagramming in management tools
Why choose dbForge Studio for SQL Server?
Conclusion

What is a database diagram tool?
A database diagram tool is a type of data modeling tool, primarily focused on the visual design of database schemas such as ER diagrams. It enables engineering teams to design, inspect, and maintain schema architecture with greater precision, especially as systems scale and evolve. At the heart of these tools are Entity-Relationship Diagrams (ERDs)—visual representations of tables, keys, and relationships. But modern diagramming tools go beyond basic visuals. They connect directly to live databases, support forward and reverse engineering, and simplify SQL database schema design.

The use cases include:
- Greenfield design: A clean slate needs a clear plan. ERD tools help teams design schemas visually—before writing any SQL. It’s the smartest way to shape structure while ideas are still evolving.
- Migration planning: Platform shifts and refactors are risky. Diagrams expose structural issues early—missing keys, broken relationships, orphaned tables—before they become production problems.
- System documentation: When your schema only exists in someone’s head, continuity is at risk. ERDs give teams a reliable, shared reference that stays useful long after handoffs.
- Reverse engineering: Inherited a database with no docs? Plug it into an ERD tool and get instant clarity. It’s the fastest way to understand what you’re really working with.

In short, database diagram tools are about clarity, risk reduction, and faster delivery. Now, let’s explore the best database diagram tools on the market.

How the top tools stack up
Here’s a quick snapshot of how the leading database diagram tools compare across essential features—platform, SQL Server support, collaboration options, and more.
| Tool name | Free/Trial | Platform | SQL Server support | Teamwork | Export | Trial | Best for | Rating |
|---|---|---|---|---|---|---|---|---|
| dbForge Studio | Free plan + 30-day trial | Desktop | Yes | Yes (version control) | SQL, reports | 30 days | Enterprise SQL Server teams | ⭐ 4.8 |
| dbdiagram.io | Free plan | Web | Yes (via SQL) | Yes (link sharing) | PNG, SQL, PDF | N/A | Fast visual sharing | ⭐ 4.3 |
| Lucidchart | Free plan | Web | Yes (manual) | Yes (real-time) | PDF, PNG, SQL | 7 days | Cross-functional teams | ⭐ 4.5 |
| QuickDBD | Free plan | Web | Yes | Yes (real-time) | SQL, PDF, PNG | N/A | Keyboard-first devs | ⭐ 4.8 |
| ERD Plus | Free | Web | Limited | No | Image, PDF | N/A | Students & educators | ⭐ 4.3 |
| DrawSQL | Free plan + 14-day trial | Web | Yes | Yes (real-time) | SQL, PNG, PDF | 14 days | Design teams | N/A |
| Miro | Free plan | Web | No | Yes (chat, video, real-time) | Image, PDF | N/A | Workshops and early planning | ⭐ 4.7 |
| Creately | Free plan | Web/Desktop | Yes (manual modeling) | Yes (live cursors) | PDF, PNG, SVG | N/A | Visual thinkers | ⭐ 4.4 |
| DbSchema | 30-day trial | Desktop | Yes | Limited (file sharing) | SQL, PDF, PNG | 30 days | Cross-platform devs | ⭐ 4.0 |
| SqlDBM | Free trial | Web | Yes | Yes (versioning, sharing) | SQL, image, PDF | 14 days | Cloud-native teams | ⭐ 5 |

From fast, browser-based apps to fully featured development environments, the tools in this list cover a wide range of use cases. Some are ideal for quick diagrams and lightweight planning, while others are built for deep integration, versioning, and team-scale workflows. If you’re looking for deeper insights into each tool, keep reading.

How we selected the top 10 database diagram tools
Our list is based on deep editorial review, real-world usability, and team fit. Each tool was chosen for its balance of functionality, clarity, and practical value to modern development teams. Selection criteria included usability, feature depth, workflow fit, collaboration support, and community feedback. Let’s explore the tools to help you make an informed decision.

1.
dbForge Studio for SQL Server
Best for full-featured SQL Server development and visual schema design
Free trial available + free Express Edition
From $259.95/user (one-time or subscription pricing)
Rating: 4.8/5 ([Capterra](https://www.capterra.com/p/241291/dbForge-Studio-for-SQL-Server/reviews/?utm_source))

[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is an enterprise-grade IDE and one of the most complete visual schema tools available for SQL Server database professionals. It brings together visual database design, SQL editing, data comparison, source control, and CI/CD tools—all inside a single, unified desktop environment. Most ERD tools stop at diagramming—dbForge goes further. It combines advanced ERD design with live schema editing, version control integration, and deployment features. It’s built for serious engineering teams managing complex SQL Server environments, where clarity, control, and speed all matter.

Features and integrations
Features include a visual ERD designer with real-time sync to live databases, a powerful SQL editor with code completion and formatting, schema/data comparison tools, and built-in DevOps capabilities like version control, test automation, and CI/CD pipeline support. Integrations include Git, SVN, TFS, Jenkins, Microsoft Azure, and Amazon RDS. It’s fully optimized for SQL Server environments and supports efficient collaboration in larger development teams.
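Forward engineering — turning a visual model into executable DDL — is one direction these designers automate. The toy Python sketch below illustrates the idea only; the dictionary-based model format is invented for the example, and real tools generate DDL from a graphical designer rather than code.

```python
# Illustrative sketch of "forward engineering": rendering a schema model
# as CREATE TABLE statements. The model format here is an assumption
# made up for this example, not any tool's actual representation.

model = {
    "customers": {"id": "INT PRIMARY KEY", "name": "VARCHAR(100)"},
    "orders": {
        "id": "INT PRIMARY KEY",
        "customer_id": "INT REFERENCES customers(id)",
    },
}

def to_ddl(model: dict) -> str:
    """Render each table in the model as a CREATE TABLE statement."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {definition}"
                            for name, definition in columns.items())
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n".join(statements)

print(to_ddl(model))
```

Diagram tools with bidirectional sync run this generation step whenever the model changes, and the reverse (DDL back to model) when the live database changes.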
Pros and cons
Pros:
- Visual ERD builder with bidirectional schema sync
- Built-in tools for schema and data comparison
- DevOps support with version control and automation
- Fast onboarding with clean UI and excellent documentation
- One IDE for design, coding, and deployment
Cons:
- Windows-only; no native macOS or Linux version (works on macOS and Linux through CrossOver by CodeWeavers)
- May feel heavy for teams needing just basic diagramming

Learn more about dbForge Studio for SQL Server:
▶ [Download Free Trial](https://www.devart.com/dbforge/sql/studio/download.html)
▶ [Compare Editions and Pricing](https://www.devart.com/dbforge/sql/studio/ordering.html)

2. dbdiagram.io
Best for quick, browser-based schema visualization and sharing
Free plan available
Paid plans from $8/month (billed annually)
Rating: 4.3/5 ([G2](https://www.g2.com/products/dbdiagram-io/reviews?utm_source))

[dbdiagram.io](https://dbdiagram.io/) is a lightweight, browser-based tool for fast ERD creation using either a custom DSL or raw SQL. Designed for simplicity and speed, it’s ideal for teams that need to sketch, share, and iterate on schema designs—without the overhead of installing full-scale software. The tool is purpose-built for simplicity. Developers can generate ERDs in seconds using minimal syntax, then export diagrams or share them with a link. It’s especially useful for remote teams, early-stage planning, or fast visual documentation of existing structures.

Features and integrations
Features include real-time collaboration, export options (PDF, PNG, SQL, etc.), and a clean DSL for quickly describing schema components. It also supports reverse engineering via SQL import from databases like PostgreSQL, MySQL, and SQL Server. Integrations include GitHub and dbdocs.io for embedding diagrams into project documentation or version-controlled repositories.
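Reverse engineering via SQL import, as described above, essentially means recovering tables and foreign-key relationships from DDL. The toy Python sketch below shows the kind of information such an importer extracts; it is illustrative only (real importers use a full SQL parser, not regular expressions), and the sample schema is invented for the example.

```python
import re

# Toy extractor: pull table names and foreign-key edges out of SQL DDL,
# the raw material an ERD tool turns into boxes and arrows.

TABLE_RE = re.compile(r"CREATE\s+TABLE\s+(\w+)", re.IGNORECASE)
FK_RE = re.compile(r"REFERENCES\s+(\w+)", re.IGNORECASE)

def extract_erd(ddl: str):
    """Return (tables, edges) where edges are (child, parent) FK pairs."""
    tables = TABLE_RE.findall(ddl)
    edges = []
    # Walk each CREATE TABLE block and record its outgoing references.
    blocks = re.split(r"CREATE\s+TABLE\s+", ddl, flags=re.IGNORECASE)[1:]
    for block, name in zip(blocks, tables):
        for parent in FK_RE.findall(block):
            edges.append((name, parent))
    return tables, edges

ddl = """
CREATE TABLE customers (id INT PRIMARY KEY);
CREATE TABLE orders (
  id INT PRIMARY KEY,
  customer_id INT REFERENCES customers(id)
);
"""
print(extract_erd(ddl))  # (['customers', 'orders'], [('orders', 'customers')])
```

The `(child, parent)` edge list is exactly what a diagramming tool needs to draw crow's-foot connectors between entities.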
Pros and cons
Pros:
- Instant ERD creation via browser—no install required
- Supports SQL and custom DSL input
- Easy export to image or SQL
- Great for documentation and quick sharing
Cons:
- Limited advanced features for enterprise workflows
- Collaboration features limited on the free tier
- Not ideal for deep schema editing or deployment work

Learn more about dbdiagram.io:
▶ [Try it free in your browser](https://dbdiagram.io/)
▶ [Compare plans and pricing](https://dbdiagram.io/pricing)

3. Lucidchart
Best for collaborative, cloud-based ERD design
Free plan available
Paid plans from $9/month (billed annually)
Rating: 4.5/5 ([G2](https://www.g2.com/products/lucid-software-inc-lucid-visual-collaboration-suite/reviews#reviews))

[Lucidchart](https://www.lucidchart.com/pages/examples/er-diagram-tool) is a browser-based ERD and diagramming platform built for team collaboration. With real-time editing, version history, and broad template support, it’s a strong fit for cross-functional teams mapping out databases or visualizing system architecture in distributed environments. Lucidchart stands out for its flexibility. It allows teams to import database schemas or build diagrams from templates, while also supporting live collaboration with commenting and version history. It’s cloud-native and easy to onboard, making it ideal for distributed teams or hybrid work environments.

Features and integrations
Features include data import from SQL and CSV files, drag-and-drop ERD shapes, real-time collaboration, Salesforce schema visualization, and over 1,000 templates. Its ERD tool supports crow’s foot notation, primary/foreign key linking, and schema export. Integrations include Google Workspace, Microsoft 365, Atlassian (Confluence/Jira), Salesforce, Slack, GitHub, and AWS.
Pros and cons
Pros:
- Real-time collaboration with version tracking
- Works on any device via browser
- Import/export options for SQL, CSV, and Salesforce schemas
- Extensive template and shape library
Cons:
- Database sync is manual—not bidirectional
- Advanced features locked behind higher-tier plans
- Performance may lag with large, complex diagrams

Learn more about Lucidchart:
▶ [Try Lucidchart for free](https://www.lucidchart.com/pages/examples/er-diagram-tool)
▶ [View pricing plans](https://lucid.app/pricing/lucidchart)
▶ [Explore ERD templates and documentation](https://www.lucidchart.com/pages/examples/er-diagram-tool)

4. QuickDBD
Best for fast, keyboard-first ER diagramming
Free plan available
Paid plans from $14/month or $95/year
Rating: 4.8/5 ([G2](https://www.g2.com/products/quickdbd/reviews?utm_source))

[QuickDBD](https://www.quickdatabasediagrams.com/) is a minimalist diagramming tool for developers, built to translate typed schemas into visuals in seconds. Users can type schema definitions in plain text and instantly generate professional ER diagrams—no mouse or drag-and-drop required. It’s ideal for developers who think in code but need a clean visual output. QuickDBD is built for momentum. While other tools prioritize feature depth, QuickDBD streamlines the diagramming process for developers who just want to sketch ideas, share structure, and get back to building.

Features and integrations
Features include a real-time diagram preview from plain text input, support for exporting to SQL, image, PDF, and RTF formats, as well as real-time collaboration and private/public sharing. It runs entirely in the browser, with no installation required. Currently, QuickDBD is offering promotional free access to Pro features in exchange for public feedback or honest reviews.
Pros and cons
Pros:
- Extremely fast and lightweight—designed for developers
- Keyboard-based diagram creation
- Exports to SQL, image, PDF, and more
- Real-time collaboration and sharing
Cons:
- Limited design customization
- No bidirectional live database sync
- Lacks advanced schema management and validation
- Basic UI; not ideal for large-scale enterprise work

Learn more about QuickDBD:
▶ [Try the app now](https://app.quickdatabasediagrams.com/#/)
▶ [Explore pricing](https://www.quickdatabasediagrams.com/#pricing)
▶ [Read FAQs and roadmap](https://www.quickdatabasediagrams.com/)

5. ERD Plus
Best for academic use and quick, no-login ERD creation
Completely free
Rating: 4.3/5 ([G2](https://www.g2.com/products/quickdbd/reviews?utm_source))

[ERD Plus](https://erdplus.com/) is a completely free, browser-based ERD generator designed for academic use. It supports basic ER diagrams, relational schemas, and normalization forms, making it a go-to for students and instructors who need fast, no-login tools to illustrate core database concepts. ERD Plus keeps things simple and distraction-free. It’s especially useful in classrooms and introductory database courses where the goal is to understand relationships—not navigate complex software. No logins, no paywalls—just clean diagrams.

Features and integrations
Features include support for Entity-Relationship Diagrams, Relational Schemas, UML Class Diagrams, and multi-valued attributes. It runs fully in-browser and allows for exporting diagrams as image files or PDFs. While basic in design, it’s fast and gets the job done. No third-party integrations are provided, as the tool is focused on standalone academic use.
Pros and cons
Pros:
- 100% free, no account or install needed
- Supports ER, relational, and UML diagrams
- Extremely lightweight and fast
- Great for teaching and learning database fundamentals
Cons:
- Not built for enterprise or production use
- Lacks real-time collaboration or export to SQL
- No integration with live databases or CI/CD workflows

Learn more about ERD Plus:
▶ [Use ERD Plus now](https://erdplus.com/)
▶ [Access example use cases and educational materials](https://erdplus.com/)

6. DrawSQL
Best for collaborative, cloud-based ERD design
Free plan available
Paid plans from $19/month
Rating: N/A

[DrawSQL](https://drawsql.app/) is a collaborative, web-based ERD platform that helps teams design, visualize, and document database schemas. Its real-time editing, intuitive interface, and SQL export make it especially useful for planning complex systems and maintaining up-to-date documentation. DrawSQL stands out for its emphasis on collaboration and ease of use. Its real-time editing capabilities and version control features make it ideal for teams working on complex database structures. Additionally, the ability to generate SQL scripts from diagrams streamlines the development process.

Features and integrations
Features include real-time collaboration, version history, export options (SQL, PNG, PDF), and a library of templates to jumpstart your designs. It supports major relational databases like MySQL, PostgreSQL, and SQL Server, ensuring compatibility with a wide range of projects.

Pros and cons
Pros:
- Real-time collaboration with team members
- Intuitive drag-and-drop interface
- Export diagrams to SQL, PNG, and PDF formats
- Extensive template library for quick diagram creation
Cons:
- Limited to relational databases
- Advanced features require a paid plan
- No offline mode available

Learn more about DrawSQL:
▶ [Try DrawSQL for free](https://drawsql.app/)
▶ [View pricing plans](https://www.softwaresuggest.com/drawsql)
▶ [Explore features and templates](https://drawsql.app/features)

7.
Miro
Best for whiteboard-style database brainstorming and collaboration
Free plan available
Paid plans from $8/month per user (billed annually)
Rating: 4.7/5 ([G2](https://www.g2.com/products/miro/reviews#reviews))

[Miro](https://miro.com/) is a flexible online whiteboard used for brainstorming and planning, including early-stage database design. While not a dedicated ERD tool, it supports collaborative schema sketching, system mapping, and team workshops—especially valuable during the exploratory phases of architecture planning. Miro’s strength is its flexibility. It’s not built for databases specifically, but its rich diagramming features, templates, and integrations make it a go-to tool for high-level system planning. It’s especially valuable when database design is part of a broader product or architecture discussion.

Features and integrations
Features include infinite canvas, drag-and-drop diagramming, customizable templates, sticky notes, mind maps, and voting tools. Miro also includes real-time collaboration, comments, video calls, and AI features for summarization and diagram automation. Integrations include Jira, Confluence, GitHub, Microsoft Teams, Google Workspace, Slack, Asana, Notion, and over 160 other tools.

Pros and cons
Pros:
- Extremely flexible for early-stage schema design
- Rich collaboration features (chat, comments, video, AI)
- Ideal for team workshops, planning, and brainstorming
- Extensive template library and integration ecosystem
Cons:
- Not a dedicated database diagramming tool
- Requires manual formatting for ERD-specific notation
- Lacks live schema syncing or SQL generation

Learn more about Miro:
▶ [Try Miro for free](https://miro.com/)
▶ [Compare pricing plans](https://miro.com/pricing/)

8.
Creately
Best for visually rich ERDs and cross-functional collaboration
Free plan available
Paid plans from $5/month per user
Rating: 4.4/5 ([G2](https://www.g2.com/products/creately/reviews#reviews))

[Creately](https://creately.com/) is a visual collaboration platform that makes ERD creation accessible to both technical and non-technical users. With a drag-and-drop interface, live collaboration, and support for multiple ER notations, it’s well-suited for teams designing data flows alongside broader system diagrams. Creately stands out for its combination of simplicity and functionality. The platform’s drag-and-drop interface, coupled with real-time collaboration features, makes it an excellent choice for teams looking to design ER diagrams without a steep learning curve.

Features and integrations
Features include support for multiple ERD notations (Chen’s, Crow’s Foot), real-time collaboration with live cursors, version history, and an extensive library of templates and shapes. Users can also embed documents and assets to centralize data around information system projects. Integrations encompass popular tools such as Google Workspace, Microsoft Teams, Slack, and Confluence, facilitating seamless collaboration across different platforms.

Pros and cons
Pros:
- Intuitive drag-and-drop interface
- Real-time collaboration with live cursors
- Extensive template and shape library
- Supports multiple ERD notations
Cons:
- Occasional performance issues with large diagrams
- Some advanced features require higher-tier plans
- Limited offline functionality

Learn more about Creately:
▶ [Try Creately for free](https://creately.com/)
▶ [View pricing plans](https://creately.com/plans/)

9.
DbSchema

Best for cross-platform, schema-centric database modeling
Free trial available; paid plans from $63 (one-time license)
Rating: 4.0/5 ([G2](https://www.g2.com/products/dbschema/reviews#reviews))

[DbSchema](https://dbschema.com/) is a cross-platform database design and management tool that supports both relational and NoSQL databases. Its visual editor, schema sync, and support for diverse platforms make it ideal for teams managing hybrid environments or planning migrations across systems. DbSchema stands out for its ability to handle complex database structures visually, regardless of the underlying database system. Its platform-independent approach and robust feature set make it a valuable tool for teams dealing with multiple database types or planning migrations.

Features and integrations

Features include visual schema design, interactive layouts, schema synchronization, data explorer, and documentation generation. It supports reverse engineering from existing databases and offers tools for designing and deploying schema changes. DbSchema integrates with various databases, including MySQL, PostgreSQL, SQL Server, Oracle, MongoDB, and more, providing flexibility for teams working in heterogeneous database environments.

Pros and cons

| Pros | Cons |
|------|------|
| Supports a wide range of relational and NoSQL databases | Interface may be overwhelming for new users |
| Visual schema design and synchronization tools | Some advanced features require additional configuration |
| Platform-independent (runs on Windows, macOS, Linux) | |
| Comprehensive documentation and support resources | |

Learn more about DbSchema:
▶ [Download Free Trial](https://dbschema.com/download.html)
▶ [View pricing plans](https://dbschema.com/purchase.html)

10.
SqlDBM

Best for cloud-native, collaborative data modeling at scale
Free trial available; paid plans from $29.4/month
Rating: 5/5 ([G2](https://www.g2.com/products/sqldbm/competitors/alternatives?utm_source))

[SqlDBM](https://sqldbm.com/) is a cloud-native data modeling tool built for collaborative database design. It supports platforms like Snowflake, SQL Server, and PostgreSQL, and includes features like version control, reverse engineering, and integration with dbt and Confluence—making it ideal for modern data teams. SqlDBM stands out for its cloud-native approach, eliminating the need for installations or complex setups. Its intuitive interface and collaborative features make it ideal for distributed teams working on data modeling projects.

Features and integrations

Features include visual database modeling, version control, reverse and forward engineering, and documentation generation. SqlDBM also offers integrations with platforms like dbt, Confluence, and Jira, enhancing its utility in modern data workflows.

Pros and cons

| Pros | Cons |
|------|------|
| Cloud-based with no installation required | Advanced features may require higher-tier plans |
| Supports multiple database platforms | Limited offline functionality |
| Collaborative features for team-based modeling | Interface may have a learning curve for new users |
| Integration with popular tools like dbt and Confluence | |

Learn more about SqlDBM:
▶ [Try SqlDBM for free](https://sqldbm.com/)
▶ [View pricing plans](https://sqldbm.com/Pricing/)
▶ [Explore features and integrations](https://sqldbm.com/)

Other Notable Database Diagram Tools

While the tools listed above are among the most widely used in 2025, there are a few additional options worth mentioning for teams with different needs:

- SmartDraw: A powerful diagramming tool that supports database modeling alongside flowcharts, org charts, and more. It offers templates for ER diagrams and integrates with tools like Confluence and Google Workspace.
- Canva: Although primarily known for graphic design, Canva offers templates and easy-to-use features that can be adapted for creating simple ER diagrams, especially for presentations or non-technical stakeholders.
- EdrawMax: A versatile diagramming platform supporting over 280 types of diagrams, including ER diagrams. It offers rich templates, drag-and-drop functionality, and cross-platform support.
- Visual Paradigm: A professional modeling tool offering database design, UML diagrams, BPMN, and more. It supports database engineering with forward and reverse engineering features and is ideal for large-scale system planning.
- ClickUp: A project management platform with powerful Whiteboard and Mind Map features for ERD creation. It offers templates, drag-and-drop entity mapping, and real-time collaboration—ideal for teams building complex data models alongside broader project workflows.

How to choose a database diagram tool

With dozens of ERD tools available, picking the right one depends less on features alone—and more on how well the tool fits into your team’s workflow, stack, and priorities. Here’s how to evaluate what matters:

1. Web-based vs. desktop

Web tools like dbdiagram.io, DrawSQL, and SqlDBM are ideal for remote teams, quick access, and easy sharing. They run in the browser, require no setup, and often include real-time collaboration. Desktop tools like dbForge Studio and DbSchema, on the other hand, offer deeper control, live database integration, and richer offline capabilities—ideal for complex enterprise environments.

2. Free vs. paid

If you’re prototyping, teaching, or diagramming casually, free data modeling software like ERD Plus, QuickDBD (free tier), or Creately (free plan) may cover your needs. But for production systems, audits, or CI/CD workflows, paid tools like dbForge, SqlDBM, or Lucidchart offer advanced features, support, and scalability that free versions can’t match.

3.
Collaboration features

Team collaboration is non-negotiable for modern development. Tools like Lucidchart, Miro, and DrawSQL are purpose-built for real-time teamwork, complete with live cursors, comments, and sharing links. If your team works asynchronously or across time zones, prioritize tools with built-in version control and cloud access.

4. Integration with your database stack

Not all tools speak the same SQL dialect. Ensure compatibility with your stack—whether it’s SQL Server (dbForge, SqlDBM), PostgreSQL (DbSchema, SqlDBM), MySQL (DbSchema, QuickDBD), or even MongoDB (less common in ERD tools). The tighter the integration, the more value you’ll get from reverse engineering, live sync, and schema deployment.

5. Ease of use vs. feature depth

If speed and simplicity matter more than feature depth, lean toward tools like QuickDBD or dbdiagram.io. If your team needs schema validation, automation, DevOps pipelines, or regulatory compliance, tools like dbForge Studio and DbSchema provide more robust environments.

Built-in Database Diagramming in Management Tools

In addition to standalone tools, it’s also worth mentioning that some database management environments offer built-in diagramming features. One notable example is SQL Server Management Studio (SSMS), which includes native support for creating and managing database diagrams directly within SQL Server environments. Here are the key diagramming features built into SSMS in detail:

- Create new diagram: Start a fresh layout from any database. This gives you a canvas to visualize selected tables and how they relate.
- Add related tables: Automatically pull in tables that share relationships with those already on the diagram, helping expose joins and dependencies you might otherwise overlook.
- Remove tables: Clean up clutter or focus the diagram by removing unnecessary tables from view. This doesn’t affect the actual database—just the visual model.
- Auto arrange: Instantly reorder and space out tables to reduce overlap and improve readability. A time-saver when working with large diagrams.
- Auto-size tables: Resize each table box to fit its content, ensuring columns are visible without manual adjustment.
- Show key columns / relationships: Highlight primary keys, foreign keys, and their connections to make data relationships more explicit.
- Copy to clipboard: Take a snapshot of the diagram for use in documents, presentations, or quick team discussions.
- Export to Word/PDF: Save diagrams as files for sharing or offline review, particularly useful when documenting systems or preparing audits.

While SSMS offers basic diagramming for SQL Server environments, teams that need more than static visuals will find dbForge Studio a major step up.

Why choose dbForge Studio for SQL Server?

[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is a full development environment designed for teams that view database design, deployment, and performance as strategic priorities. It brings advanced schema modeling, live database editing, version control, and DevOps workflows together in one streamlined platform. At its core is a powerful, bi-directional ERD designer—tightly coupled with live database editing, schema comparison, and deployment tools. But the real advantage lies in how it supports professional workflows at scale:

- Version control and automation ensure that changes are tracked, reviewed, and deployed with precision—critical for teams managing sensitive data or compliance-heavy systems.
- Built-in DevOps support bridges the gap between design and release, enabling CI/CD pipelines for database code alongside application code.
- Team collaboration features reduce friction across development, QA, and operations—turning schema diagrams into shared, actionable assets.

dbForge Studio is built for teams who need more than a diagram.
It’s for those who manage databases at scale, prioritize repeatability, and demand full visibility across environments. [Start your free trial](https://www.devart.com/dbforge/sql/studio/download.html) of dbForge Studio for SQL Server today and experience the difference of a development environment built for clarity, control, and scale.

Conclusion

A solid data architecture begins with well-planned schema design. And in 2025, diagramming isn’t just a convenience—it’s a necessity. The right tool does more than visualize tables. It gives teams the clarity to scale, the precision to reduce risk, and the flexibility to stay aligned across workflows. Whether you’re mapping new systems or managing production databases, choose the database diagram tool that fits how your team works—not just what your database supports. For SQL Server teams, dbForge Studio is a standout. It offers more than ERDs: integrated DevOps support, live schema editing, and deployment pipelines make it an all-in-one environment for serious development.

FAQ

What is a database diagram tool?

A database diagram tool is software used to visually design and document database structures, typically using Entity-Relationship Diagrams (ERDs). These tools help teams map tables, keys, and relationships, making it easier to plan, communicate, and maintain database architecture.

What’s the best free tool for drawing ER diagrams?

Tools like ERD Plus and dbdiagram.io are widely considered among the best free options. They allow quick ER diagram creation in the browser, with features like SQL export, collaborative sharing, and no installation required.

Can I design SQL Server schemas online?

Yes. Cloud-based tools like SqlDBM and dbdiagram.io support SQL Server schema design. While dbdiagram.io is ideal for quick sketches, SqlDBM offers deeper modeling capabilities and version control for team use.

How does dbForge Studio help with database diagramming?
dbForge Studio for SQL Server provides a powerful visual ERD designer with live schema editing, bidirectional sync, and integrated DevOps tools. It’s especially valuable for engineering teams managing complex SQL Server environments who need full control and clarity.

Top 9 Database Documentation Tools of 2024 — Free and Paid Options Unwrapped

By [Victoria Shyrokova](https://blog.devart.com/author/victorias), May 13, 2024

Database documentation tools usually aren’t the priority for the teams working on small projects. When coping with routine development tasks, it’s easy to overlook the importance of creating comprehensive database documentation since there are always more urgent tasks. However, when you become the only person who knows how things work, collaboration and making the data usable will eventually pose a real challenge. According to the 2023 Developer Survey by Stack Overflow, over 70% of developers encounter knowledge silos at work at least 1-2 times a week, needing assistance from their team members or colleagues outside of their teams. As a result, productivity frictions happen. Moreover, even to the developers in charge, the database structure may become unclear. Human memory has limitations, and creating clear database documentation is as essential as providing software documentation or adding concise comments within one’s code. It’s a way of making a database readable and easy to understand for everyone.
In this article, we will compare nine database documentation tools available in 2024 to determine which are most helpful without requiring much integration effort.

Evaluation Criteria for Database Documentation Tools

It’s easy to get lost in the wide range of paid and free database documentation tools that support different DBMSs and boast many features needed for specific tasks. Nevertheless, not every tool can handle the particular tasks you have in mind. That’s why we have prepared a list of evaluation criteria to provide unbiased insight into how these tools perform against one another. Let’s look at the most common criteria affecting the choice of database documentation tools.

Supported DBMS

Database documentation tools can either be used only with a specific DBMS or cover a range of database management systems. Naturally, you might have to narrow down your choice to Oracle database documentation tools if you use Oracle, or to MySQL database documentation tools if you use MySQL.

Documentation format

If your team is used to working with a certain documentation format, it makes sense to choose a database documenter tool that creates similar documentation, be it a PDF file or an HTML page with clear navigation.

Ease of use

Database documentation should make your team’s collaboration easier. Unfortunately, for some teams, it’s just a formality, and the documents aren’t frequently updated. That’s why it’s essential to pay attention to the ease of use of the database documentation tools: their ability to be integrated into CI/CD, the ways to automate regular updates, and how concisely everything can be organized. To provide an unbiased evaluation of ease of use, we advise you to consider the following criteria:

- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Thus, the Ease-of-use score might range from 0 to 4 based on the number of satisfied criteria.
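The scoring scheme above is simple enough to express in a few lines of code. Here is a minimal Python sketch (purely illustrative — the criteria names are our own, not part of any reviewed tool) showing how a 0–4 ease-of-use score is derived by counting satisfied criteria:

```python
# Illustrative only: compute the 0-4 ease-of-use score described above
# by counting how many of the four criteria a tool satisfies.
CRITERIA = (
    "gui_generation",    # documentation generation via GUI
    "cli_generation",    # documentation generation via command line
    "cicd_integration",  # integration with CI/CD processes
    "automation",        # potential automation
)

def ease_of_use_score(tool: dict) -> int:
    """Return how many of the four ease-of-use criteria the tool meets."""
    return sum(1 for criterion in CRITERIA if tool.get(criterion, False))

# Example: a command-line-only tool (like SchemaSpy in the chart) scores 1.
cli_only_tool = {"cli_generation": True}
print(ease_of_use_score(cli_only_tool))  # 1
```

The Customization score in the next section works the same way, just with its own four criteria.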
Customization options

Using corporate styles is a way of making every document look professional and precise, highlighting that it belongs to a particular company. However, not all database documenter tools provide you with an option to change a logo or customize typography settings. Moreover, only the best database documentation tools let you use WYSIWYG-based templates to visually adjust the content or define which tables and objects should be excluded. To evaluate customization options, we have listed the following features:

- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

The Customization score might range from 0 to 4 based on the number of satisfied criteria.

Licensing terms

Some of the free database documentation tools can be used only with GPL-licensed products, while in other cases, you must get a license to use the tool. The license terms are different for each of the database documentation tools. Some of them require purchasing a separate license for each installation or user.

Trial period and pricing

You cannot be sure you have found the right database documentation tool unless you give it a go. That’s why having a trial version showcasing the tool’s full potential is the best way to ensure the tool is worth buying.

Database Documentation Tools Comparison Chart

Now that we have listed the most important evaluation criteria, let’s move on to the tools that assist teams in creating concise documentation. Our chart comprises nine paid and free database documentation tools ranked based on the most important factors you have to consider when making a choice.

| Tool/Criteria | Supported DBMS | Documentation format | Ease of use (max. 4) | Customization options (max. 4) | Pricing starts from |
|---|---|---|---|---|---|
| dbForge Edge | SQL Server, MariaDB, MySQL, Oracle | HTML, PDF, Markdown | 4 | 4 | $699.95/yr. per license |
| dbdocs | MySQL, PostgreSQL, SQL Server | Cloud-based | 3 | 3 | $720/yr. per 3 paid projects |
| Dataedo | SQL Server, Oracle, MySQL, PostgreSQL, MariaDB | Web catalog, HTML, PDF, Excel | 4 | 4 | $18,000/yr. per 3 users |
| ApexSQL Doc | SQL Server databases, SSAS models, SSIS packages | CHM, HTML, DOC, Markdown, PDF | 4 | 4 | $729.47/yr. per instance |
| Redgate SQL Doc | SQL Server | HTML, PDF, DOC, Markdown | 4 | 4 | $279/yr. per user |
| SchemaSpy | SQL, can be adjusted for PostgreSQL | HTML | 1 | 0 | Free |
| Database Note Taker | SQL Server, MySQL | HTML, XML | 1 | 1 | Free |
| dbdesc | SQL Server, Oracle, MySQL, Microsoft Access, Firebird | HTML, RTF, DOC, PDF | 3 | 4 | $99 per license |
| TechWriter for Databases | Access, MySQL, Oracle, SAS, SQL Server, PostgreSQL, DB2 | PDF, CHM, RTF, HTML, XPS, XML | 3 | 3 | $600/yr. per license |

Certainly, the database documentation tools listed above require installation for you to evaluate them, as not all features are immediately apparent. It’s clear that it might take days, if not weeks, to test them all and decide which solution is optimal. That’s why we have come up with detailed descriptions that you can check to get a general idea of what each tool is capable of.

[dbForge Edge](https://www.devart.com/dbforge/edge/)

[Check the tool >](https://www.devart.com/dbforge/edge/) [30-day free trial >](https://www.devart.com/dbforge/edge/download.html)

Supported DBMS: SQL Server, MySQL, MariaDB, Oracle
Platforms: Windows, macOS, Linux
Documentation format: HTML, PDF, Markdown

dbForge Edge is an all-in-one secure IDE for database development, design, management, administration, and testing that comes with an intuitive and easy-to-use database documenter tool for automated documentation generation. This ultimate solution possesses a full set of features for visualization, customization, integration into CI/CD, and robust templates one can tweak according to their needs and branding styles.
With dbForge Edge, you get everything you need to master databases at once, never having to use separate tools for different tasks again.

Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Customization:
- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Tutorials: available
Used by: software and database developers, architects, managers, and analysts
Licensing terms:
- Yearly recurrent subscription or perpetual license
- Special offers for enterprise clients
- Free licensing for MVPs
Trial: 30 days
Pricing: from $699.95/yr. per license
[Check pricing options >](https://www.devart.com/dbforge/edge/ordering.html)

dbdocs

[Check the tool >](https://dbdocs.io/) [Live demo >](https://dbdocs.io/Holistics/Ecommerce)

Supported DBMS: MySQL, PostgreSQL, SQL Server
Platforms: Windows, macOS, Linux
Documentation format: Cloud-based

dbdocs is a simple SQL and MySQL database documentation tool launched from the command line that uses the DBML language to define and describe schemas and generates cloud-based documentation where one can browse and check relationships between tables and fields.
Ease of use:
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Customization:
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Used by: developers, database administrators
Licensing terms:
- Monthly recurrent subscription
- Yearly recurrent subscription
- White label
Trial: free limited version
Pricing: from $720 per year
[Check pricing options >](https://dbdocs.io/pricing)

Dataedo

[Check the tool >](https://dataedo.com/) [Live demo >](https://demo.dataedo.com/)

Supported DBMS: SQL Server, Oracle, MySQL, PostgreSQL, MariaDB
Platforms: Windows
Documentation format: Web catalog, HTML, PDF, Excel

Dataedo is a set of holistic Oracle, MySQL, PostgreSQL, and SQL Server database documentation tools empowering teams to collaborate on database documentation. Within this solution, there are options to catalog, visualize, profile, and document data while staying in sync.

Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Customization:
- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Tutorials: available
Used by: developers, database administrators
Licensing terms:
- Yearly recurrent subscription
- PoC license for 3 months
Trial: no
Pricing: from $18,000 per year
[Check pricing options >](https://dataedo.com/pricing)

ApexSQL Doc

[Check the tool >](https://www.apexsql.com/sql-tools-doc/) [Download >](https://www.apexsql.com/register/138546/)

Supported DBMS: SQL Server databases, SSAS models, SSIS packages
Platforms: Windows, macOS, Linux
Documentation format: CHM, HTML, DOC, Markdown, PDF

The ApexSQL tool for creating database documentation is part of the ApexSQL DevOps toolkit.
It helps automate SQL database management, working as an interface that is easy to use for non-techies, as well as fit for more complex tasks handled by experienced developers.

Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Customization:
- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Used by: developers, technical architects
Licensing terms: Yearly recurrent subscription per user
Trial: yes
Pricing: from $729.47 per year
[Check pricing options >](https://shop.quest.com/682/purl-apexsql-fundamentals-toolkit-for-mysql-pd?x-adcode=PD)

Redgate SQL Doc

[Check the tool >](https://www.red-gate.com/products/sql-doc/) [Download >](https://www.red-gate.com/products/sql-doc/trial/)

Supported DBMS: SQL Server
Platforms: Windows, Linux
Documentation format: HTML, PDF, DOC, Markdown

SQL Doc is a solution that helps automate manual database documentation routines, assisting in knowledge sharing and distributing documentation in a convenient format. The solution automatically gets information on object definitions and dependencies, and you can describe them in more detail manually.
Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes
- Potential automation

Customization:
- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Tutorials: available
Used by: developers, technical architects
Licensing terms: Yearly recurrent subscription
Trial: yes
Pricing: from $279 per year per user
[Check pricing options >](https://www.red-gate.com/products/sql-doc/pricing)

SchemaSpy

[Check the tool >](https://schemaspy.org/) [Live demo >](https://schemaspy.org/samples/epivirusurf/)

Supported DBMS: SQL, PostgreSQL
Platforms: Windows, macOS, Linux
Documentation format: HTML

SchemaSpy is a free SQL and PostgreSQL database documentation tool that works from a command line and can be configured to process and describe different types of databases, representing the schemas in a format that can be displayed in a browser.

Ease of use:
- Documentation generation via command line

Used by: developers
Licensing terms: Free, distributed under the GNU LGPL v3.0 license

Database Note Taker

[Check the tool >](https://databasenotetaker.com/) [Download >](https://databasenotetaker.com/pages/evaluate.aspx)

Supported DBMS: SQL Server, MySQL
Platforms: Windows
Documentation format: HTML, XML

Database Note Taker is a free solution with a simple interface that provides a straightforward way to generate and share database documentation based on the notes added for tables and objects. Whenever there are changes in the database structure, you will be able to refresh it, spot the missing descriptions, and update your docs on time.
Ease of use:
- Documentation generation via GUI

Customization:
- Selection of objects for documentation

Used by: developers
Licensing terms: Free for personal and commercial use

dbdesc

[Check the tool >](http://dbdesc.com/) [Download >](http://dbdesc.com/download.html)

Supported DBMS: SQL Server, Oracle, MySQL, Microsoft Access, Firebird
Platforms: Windows
Documentation format: HTML, RTF, DOC, PDF

dbdesc is a database documentation tool that generates data from tables, keys, definitions, objects, and dependencies, and provides an easy way to share it in several commonly used formats, which can be adjusted to one’s needs.

Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes

Customization:
- Theme selection for documentation
- Customization of cover page appearance
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Used by: developers
Licensing terms: Perpetual license
Trial: yes
Pricing: $99 per license
[Check pricing options >](http://dbdesc.com/purchase.html)

TechWriter for Databases

[Check the tool >](https://techwriter.me/techwriter-for-databases.aspx) [Download >](https://techwriter.me/account/login.aspx?ReturnUrl=%2ffree-downloads.aspx)

Supported DBMS: Access, MySQL, SAS, SQL Server, PostgreSQL, DB2
Platforms: Windows
Documentation format: PDF, CHM, RTF, HTML, XPS, and XML

TechWriter for Databases from Quarksoft has a set of basic features to generate database documentation using a clear interface with a WYSIWYG editor for formatting. Within this tool, one can use advanced filtering, showcase annotations, and document selected objects. Also, if you are looking for a straightforward database documentation tool, this solution can come in handy.
Ease of use:
- Documentation generation via GUI
- Documentation generation via command line
- Integration with CI/CD processes

Customization:
- Theme selection for documentation
- Selection of objects for documentation
- Export and import of settings from pre-saved configurations

Used by: developers, database administrators, analysts
Licensing terms:
- Monthly recurrent subscription
- Perpetual license
- Extended support
Trial: no
Pricing: $49.99 per month
[Check pricing options >](https://techwriter.me/buynow.aspx)

Which Database Documentation Tool to Choose?

The importance of creating clear database documentation cannot be overestimated, as it helps facilitate collaboration within a team and assists in keeping track of any important changes. However, with so many database documentation tools available, the choice cannot be based on the provided functionality alone. When choosing a solution to handle database documentation, we encourage you to look into the following matters:

Scalability and budget

Currently, you might have a small team that doesn’t require extra users within a plan. Maybe your needs can even be satisfied with a free database documentation tool. However, at some point, you might have to migrate to another tool that has more options, and the documentation consistency will be interrupted. That’s why it’s essential to plan ahead for the functionality that might be required to complement your project’s growth. Even though getting a database documentation tool free of charge can seem a good deal at the start, it often isn’t in the long run.

Regular updates

Some of the solutions aren’t frequently updated. At some point, they might lack essential features. For instance, not all database documentation tools support the latest versions of DBMSs. Also, the lack of updates might pose a threat to security. We recommend you pay attention to the tools that are updated regularly.
Support

Even the most experienced developers sometimes have to contact a support team to get assistance at a critical moment. A tool backed by advanced technical support, an extended community of users, and a complete knowledge base is always the best choice when you aim for stability.

Solution maturity

The solutions with a history of continuous support and years of experience are often the most stable and the fastest to fix any issues. If there’s a choice between a startup solution and a tried-and-true tool, it’s better to choose the second option, especially if your budget allows it.

Wrapping Up

Now that you have an idea of which database documentation tools are out there, you can safely try some of them out and check whether they are going to work for you and your team. No matter which option you choose, you are now fully aware of all the features that are worth paying attention to.
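To make "automated documentation generation" concrete before you evaluate any of the tools above, here is a minimal, hypothetical sketch using Python’s built-in sqlite3 module. It is not how any reviewed tool works internally — real documenters also capture dependencies, diagrams, and richer output formats — but it shows the core idea: read schema metadata from the database and render it as a document.

```python
import sqlite3

def document_schema(conn: sqlite3.Connection) -> str:
    """Render a minimal Markdown description of every table in the database."""
    lines = ["# Database documentation", ""]
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    ).fetchall()
    for (table,) in tables:
        lines += [f"## {table}", "", "| Column | Type | Primary key |",
                  "|--------|------|-------------|"]
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk).
        # Table names come from sqlite_master itself, so interpolation is safe here.
        for _cid, name, col_type, _notnull, _default, pk in conn.execute(
            f"PRAGMA table_info({table})"
        ):
            lines.append(f"| {name} | {col_type} | {'yes' if pk else 'no'} |")
        lines.append("")
    return "\n".join(lines)

# Demo against a throwaway in-memory schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
doc = document_schema(conn)
print(doc)
```

Everything beyond this sketch — change tracking, CI/CD hooks, styling, object selection — is exactly what the evaluation criteria in this article measure.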
How to Choose the Right
Database for Data Analytics
By [Victoria Shyrokova](https://blog.devart.com/author/victorias) | March 27, 2025

You start a query, grab a coffee, and come back to… a report that's still loading. We've all been there. When your database wasn't designed for analytics, even basic reporting can feel painfully slow. Databases aren't one-size-fits-all, especially when it comes to analytics. The system that works fine for storing and retrieving customer transactions or app data isn't necessarily built to process complex queries on massive datasets. Some databases struggle with concurrency, others with indexing, and many can't handle the kind of distributed processing that analytics demands. That's why picking the right database isn't just a matter of choosing the fastest option—it's about finding a system designed for large-scale data processing, real-time querying, and efficient storage. Of course, optimization tools like [dbForge Edge](https://www.devart.com/dbforge/edge/) can help fine-tune performance, automate indexing, and prevent slowdowns. But even the best optimization tools can only do so much if the database itself isn't built for analytics. To get the best results, it's crucial to start with the right foundation. This guide breaks down how to choose the best database for data analytics—so you can make the right choice from the start.

Table of contents
- Understanding data analytics needs
- Key factors in choosing the best database for analytics
- Types of databases for analytics
- Comparing 6 top-rated database systems for data analytics
- How dbForge Edge enhances data analytics
- Conclusion

Understanding data analytics needs
A high-performance database for data analytics starts with one key factor: understanding how data is structured, stored, and processed.
The right database structure directly impacts query speed, scalability, and efficiency—determining whether insights flow smoothly or systems lag under pressure.

Types of data used in analytics
Here's a quick breakdown of the data used in analytics:
- Structured data: Think of neatly organized tables, like financial records or customer databases. It's perfect for transactional processing and BI reporting.
- Semi-structured data: A bit of a wildcard—it has some structure but isn't locked into a strict format. Examples include JSON files, emails, and XML: they contain tags and metadata but don't follow a rigid schema.
- Unstructured data: Information without a predefined format or schema, requiring distributed storage and processing. It includes text, images, videos, and logs.

Key analytical workloads
Most databases are designed for one of two workloads: OLTP for high-speed transactions or OLAP for analytical insights. Choosing the right database for analytics is critical because mismatched workloads cause slow queries, resource strain, and performance bottlenecks. Here's how they differ:
- OLTP (Online Transaction Processing): Designed for rapid, high-volume transactions with strong consistency mechanisms. Uses row-based storage for fast reads and writes—ideal for e-commerce, banking, and order processing.
- OLAP (Online Analytical Processing): Built for heavy-duty number crunching, making it perfect for BI dashboards, sales forecasting, and trend analysis. It uses columnar storage to speed up complex queries across massive datasets.

Workload intensity: Read vs. write optimization
Beyond OLTP and OLAP, the best database for big data analytics depends on whether workloads are read-heavy, write-heavy, or balanced.

| Workload type | Description | Best for | Common databases |
|---|---|---|---|
| Read-heavy (analytics) | Optimized for fast queries on large datasets using columnar storage. Designed for high-speed analytics and aggregations. | BI, reporting, real-time dashboards | Snowflake, Redshift, ClickHouse |
| Read-heavy (transactional) | Designed for frequent, small read operations with row-based storage. Prioritizes consistency and quick lookups. | E-commerce, financial transactions, authentication | PostgreSQL, MySQL |
| Write-heavy | Built for high-ingestion workloads, utilizing distributed NoSQL or NewSQL to handle large-scale writes efficiently. | Logging, IoT, event-driven applications | Cassandra, DynamoDB |
| Mixed read-write | Handles both frequent reads and heavy writes, typically using HTAP for real-time insights—the best fit for real-time analytics in hybrid workloads. | Real-time analytics, risk assessment, hybrid workloads | Google Spanner, SingleStore |

Batch vs. real-time processing
Not all data needs to be processed instantly—some insights come from analyzing historical trends, while others require real-time action. Here's a quick look at how batch and real-time processing stack up.

| Processing type | Description | Use cases | Common databases |
|---|---|---|---|
| Batch | Processes data in scheduled intervals (hours, days). High-latency but cost-efficient for large datasets. | Financial reporting, trend analysis, historical analytics | Snowflake, Amazon Redshift, Google BigQuery |
| Real-time | Continuously ingests and processes data with minimal latency for real-time decision-making. | Fraud detection, IoT monitoring, AI-driven recommendations | Apache Druid, Rockset, ClickHouse, TimescaleDB |

Industry-specific examples

| Industry | Database type | Common databases |
|---|---|---|
| Retail | Relational (SQL) – Manages inventory tracking, order processing, and sales reports. | PostgreSQL |
| Retail | NoSQL – Powers real-time product recommendations and customer personalization. | DynamoDB |
| Healthcare | Relational (SQL) – Stores Electronic Health Records (EHRs) for HIPAA compliance. | MySQL, PostgreSQL, Oracle, SQL Server |
| Healthcare | NoSQL – Handles large unstructured datasets like MRI scans and genomic data. | MongoDB |
| Finance | OLAP – Enables high-speed analytics on stock market data. | ClickHouse, KDB+ |
| Finance | NoSQL – Stores log data for fraud detection and compliance monitoring. | Apache Cassandra |

Key factors in choosing the best database for analytics
Finding the best database for analysis requires striking the right balance between speed, scalability, and long-term reliability. Here's what to keep in mind.

Scalability & performance
A scalable database must grow with your workload, but choosing the right approach matters. You can scale up (add CPU, RAM, or storage) for quick performance boosts, but hardware limits make this costly. Scaling out (distributing workloads across multiple servers) ensures long-term efficiency, though poor sharding and load balancing can slow queries and increase costs.
Tip: The best databases scale both ways—handling today's demands while preparing for tomorrow's growth.

Handling growing datasets without bottlenecks
The more data you store, the harder it becomes to maintain speed and cost efficiency. The best database for analytics must handle growing volumes without performance trade-offs. Here's what to look for:
- Batch processing: Processes data in scheduled intervals. It's cost-effective but introduces latency.
- Streaming processing: Processes data continuously for real-time insights, but requires more resources.
- Distributed storage: Spreads data across nodes to improve performance and scalability and to prevent slow queries.
Tip: A database designed for real-time ingestion and distributed storage keeps insights flowing—without breaking the bank.

Query speed & optimization
A slow database costs more than time—it costs opportunities. Optimizing for speed ensures real-time insights without wasted resources. Here's what makes a database fast:
- Indexing & partitioning: Organizes data efficiently, reducing query times.
- Columnar storage: Optimized for analytics, cutting aggregation times and lowering storage costs.
- Intelligent caching: Prevents redundant processing by storing frequent query results for instant retrieval.
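To make the indexing point concrete, here is a minimal sketch using Python's built-in sqlite3 module (the `orders` table and column names are made up for the example). Adding an index on the filtered column changes the query plan from a full table scan to an index search, which is exactly the effect that cuts query times on large datasets:

```python
import sqlite3

# In-memory database with a hypothetical "orders" table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(10_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index, SQLite must scan the whole table.
plan = con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan)  # e.g. "SCAN orders"

# With an index on the filter column, the plan switches to an index search.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan)  # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

The same trade-off the article describes applies here in miniature: the index speeds up reads on `customer_id` at the cost of slightly slower writes and extra storage.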
Data integration & compatibility
A database for data science should connect smoothly with analytical tools to turn raw data into insights. One that doesn't? It locks your data in silos, killing efficiency. To avoid costly inefficiencies, your database should:
- Connect with BI tools like Power BI, Tableau, and Looker—no messy exports or workarounds.
- Support ETL pipelines and APIs for efficient Extract, Transform, Load (ETL) automation.
- Handle both structured and unstructured data without forcing you into multiple systems.

Security & compliance
Most breaches aren't caused by hackers—they're caused by misconfigured databases. Here's how to secure yours:
- End-to-end encryption & strict access control: Protect sensitive data with encryption, Role-Based Access Control (RBAC), and Multi-Factor Authentication (MFA).
- Compliance with GDPR, HIPAA, and SOC 2: Mishandling financial, healthcare, or customer data can lead to major fines and legal action.
- Automated security monitoring: The best databases detect vulnerabilities before attackers do, preventing costly breaches.

Types of databases for analytics
Not all databases handle analytics the same way. Some are built for structured reporting, others for real-time processing, and a few balance both. Below are some analytical database examples suited for different workloads.

Relational databases (SQL)
Relational databases are the bedrock of high-stakes industries, where a single inconsistency can mean financial loss, compliance failure, or worse. Built to organize and manage structured data with precision, they power banking, healthcare, and enterprise systems that demand absolute reliability.

Key aspects of relational databases

| Category | Details |
|---|---|
| Benefits | ACID compliance ensures reliable transactions without data corruption. SQL supports complex queries and structured data analysis. Indexing, caching, and materialized views speed up queries at scale. |
| Challenges | Scaling requires complex partitioning and replication, increasing costs. Row-based storage slows large-scale aggregations and analytics. |
| Best for | Banking & finance: ensures accuracy for regulatory compliance. Healthcare & compliance: maintains secure, structured records. ERP systems: handles inventory, payroll, and financial reporting. |
| Examples | PostgreSQL, MySQL (widely used in MySQL database hosting), Microsoft SQL Server, Oracle, IBM Db2 |

NoSQL databases
NoSQL databases scale horizontally across nodes (when sharded), handling semi-structured and unstructured data. They're the go-to for big data, real-time applications, and distributed workloads that outgrow traditional SQL databases.

Key aspects of NoSQL databases

| Category | Details |
|---|---|
| Benefits | No rigid schemas, making it easy to store unstructured data. Scale out by distributing data across multiple nodes. High-speed writes optimized for real-time ingestion and fast analytics. |
| Challenges | Eventual consistency may cause delays in data accuracy. Limited query capabilities make joins and aggregations difficult. |
| Best for | Big data applications that handle growing, evolving datasets. Real-time analytics for IoT, event-driven apps, and monitoring. Scalable web apps that require fast, flexible data storage. |
| Examples | MongoDB, Cassandra, DynamoDB, Couchbase, Firestore |

Columnar databases
Columnar databases store data by column instead of by row, reducing disk I/O and enabling faster aggregations, filtering, and BI reporting. They're built for analytics, not transactions.

Key aspects of columnar databases

| Category | Details |
|---|---|
| Benefits | Columnar storage accelerates aggregations and reporting. Compression reduces costs and enhances performance. Massively parallel processing (MPP) allows rapid and efficient data retrieval. |
| Challenges | Not ideal for transactional workloads—optimized for reads, not writes. Requires preprocessing, reducing flexibility for real-time updates. |
| Best for | Business intelligence needing quick data aggregation and insights. Large-scale analytics handling petabyte-sized structured datasets. Data warehousing for trend analysis and historical reporting. |
| Examples | Amazon Redshift, Google BigQuery, Snowflake, ClickHouse, Apache Parquet |

Real-time databases
When milliseconds matter, real-time databases process high-velocity data streams for instant insights—powering everything from fraud detection to AI-driven personalization.

Key aspects of real-time databases

| Category | Details |
|---|---|
| Benefits | Detect anomalies instantly for fraud prevention, cybersecurity, and AI. Handle high-concurrency data streams for fast ingestion and queries. Integrate smoothly with Kafka, Kinesis, and event-driven pipelines. |
| Challenges | Higher storage and compute costs due to continuous processing. Require advanced data streaming architectures to manage high-velocity workloads. |
| Best for | Fraud detection in banking and financial transactions. IoT analytics processing real-time sensor data. AI-driven recommendations for personalization and predictive modeling. |
| Examples | Apache Druid, Rockset, TimescaleDB, Tinybird, Materialize |

Comparing 6 top-rated database systems for data analytics
No single database fits every workload. Some handle real-time queries, others excel at batch processing, and a few do both. This guide compares six top databases—their strengths, limitations, and best use cases—to help you choose the right one. Let's dive in.

1. Amazon Redshift – Best for traditional data warehousing
Company: Amazon Web Services, Inc. (AWS) | Launched: 2012 | Country: United States
Amazon Redshift is a cloud-based, SQL-driven data warehouse optimized for large-scale batch analytics, BI reporting, and structured data workloads. It integrates tightly with AWS services, making it a go-to solution for enterprises already invested in the AWS ecosystem.
Key features include:
- MPP architecture: Columnar storage enhances query speed on large datasets.
- Batch processing: Requires periodic maintenance (vacuuming, indexing) to sustain performance.
- AWS integration: Connects with S3, Glue, QuickSight, and other AWS tools.
- Limited auto-scaling: Redshift Spectrum enables external queries but lacks elastic scaling.
- Security & compliance: Supports encryption and RBAC but lacks column-level security.
- Pricing: Pay-as-you-go and reserved instances; costs rise with high concurrency.

Strengths & limitations

| Pros | Cons |
|---|---|
| Cost-efficient at scale with reserved pricing. | Requires manual tuning (vacuuming, indexing). |
| Smooth AWS integration for easy data ingestion. | Lacks auto-scaling, limiting flexibility. |
| Optimized for petabyte-scale workloads. | Query slowdowns under high concurrency. |

Best for: AWS-heavy enterprises that need cost-efficient, high-performance batch analytics.

2. Snowflake – Best for multi-cloud scalability & data sharing
Company: Snowflake Inc. | Founded: 2012 | Headquarters: United States
Snowflake is a fully managed, multi-cloud data warehouse built for scalability, cost efficiency, and AI-driven analytics. Unlike traditional systems, it separates storage and compute, preventing resource contention. Key features include:
- Elastic compute & storage: Instantly scales without downtime or resource contention.
- Automated performance optimization: Built-in caching, clustering, and query pruning improve speed and reduce costs.
- Zero-copy cloning & time travel: Enables dataset duplication and historical data access without replication overhead.
- BI & cloud integration: Natively connects with Tableau, Power BI, Looker, and multi-cloud storage.
- Security & compliance: Automates RBAC, key management, and data masking for GDPR and HIPAA compliance.
- Pay-as-you-go pricing: Requires careful warehouse optimization to avoid overspending.

Strengths & limitations

| Pros | Cons |
|---|---|
| Auto-scaling eliminates performance bottlenecks. | Costs escalate if warehouses aren't optimized. |
| Multi-cloud support (AWS, Azure, GCP). | No reserved pricing, making it pricier for stable workloads. |
| Handles structured & semi-structured data efficiently. | Per-second billing requires careful cost monitoring. |

Best for: Multi-cloud analytics, AI-driven workloads, and real-time data sharing.

3. Google BigQuery – Best for ad-hoc & real-time BI analytics
Company: Google LLC | Launched: 2010 | Headquarters: United States
Google BigQuery is a fully serverless, auto-scaling cloud data warehouse designed for real-time, ad-hoc analytics on massive datasets. Unlike traditional warehouses, it eliminates resource provisioning, automatically adjusting compute power based on query complexity. Key features include:
- Serverless architecture: No provisioning—compute scales dynamically.
- Federated queries: Query live data across Google Cloud and external sources.
- AI & ML integration: Works with Vertex AI for advanced modeling.
- Security & compliance: IAM-based access control and automated GDPR/HIPAA compliance.
- Pay-per-query pricing: Charges $5 per TB scanned, requiring cost optimization.

Strengths & limitations

| Pros | Cons |
|---|---|
| Serverless, eliminating the need for infrastructure management. | Costs rise quickly with frequent, complex queries. |
| Auto-scales instantly, handling thousands of concurrent users. | Lacks reserved pricing, making it less cost-efficient for predictable workloads. |
| Native AI/ML integration enables advanced analytics. | |

Best for: Ad-hoc analytics, federated queries, and machine learning workloads.

4. Microsoft Azure Synapse Analytics – Best for SQL-based enterprise workloads
Company: Microsoft Corporation | Launched: 2019 (rebranded from Azure SQL Data Warehouse) | Headquarters: United States
Azure Synapse Analytics is Microsoft's enterprise-grade data platform, designed for SQL-based analytics, data warehousing, and hybrid transactional/analytical processing (HTAP). Unlike serverless platforms like BigQuery and Snowflake, Synapse relies on dedicated SQL pools, meaning users must manually provision and optimize resources. Key features include:
- Hybrid OLTP-OLAP engine: Supports both transactional and analytical workloads.
- Microsoft integration: Works with Power BI, Azure Data Factory, and SQL Server.
- Performance-optimized SQL queries: Ideal for batch analytics and complex SQL-based workloads.
- Security & compliance: Built-in encryption, GDPR/HIPAA compliance, and RBAC.
- Reserved pricing: Fixed costs but requires manual provisioning.

Strengths & limitations

| Pros | Cons |
|---|---|
| Deep Microsoft ecosystem integration. | Manual provisioning needed for scaling. |
| Strong SQL support for enterprise analytics. | Scaling isn't as dynamic as Snowflake's. |
| HTAP capabilities for hybrid workloads. | Reserved pricing can lead to underutilization. |

Best for: Large enterprises with SQL-heavy analytics and Microsoft ecosystem users.

5. ClickHouse – Best for high-speed, real-time analytics
ClickHouse is an open-source, high-performance columnar database optimized for fast analytics on large datasets with near-real-time query performance. Unlike traditional SQL databases, it stores data in columns instead of rows, significantly boosting aggregation speed and reducing disk I/O. Designed for event-driven analytics, ClickHouse powers financial trading, log processing, and IoT workloads. Key features include:
- Columnar storage engine: Processes analytical queries up to 10 times faster than row-based databases.
- Distributed query processing: Supports sharding and replication for efficient horizontal scaling.
- Smooth integration: Works with Kafka, Spark, Grafana, and Tableau for real-time data visualization.
- Security & compliance: RBAC, TLS encryption, and built-in data masking for sensitive data protection.
- Open-source flexibility: Free to use but requires dedicated infrastructure for production environments.

Strengths & limitations

| Pros | Cons |
|---|---|
| Blazing-fast aggregations—perfect for large-scale analytics. | Requires expert tuning for optimal performance. |
| Handles high-concurrency workloads with fast, parallel query execution. | No auto-scaling or fully managed cloud service. |
| Open-source flexibility with cost-effective scalability. | Slower writes than NoSQL for high-ingestion workloads. |

Best for: Real-time analytics (financial markets, IoT, high-frequency event processing).

6. Apache Druid – Best for time-series & event-driven analytics
Launched: 2011 | Company: Originally developed by Metamarkets | Headquarters: United States
Apache Druid is a real-time analytics database built for high-ingestion, event-driven workloads. It excels at low-latency queries on massive time-series datasets, making it a go-to for streaming analytics, anomaly detection, and operational intelligence in finance, ad tech, and cybersecurity. Key features include:
- Real-time data ingestion: Supports continuous data streams from Kafka, Kinesis, and Spark.
- Columnar storage for fast queries: Optimized for low-latency time-series analytics.
- Horizontal scalability: Uses deep storage and segment replication to handle petabyte-scale data.
- BI & visualization integration: Works with Looker, Superset, and other BI tools.
- Security & compliance: RBAC, TLS encryption, and audit logging ensure data protection.
- Deployment flexibility: Open-source and self-hosted, but also available as cloud-managed solutions.

Strengths & limitations

| Pros | Cons |
|---|---|
| Handles real-time streaming analytics with sub-second query speeds. | Storage-heavy architecture increases infrastructure costs. |
| Scales efficiently for large event-driven workloads. | Complex setup and maintenance compared to serverless data warehouses. |
| Supports time-series, log, and operational data with instant query performance. | Less optimized for traditional BI workloads than Snowflake or Redshift. |

Best for: Streaming analytics, security monitoring, AI-driven anomaly detection.
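Several of the systems above (Redshift, BigQuery, Snowflake, ClickHouse, Druid) owe much of their aggregation speed to columnar layout. A toy, hypothetical Python sketch of the idea: with a column layout, an aggregation scans one contiguous array instead of plucking a field out of every row record. Real columnar engines add compression and vectorized execution on top of this, which this sketch does not model.

```python
# Toy illustration of row vs. column layout for analytics.
# A hypothetical "orders" dataset, first laid out row by row:
row_store = [
    {"order_id": i, "customer_id": i % 1000, "total": i * 0.5}
    for i in range(100_000)
]

# The same data laid out by column: each column is one contiguous list.
column_store = {
    "order_id": [r["order_id"] for r in row_store],
    "customer_id": [r["customer_id"] for r in row_store],
    "total": [r["total"] for r in row_store],
}

# Row layout: the aggregation must visit every row object and extract one field,
# dragging the unused columns through memory along the way.
sum_rows = sum(r["total"] for r in row_store)

# Column layout: the aggregation scans a single list and never touches
# the other columns — the access pattern that columnar engines exploit.
sum_cols = sum(column_store["total"])

assert sum_rows == sum_cols
```

In practice the column path is also what libraries like NumPy and formats like Parquet vectorize, which is where the large speedups quoted for columnar databases come from.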
How dbForge Edge enhances data analytics
dbForge Edge is a software platform that provides a suite of tools for optimizing performance, simplifying workflows, and enhancing decision-making. With its user-friendly interface, it enhances [database management](https://blog.devart.com/comparison-database-management-systems.html), making advanced analytics more accessible and efficient. Here's a closer look at its offerings:
- Smooth multi-database support: Manage MySQL, PostgreSQL, SQL Server, and Oracle without switching platforms.
- Smarter SQL editing: Intelligent query optimization and debugging reduce errors and improve efficiency.
- Seamless data extraction & transformation: Effortlessly retrieve, clean, and structure data from your databases for in-depth analysis.
- Advanced query optimization: Write, debug, and optimize SQL queries with intelligent suggestions and performance tuning.
- Automated reporting & dashboards: Generate dynamic reports and interactive dashboards to visualize trends, patterns, and anomalies.
- Enhanced data visualization: Use built-in profiling and visualization tools to transform raw data into actionable insights.
- Test before you commit: A free trial lets you experience the performance boost firsthand.

Whether you are looking for sleek data manipulation, data analysis, report generation, or a way to populate a dashboard with insights from raw data, [dbForge Edge](https://www.devart.com/dbforge/edge/) covers all these needs. This multi-database solution has already proven itself as a powerhouse for data engineers, data analysts, and everyone working with business intelligence: it doesn't require advanced knowledge of database management, and it lets you work with data and perform even the most complex queries and operations visually.
Try [dbForge Edge](https://www.devart.com/dbforge/edge/), the ultimate database development and management solution, to level up your data analysis and data-driven decision-making!

Conclusion
Your database isn't just a storage system—it determines how fast you get insights, how smoothly analytics runs, and how well your business scales. Choosing the right one means fast queries, smooth integrations, and cost-efficient scaling. The wrong one? Slow reports, skyrocketing costs, and endless frustration. But having the right database is just the start. To truly optimize performance, you need the right tools. dbForge Edge helps you fine-tune queries, automate indexing, and eliminate performance bottlenecks—so your database works at its full potential. [Download dbForge Edge](https://www.devart.com/dbforge/edge/download.html) for free and get the most out of your analytics.

Tags [data analyst](https://blog.devart.com/tag/data-analyst) [data analytics](https://blog.devart.com/tag/data-analytics) [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [SQL Server](https://blog.devart.com/tag/sql-server)

By [Victoria Shyrokova](https://blog.devart.com/author/victorias)
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) 5 Best Delphi IDEs for 2025: Features and Benefits
By [Victoria Shyrokova](https://blog.devart.com/author/victorias) | November 13, 2024

An integrated development environment (IDE) for Delphi can streamline workflows, making it easier to deliver high-quality applications.
The best Delphi IDE is essentially an all-in-one toolkit for crafting, testing, and deploying your application. Advanced IDEs empower developers to modernize data access components while preserving code security. Yet, not all of them ease the complexities of multi-platform development in the same way. So, let's look at the five most popular Delphi IDEs and compare their features, benefits, and compatibility.

Table of contents
- RAD Studio
- C++Builder
- Lazarus
- CodeTyphon
- Visual Studio Code + Delphi Extension Pack
- Data Access Components for Delphi
- Conclusion

RAD Studio
Embarcadero's official Delphi IDE and one of the most feature-rich environments for corporate use, RAD Studio offers powerful debugging tools and form designers for complex, multi-tiered projects.

RAD Studio features
- Integrated Delphi and C++ support. Use both languages within the same IDE, simplifying multi-language projects and collaboration.
- Visual Component Library (VCL). It includes updated UI controls and WinRT APIs for Windows 11.
- Advanced debugging and profiling. You can use a proprietary debugger for Delphi applications on Windows or LLDB v12 debugging for non-Windows apps.
- RAD Server for enterprise applications. Build REST API servers with Swagger support, simplifying multi-tier app development.

Benefits of RAD Studio
- Quick app design, prototyping, and deployment across platforms.
- Integration with version control systems that simplifies collaboration between teams.
- A library of built-in components and 3rd-party integrations that reduces development time.
- Support for the latest mobile and desktop platforms.

C++Builder
C++Builder (part of RAD Studio) offers a powerful IDE for Delphi and C++ development. It provides robust cross-platform capabilities, advanced debugging tools, and a visual design environment.

C++Builder features
- Modern C++17 support. Use a Clang-enhanced compiler with full C++17 support. It also integrates the Dinkumware Standard Template Library (STL), giving developers access to essential data structures and algorithms.
- Visual development tools. Design user interfaces using a drag-and-drop interface.
- Integrated debugging and testing. Advanced debugging, memory profiling, and testing tools streamline your development process and reduce time to market.
- Continuous integration support. Integrate with popular CI tools like MSBuild, CMake, and Ninja, automating builds, testing, and deployment for more efficient development cycles.

Benefits of C++Builder
- Switchable languages within the same IDE that simplify workflows for multi-language teams.
- Advanced UI design and data binding features for easier development of complex apps.
- Access to the latest C++ features and popular libraries like Boost and SDL2.
- Connectivity with native databases, cloud integration, and debugging for scalable apps.

Lazarus
This free, open-source IDE is widely popular among budget-conscious developers for its "write once, compile anywhere" philosophy. With Lazarus, developers can create applications for various platforms, including Raspberry Pi. Although it lacks the advanced features of commercial IDEs like RAD Studio, Lazarus is often considered one of the best Delphi IDE tools thanks to its high customizability and active community support.

Lazarus features
- Cross-platform development. Lazarus allows developers to write code once and compile it for different platforms.
- Rich visual components and libraries. Access a broad range of visual components through the Lazarus Component Library (LCL). This Delphi VCL-compatible library simplifies GUI design across platforms.
- Lack of dependencies. Create programs without platform-specific dependencies, simplifying distribution and reducing issues related to missing libraries.
- Debugging and code analysis. With the GNU Debugger (GDB), you can set breakpoints, step through code, and inspect variables to resolve issues efficiently.
Benefits of Lazarus Modifiable source code and a flexible library of plugins. Active community support that ensures constant updates and bug fixes. Ideal for small teams thanks to its lightweight nature and minimal resource requirements. Different widget sets for native-looking apps across environments. CodeTyphon CodeTyphon is a free, open-source environment based on FreePascal, designed for cross-platform development. While Lazarus has caught up in some areas, CodeTyphon remains a comprehensive solution for developers seeking a Delphi-compatible IDE that supports multiple operating systems. However, CodeTyphon has had licensing and copyright issues, including concerns over removed copyright notices and the use of KSDEV code. Its license prohibits redistribution and bundling, which conflicts with open-source principles. CodeTyphon features Extensive component library. Access a wide array of bundled components, libraries, and samples. Cross-platform capabilities. Support over 200 OS-CPU-Platform targets, including Windows, Linux, macOS, and BSD variants. Integrated development tools. Seamlessly transition between designing, coding, and debugging within the all-in-one suite, covering the entire development lifecycle. Embedded form designer. Visually design app interfaces using the drag-and-drop form designer. Benefits of CodeTyphon Full-cycle development support for Delphi and FreePascal. A community that contributes new features, bug fixes, and security patches. Quick environment configuration to start compiling for multiple platforms. Visual Studio Code + Delphi Extension Pack Visual Studio Code (VSC) is a versatile, lightweight IDE that supports almost every major programming language. The Delphi Extension Pack and DelphiLSP packages let you integrate Delphi development directly into VSC. Delphi Extension Pack features Rich code editing. 
Enhance Visual Studio Code with Delphi-specific features like improved syntax highlighting, snippets, and web support. Pascal and Delphi language support. Access comprehensive support for Pascal and its dialects, including detailed syntax highlighting and extensive snippets. Multi-language development. Work on different projects simultaneously, supporting multiple languages and maximizing versatility with LSP-compatible editors. DelphiLSP support. You can integrate DelphiLSP for code completion and error checking, although some features are still in early development. Benefits of VS Code combined with Delphi Extension Pack Thousands of VS Code extensions for version control, debugging, and CI/CD. Adjustable color themes, team-specific configurations, and customization profiles. Usable for web development, backend programming, and more. Delphi IDE look and keyboard shortcuts for easy adaptation. Compatibility Comparison of Delphi IDE tools

| IDE | Operating systems | Visual designer | Open-source |
| --- | --- | --- | --- |
| [RAD Studio](https://docwiki.embarcadero.com/RADStudio/Sydney/en/Supported_Target_Platforms) | Windows, macOS, iOS, Android, Linux, Raspberry Pi | ✔ | ✘ |
| [C++Builder](https://blogs.embarcadero.com/c-builder-and-platforms-support/) | Windows, macOS, iOS, Android, Linux | ✔ | ✘ |
| [Lazarus](https://www.lazarus-ide.org/index.php?page=about) | Windows, macOS, Linux, FreeBSD, Raspberry Pi | ✔ | ✔ |
| [CodeTyphon](https://www.pilotlogic.com/sitejoom/index.php/86-wiki/installation/172-minimum-system-requirements.html) | Windows, Linux, macOS, OpenIndiana, FreeBSD, OpenBSD, NetBSD, DragonFly, Solaris, and more | ✔ | ✔ |
| [VSC with Delphi Extensions](https://code.visualstudio.com/docs/supporting/requirements) | Windows, macOS, Linux (for VSC) | ✘ | ✔ (IDE) |

Data Access Components for Delphi [Universal Delphi Data Access Components](https://www.devart.com/unidac/) (UniDAC) from Devart streamlines data handling and enhances the cross-platform development capabilities of popular Delphi programming IDE solutions, including RAD Studio, Lazarus, C++Builder, and FreePascal. Additionally, UniDAC allows applications to work with data from various cloud services, databases, and many other third-party systems. We also provide extensive documentation, including guides on [how to connect DAC to SQL Server](https://blog.devart.com/how-to-connect-to-sql-server-in-delphi-with-devart-sdac.html) or [Oracle](https://blog.devart.com/how-to-connect-to-oracle-in-delphi-with-odac.html) with minimal disruptions. Benefits of DAC High performance and low memory consumption, [compared to similar solutions](https://blog.devart.com/unidac-vs-firedac-performance-and-memory-consumption-comparison.html). Simplified database connection management and data handling across multiple systems. Development for major desktop and mobile platforms with regular updates for compatibility. Broad support and integration with cloud services and enterprise-level solutions for flexible app development. In-memory datasets for quick data processing and real-time data manipulation. Secure communication protocols (SSL, SSH, and HTTP/HTTPS). Conclusion When selecting the best Delphi IDE, the key factors include compatibility with operating systems and programming languages, technical support, and customizability. But most importantly, using it should feel natural, so we recommend taking each of them for a test drive to find the one you’ll be comfortable using. Companies can further enhance their development efficiency across multiple IDEs with [Devart’s UniDAC](https://www.devart.com/unidac/), which streamlines database connectivity and data handling. Moreover, Data Access Components from Devart come with a [60-day trial](https://www.devart.com/unidac/download.html) with full access to all features. So, if you’re curious to see why it has become a tool of choice for many companies and developers, give it a shot. 
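As a minimal sketch of the unified data access approach described above: the snippet below connects to a database through UniDAC's provider-agnostic TUniConnection and reads rows with a TUniQuery, assuming a console application and a SQL Server instance. The server, database, credentials, and table names are placeholder values; treat this as an illustration of the component workflow, not production code.

```delphi
program UniDacSketch;

{$APPTYPE CONSOLE}

uses
  Uni, SQLServerUniProvider; // provider unit registers the 'SQL Server' provider

var
  Connection: TUniConnection;
  Query: TUniQuery;
begin
  Connection := TUniConnection.Create(nil);
  try
    // Swapping ProviderName (and the provider unit) retargets the same
    // code to another database, e.g. 'MySQL' or 'PostgreSQL'.
    Connection.ProviderName := 'SQL Server';
    Connection.Server := 'localhost';      // placeholder server
    Connection.Database := 'Sales';        // placeholder database
    Connection.Username := 'app_user';     // placeholder credentials
    Connection.Password := 'secret';
    Connection.Connect;

    Query := TUniQuery.Create(nil);
    try
      Query.Connection := Connection;
      Query.SQL.Text := 'SELECT id, name FROM customers'; // placeholder table
      Query.Open;
      while not Query.Eof do
      begin
        Writeln(Query.FieldByName('name').AsString);
        Query.Next;
      end;
    finally
      Query.Free;
    end;
  finally
    Connection.Free;
  end;
end.
```

The point of the unified layer is that only the provider unit and ProviderName change per database engine; the connection and query code stays the same across RAD Studio, Lazarus, and FreePascal projects.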
Tags [dac](https://blog.devart.com/tag/dac) [dac components](https://blog.devart.com/tag/dac-components) [unidac](https://blog.devart.com/tag/unidac) [Victoria Shyrokova](https://blog.devart.com/author/victorias) "} {"url": 
"https://blog.devart.com/best-free-sql-database-software.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Best Free SQL Database Software By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) March 20, 2024 SQL (Structured Query Language) is a universal language for interacting with relational database management systems (RDBMSs), and those interactions can vary in complexity depending on the capabilities of the SQL tool you are using. Modern SQL tools cover the needs of both seasoned database professionals and casual users, especially those tools that provide graphical user interfaces. Database systems themselves are not one-size-fits-all either; each one has its peculiar features, advantages, and constraints. That’s why we decided to review the most popular free SQL tools tailored for different database systems to guide you towards an informed choice. Leading Database Management Systems and Their Characteristics Before we start reviewing actual SQL tools, let’s briefly describe the database management systems that these tools deal with. Plenty of DBMSs are in operation now, but the lion’s share of the market belongs to four major systems: Microsoft SQL Server MySQL (alongside its most widely used fork, MariaDB) Oracle PostgreSQL Microsoft SQL Server [Official website](https://www.microsoft.com/en-us/sql-server/) SQL Server, developed and marketed by Microsoft, is a relational database management system and a much-favored option in corporate environments. 
If there is a high demand for handling numerous transactions and conducting comprehensive business analytics, SQL Server is often the top choice. This RDBMS also offers advanced data security and in-memory performance, as well as integrated business intelligence tools. Supported OS: Primarily Windows, with support for Linux and Docker containers (necessary for work on macOS). Key features: Scalability through features like clustering, partitioning, and parallel query processing SQL Server Integration Services (SSIS) for data integration and transformation SQL Server Reporting Services (SSRS) for creating, managing, and deploying reports SQL Server Analysis Services (SSAS) for online analytical processing (OLAP) and data mining Robust security features including encryption, authentication, and access control High availability through features like AlwaysOn Availability Groups and failover clustering Full-text indexing and searching capabilities Pros: Seamless integration with other Microsoft products and services Excellent performance when dealing with complex queries Availability in both on-premises and cloud environments Cons: Primary optimization for the Windows environment High licensing costs Recommended tools for Microsoft SQL Server: [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) (Express edition) [SQL Server Management Studio](https://www.devart.com/dbforge/sql/studio/alternative-to-ssms.html) + dbForge SQL Tools [DBeaver](https://www.devart.com/dbforge/edge/dbeaver-alternative.html) DbVisualizer [HeidiSQL](https://www.devart.com/dbforge/edge/heidisql-alternative.html) MySQL and MariaDB [MySQL official website](https://www.mysql.com/) [MariaDB official website](https://mariadb.com/) MySQL is an open-source relational database management system, currently owned by Oracle Corporation. MariaDB, a derivative of MySQL, was developed by the original MySQL creators to maintain full compatibility with MySQL. 
Globally, MySQL stands as one of the most popular database systems, a robust and versatile platform for a wide variety of applications, particularly within small and medium-sized business projects. Supported OS: Cross-platform, including Windows, Linux, and macOS Key features: High performance, especially in read-heavy operations Scalability through features like replication, sharding, and clustering Multiple storage engines Support for various replication configurations Support for triggers and stored procedures Advanced security with user authentication, access control, and encryption Pros: Free, open-source platform (both MySQL and MariaDB) A large active community providing support Cons: Lack of optimization for large-scale projects Recommended tools for MySQL and MariaDB: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) (Express edition) [MySQL Workbench](https://www.devart.com/dbforge/mysql/studio/alternative-to-mysql-workbench.html) [phpMyAdmin](https://www.devart.com/dbforge/mysql/studio/alternative-to-phpmyadmin.html) [Adminer](https://www.devart.com/dbforge/edge/adminer-alternative.html) [SQLyog](https://www.devart.com/dbforge/mysql/studio/alternative-to-sqlyog.html) Oracle [Official website](https://www.oracle.com/) Oracle Database is a leading database management system, particularly valued for its enterprise-scale capabilities. It frequently becomes the default option for projects that require handling complex transactions and managing sizable databases within corporate settings. Additionally, effective use in cloud computing and data warehousing makes Oracle a preferred choice in the most demanding environments. Supported OS: Cross-platform, including Windows, Linux, and Unix. 
Key features: High availability and disaster recovery with Oracle Real Application Clusters (RAC) and Data Guard Support for partitioning Transparent Data Encryption (TDE), fine-grained access control, and audit vault Multitenant architecture with features like pluggable databases (PDBs) Various data compression techniques Advanced Analytics feature for in-database analytics and machine learning Spatial and graph database features Pros: Enterprise-level functionality, reliability, and performance Advanced security with plenty of features to ensure the safety of data Cons: Steep learning curve High price due to the licensing model Recommended tools for Oracle: [dbForge Studio for Oracle (Express edition)](https://www.devart.com/dbforge/oracle/studio/alternative-to-oracle-sql-developer.html) [Oracle SQL Developer](https://www.devart.com/dbforge/oracle/studio/alternative-to-oracle-sql-developer.html) Valentina Studio [Toad for Oracle](https://www.devart.com/dbforge/oracle/studio/toad-for-oracle-vs-dbforge-studio-for-oracle.html) PostgreSQL [Official website](https://www.postgresql.org/) PostgreSQL is a widely recognized open-source, object-relational database system. Its growing popularity stems from its remarkable extensibility and full compliance with SQL standards. With support for advanced data types and a comprehensive suite of performance optimization features, PostgreSQL delivers exceptional speed, flexibility, and high performance. 
Supported OS: Cross-platform, including Windows, Linux, and macOS Key features: Support for a wide range of data types, including arrays and user-defined types Native support for JSONB data type MVCC (Multiversion Concurrency Control) for concurrent transactions Foreign Data Wrappers (FDW) for accessing data from external sources Full-text search Support for table partitioning Support for streaming and logical replication SSL support, role-based access control (RBAC), and row-level security Pros: Free, open-source platform Active community High customization and flexibility Integration with other databases and systems Cons: Difficult performance tuning, which requires time and experience Lack of proper optimization for enterprise-level projects Recommended tools for PostgreSQL: [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) [pgAdmin](https://www.devart.com/dbforge/postgresql/studio/pgadmin-alternatives.html) DBeaver Multidatabase The above-mentioned database management systems represent the most popular options for the majority of projects. However, organizations may not rely solely on one system. Nowadays, it is a common practice to utilize multiple DBMSs for various purposes throughout the organization. This trend highlights the need for a universal solution (an application or a bundle of applications), compatible with all the required database systems, and covering the most widespread tasks related to database development, management, and administration. Recommended tools for multidatabase environments: dbForge Edge (Express edition) HeidiSQL DBeaver [DataGrip](https://www.devart.com/dbforge/edge/datagrip-vs-dbeaver-vs-dbforge-edge.html) GUI Tools for SQL Databases Here we are going to delve into some of the most renowned SQL-related tools. We’ll review both multidatabase solutions and solutions tailored for specific systems. It’s worth noting that our review focuses on free tools or those offering free versions. 
dbForge Studio (Express edition) SSMS + dbForge SQL Tools dbForge Edge (Express edition) DBeaver (Community edition) DbVisualizer HeidiSQL Oracle SQL Developer Valentina Studio Free MySQL Workbench phpMyAdmin Adminer pgAdmin dbForge Studio (Express edition) dbForge Studio is Devart’s flagship product, a comprehensive integrated development environment (IDE) tailored for a variety of database management tasks across different database management systems. It comes in four versions: for [SQL Server](https://www.devart.com/dbforge/sql/studio/) , [MySQL/MariaDB](https://www.devart.com/dbforge/mysql/studio/) , [Oracle](https://www.devart.com/dbforge/oracle/studio/) , and [PostgreSQL/Amazon Redshift](https://www.devart.com/dbforge/postgresql/studio/) . The main goal of dbForge Studios is to minimize manual efforts and accelerate the completion of routine tasks by offering a suite of tools for database development, management, administration, and data analysis. Each Studio comes in a free Express edition that covers basic functionality. For users requiring more advanced features, paid editions are available through either subscription-based or perpetual licensing options, with fully functional free trials. 
Platform: Windows-native, yet can be launched on macOS and Linux via Wine or CrossOver Key Features: Coding assistance with auto-completion, code formatting, syntax checking, debugging, and more Query Builder for visual query construction without manual coding Database Designer for visual database development through ER diagrams Data import and export (support for 10+ data formats) Test data generation (200+ generators) Database monitoring and real-time troubleshooting User management Data analysis and reporting Pros: Database comparison and synchronization (involving both schemas and data) Automated generation of database documentation Data aggregation in pivot tables and visualization with graphs Detailed product documentation and other learning resources, including video tutorials Personalized support CLI-powered automation Cons: Lack of native support for non-Windows OS (requires compatibility solutions) Focus on a single DBMS Lack of more advanced features in the free Express edition Pricing: Free – Express edition (limited features) Paid – Subscription or perpetual license, from $9.95 per month, depending on the edition Free trial: 30 days (fully functional) SSMS + dbForge SQL Tools [SQL Server Management Studio (SSMS)](https://learn.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver16) is Microsoft’s stock IDE for SQL Server, designed to perform database-related tasks across SQL infrastructures, from SQL Server to Azure SQL Database. With its rich functionality and flexible configuration options, SSMS allows users to tailor the solution to their needs and develop, manage, and maintain SQL Server databases effectively. A significant advantage of SSMS lies in its extensibility through specialized add-ins. In addition to its own collection of plugins, SSMS supports third-party add-ins, offering the opportunity to enhance the standard functionality. 
Notably, the [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) package stands out as a comprehensive suite of 15 add-ins for SSMS, introducing a plethora of new options to the familiar interface, ranging from advanced code assistance to Source Control integration. You can install the entire package or some tools separately. Platform: Windows, requires virtual machines to run on other OS Key features: IntelliSense-based code completion T-SQL code debugging Visual tools for database design and query building Database schema and table data comparison and synchronization Database monitoring and tuning Task scheduling and automation Reporting services management Pros All-in-one tool for managing SQL Server databases Smart and intuitive graphical user interface (GUI) Support and regular updates from Microsoft Easy functional extensibility via plugins Cons: Limited to Microsoft SQL environments Possible compatibility issues for older SSMS versions Requires virtual machines to run on macOS or Linux Pricing: SSMS – Free dbForge SQL Tools – The package includes [5 free tools](https://www.devart.com/dbforge/sql/free-sql-tools/) (notably, the Express edition of the powerful [SQL Complete coding assistant](https://www.devart.com/dbforge/sql/sqlcomplete/) ), others are available for trial. Subscription-based license from $399.95 per year. Free trial: 30 days (fully functional) dbForge Edge (Express edition) [dbForge Edge](https://www.devart.com/dbforge/edge/) is a comprehensive multidatabase solution offering advanced functionality. It comprises four specialized dbForge Studios, enabling users to efficiently manage a wide range of database tasks across major database management systems such as SQL Server, MySQL/MariaDB, Oracle, and PostgreSQL. dbForge Edge serves as a universal toolset for professionals handling diverse database systems in their work, providing all necessary options within a single platform and eliminating the need for multiple tools and IDEs. 
Platform: Windows-native, yet can be launched on macOS and Linux via Wine or CrossOver Key features: SQL coding assistance with auto-completion, formatting, debugging, and refactoring capacities Customizable collection of code snippets Query building, analysis, and optimization Schema and data comparison and synchronization Easy generation of high-quality test data Data import and export to 10+ most popular data formats Data analysis and reporting Server monitoring User and privilege management CLI-powered automation Pros: One solution for handling tasks across multiple database systems Smart user interface with robust customization options Direct data migration between databases in different DBMSs Regular updates with functionality enhancements Comprehensive documentation Professional support from the vendor Cons: Limited features in the free Express edition Available options vary across the Studios No native support for macOS and Linux (requires Wine or CrossOver) Pricing: Free – Express edition (basic functionality) Paid – Subscription-based license starts from $699.95 per year Free trial: 30 days DBeaver (Community edition) [DBeaver](https://dbeaver.io/) is one of the leading database management tools, favored for its extensive compatibility with over 80 different database systems, including all the major DBMSs. This wide-ranging support makes it an ideal choice for environments that operate with multiple databases. As an open-source application, DBeaver allows users to tailor it to meet all their specific needs. Powerful features, coupled with a clean user interface, allow professionals of all levels to conduct their work efficiently across all platforms used in their daily workflows. 
Platform: Windows, Linux, macOS Key features: SQL Editor with code auto-completion, script debugging, and snippets Visual construction of databases and tables Visual Query Builder to create complex queries without coding Data import and export to the most popular formats Server monitoring with a visual SQL EXPLAIN plan Multiple test data generators with data customization Backup and recovery Tracking of user sessions Pros: Connection manager for multiple database connections Robust visualization capacities with diverse data viewing options Database comparison (schemas and table data) Numerous security-related features Cons: Advanced functionality is provided in paid editions Personal support is provided to paid users only Limited schema comparison options Short free trial Pricing: Free – Community edition Paid – Subscription-based license from $11 per month Free trial: 14 days DbVisualizer [DbVisualizer](https://www.dbvis.com/) is a universal database tool for developers, DBAs, and analysts, designed to be user-friendly and suitable for varying levels of expertise in database management. Moreover, it is a multidatabase solution that supports a range of popular DBMSs and offers a convenient GUI, letting the users speed up their routine tasks. 
Platform: Windows, Linux, macOS Key Features: Smart code auto-completion and syntax check Visual query construction and analysis (the EXPLAIN plan) Data visualization and direct data editing Data import and export User access configuration and control Pros: Code formatting tools UI customization with various themes and options Advanced connection security Cons: No database comparison and synchronization No database performance monitoring features Insufficient support and learning materials Pricing: Free – Limited functionality Paid – $197 for the first year; license renewal is $69.00 per year HeidiSQL [HeidiSQL](https://www.heidisql.com/) is an established multidatabase tool that efficiently copes with essential database tasks, including SQL coding, query construction, and the creation and editing of various database objects, along with some administrative jobs. While it may lack some of the advanced features found in more robust tools, HeidiSQL remains a popular choice among database specialists thanks to its open-source nature and ease of use. Platform: Windows-native, requires Wine to run on Linux Key features: SQL code completion and formatting Library of SQL code snippets Visual construction of complex queries Database user management Data import and export Backup and recovery Pros: Easy connection management with command-line support Secure connections with encryption Bulk table editing Text search across database tables and various databases Cons: Limited functionality Lack of documentation and vendor support Pricing: Free Oracle SQL Developer [Oracle SQL Developer](https://www.oracle.com/database/sqldeveloper/), created by Oracle Corporation, serves as the default graphical user interface (GUI) client for Oracle Database, both on-premises and cloud-based. 
As an Oracle product, it handles a wide range of standard database development and management tasks and comes at no additional cost with any Oracle Database license, also ensuring regular updates and support from the company. Platform: Windows, Linux, macOS Key features: PL/SQL coding with context-aware auto-completion Visual query design and analysis Visual database design Data migration Database administration, including performance tuning, session monitoring, and user management Data reporting Version control integration Unit testing for PL/SQL entities Pros: Reverse and forward engineering support Built-in features for the migration of third-party databases to Oracle Multi-language interface Cons: Lack of learning resources High resource consumption Pricing: Free Valentina Studio Free [Valentina Studio](https://www.valentina-db.com/en/valentina-studio-overview) is a multidatabase GUI tool for easy management of MySQL, MariaDB, SQL Server, SQLite, PostgreSQL, and its own Valentina DB databases. An intuitive interface allows users to manage, model, and develop database schemas, perform complex SQL queries, and analyze data. Valentina Studio stands out for its versatility and efficiency in handling database-related tasks, making it a valuable tool for professionals working with data. 
Platform: Windows, Linux, macOS Key features: Smart SQL editor with auto-completion Visual database modeling and query building Schema comparison and synchronization Database-specific direct data editing Database administration features Data reporting with visualization Pros: Database-specific code snippets Handy keyboard shortcuts Convenient and intuitive user interface Quick data search Easy handling of multiple databases in different DBMSs Cons: Lack of proper documentation Lack of technical support Pricing: Free – Basic functionality Paid – From $99.99 (perpetual license) MySQL Workbench [MySQL Workbench](https://www.mysql.com/products/workbench/) is the default IDE for MySQL and MariaDB, available for free. It is one of the most popular solutions for database development and management, with robust functionality and an intuitive graphical interface. Workbench is a great aid for users of all expertise levels, allowing them to speed up all routine operations, reduce the amount of manual work, and minimize human error. Platform: Windows, Linux, macOS Key features: Powerful SQL editor (context-aware auto-completion, syntax check, code formatting) Visual database and table design and editing Query analysis with the EXPLAIN plan Object editor Data import and export (most popular data formats) Comprehensive database administration and monitoring tools Basic database documenting Pros: Advanced data modeling SSH and SSL support Database connection management Large and active community Cons: Lack of visual query building Lack of code debugging Restricted database schema and data comparison (possible only via MySQL Shell utilities) Pricing: Free phpMyAdmin [phpMyAdmin](https://www.phpmyadmin.net/) is a user-friendly web-based tool for managing MySQL databases. Its neat graphical interface, which simplifies database management tasks, and its accessibility via a web browser make this tool a popular choice. 
phpMyAdmin offers the functionality necessary to perform a wide range of standard database tasks efficiently and lets you work effectively regardless of location or device. Platform: Web-based Key features: SQL editor with code completion and syntax validation Creation, editing, and management of databases and their objects Visual database design and editing Database explorer Data editor with various viewing options Data export and import User and session management Pros: Search across tables and various databases Precise user permission configuration Detailed documentation and other informational resources Active and supportive community Localization in 72 languages Cons: Requires a web server and PHP installed and configured Puts additional load on the server Lacks some essential features like code debugging or database comparison Pricing: Free Adminer [Adminer](https://www.adminer.org/) is a web-based database management tool that often serves as an alternative to phpMyAdmin. Its popularity is based on its user-friendly interface, security features, and enhanced performance. The tool lets users connect to multiple database servers, directly write and execute SQL queries on specific databases, manage databases efficiently, and carry out essential tasks with ease. 
Platform: Web-based Key features: Connection manager for multiple database connections Query writing and execution against any specific database Database creation and editing Database object creation and editing Search for data across tables and databases Data import and export User and privilege management Pros: Single-page application with a neat GUI Database schema printing Extensibility through dedicated plugins Collection of ready-made themes and designs for UI customization Localization in 43 languages Cons: Requires a web server and PHP installed and configured Limited functionality in comparison to full-fledged IDEs Pricing: Free pgAdmin [pgAdmin](https://www.pgadmin.org/) is the leading open-source tool for managing and administering databases on PostgreSQL. It is also widely regarded as the default Postgres client. pgAdmin offers a web-based interface that enables users to manage their databases from any location and comes equipped with a comprehensive suite of features for executing various database tasks, enhancing its accessibility and efficiency for users across the globe. Platform: Web-based Key features: SQL editor with smart code completion, syntax validation, and refactoring features Query analysis with the visual SQL execution plan Data editor Data search on a live database Effective navigation through objects Schema comparison and synchronization Data import and export (CSV, Text, Binary formats) Database administration capacities Pros: SQL code debugging Extensive documentation Large professional community Cons: No reporting functionality No UI customization options Pricing: Free These are the best-established GUI tools for doing various database tasks in different DBMSs. Conclusion In the current review, we have compiled information about various database management solutions, mainly focusing on those available free of charge. We have described their functionality and highlighted both the advantages and disadvantages of each. 
We hope that this review will help you choose the best solution for your work. If you are still unsure which one is the best fit, you can start with the [fully functional free trial of dbForge Edge](https://www.devart.com/dbforge/edge/download.html), which is provided for 30 days and lets you test all of its capabilities under a full workload.

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams)

Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
10 Best Git GUI for Windows in 2025

By [dbForge Team](https://blog.devart.com/author/dbforge) January 3, 2025

Git is a Version Control System (VCS) designed to track and record all changes made to files and code.
With its help, you can compare, analyze, and merge changes, commit them to the repository (the storage of your code and code changes), or roll them back and restore previous versions. Git is not the only VCS, but it is undoubtedly the most popular one. Since its birth in 2005, it has become the default solution for version control: decentralized, simple, fast, and highly efficient. All developers have the same tools, and the entire development process is much more flexible and transparent.

To work with Git, you can use the command-line interface, which many professionals consider the right way. Alternatively, you can use a Git GUI client: a tool that provides an intuitive interface and does not require writing commands manually, letting you perform development tasks faster and more comfortably.

In this article, we are going to review the most popular Git GUI tools for Windows, determine their strong sides, and help you pick the right solution for your particular needs.

What Makes a Good Git GUI Client for Windows?

Choosing the right Git client for Windows with a comprehensive GUI can significantly improve your workflow, making version control more intuitive and efficient in your projects. Here are the key qualities to look for:

| What to look for | Why it's important |
| --- | --- |
| Clear and intuitive UI | Enhances your Git workflow instead of overwhelming you with unnecessary dropdowns and functions. |
| Comprehensive Git support | Ensures full functionality for commits, branching, merging, stashing, and rebasing. |
| Merge conflict resolution | Built-in visual diff and merge tools simplify tracking and resolving conflicts. |
| Integration with popular issue trackers | Boosts efficiency by connecting with tools like Jira, Azure DevOps, GitHub, GitLab, and Bitbucket, especially for team projects. |
| Affordability & licensing | Reasonable pricing and a free version for individuals and small projects can be a deciding factor. |
| Private repository support | Secure and confidential development with free private repository handling, useful for budget-conscious users. |
| OS-specific optimization | Some Git GUI tools offer different features depending on the OS, so it's important to check compatibility. |
| Dark mode & customization | Enhances user experience and aesthetics, making the interface more comfortable for long hours of work. |
| Performance & stability | A lightweight, responsive tool prevents slowdowns, even with large repositories. |
| Command line integration | Allows seamless switching between GUI and CLI for advanced users who need more control. |
| Collaborative features | Supports code reviews, pull request management, and real-time team collaboration, making teamwork smoother. |

Features to Look for in a Git GUI Client

Getting a Git GUI client for Windows is like buying a car: looking for the right option doesn't mean you have to get the "Porsche" of the Git software world. Earlier, we overviewed some of the features that make Git tools for Windows stand out from the competition. However, for your project, it might make sense to pick the client that has the essential features, covers the basics, and is well balanced for your particular needs. Check the list below to better understand the most common features you can expect from an average Git GUI for Windows.

Visual Git History

Keeping track of commits, branches, and merges can be overwhelming in a text-based log. A good Windows Git client GUI should provide a clear, interactive visual representation of the repository's history, making it easier to understand code evolution, track changes, and navigate branches without relying on complex CLI commands.

Integration With Popular IDEs

Context-switching between the Windows Git client and your IDE can disrupt workflow efficiency.
Look for a Git GUI that integrates seamlessly with Visual Studio, JetBrains IDEs, VS Code, or other development environments, letting you perform Git operations without leaving your coding workspace.

Ease of Configuration and Setup

Configuring Git, managing SSH keys, or setting up user credentials can be tedious, especially for beginners. A Windows Git client GUI should simplify the setup process with guided configurations, intuitive authentication options, and easy repository cloning, reducing friction when getting started.

Merge Conflict Resolution Tools

Merge conflicts can be frustrating and time-consuming to resolve manually. A Git GUI with built-in merge tools provides a visual way to compare conflicting changes, select resolutions interactively, and reduce the risk of mistakes, making conflict handling far more manageable.

Cross-Platform Support

If you work across multiple operating systems, having a Git client that functions consistently on Windows, macOS, and Linux ensures a smooth experience. This eliminates the need to learn different tools per platform and keeps workflows unified across teams.

Top 10 Git GUI Clients for Windows

Modern technologies have brought us plenty of solutions for working with Git. Some developers who aren't satisfied with any of the existing Git GUI clients even start creating their own tools. That's why the variety of available Git client software is already impressive. In our review, we'll focus on Git GUI tools for Windows, as this OS remains the leading one, occupying almost 73% of the market. We are going to consider clients developed for Windows, as well as cross-platform tools that work, among other systems, on Windows.

Our Git desktop client top 10:

1. GitHub Desktop
2. GitKraken
3. Sourcetree
4. TortoiseGit
5. SmartGit
6. GitForce
7. Git Cola
8. Aurees
9. Magit
10. Fork

If you are into Git, you might have already used some Git desktop clients, or at least heard of some of them. So, let us dive deeper and see what these tools can offer to you.

1.
GitHub Desktop

[GitHub Desktop](https://desktop.github.com/) is perhaps the most famous client for Git. It is familiar to all developers keeping their repositories on GitHub (a repository hosting service used for version-controlling IT projects). It's free, open-source, transparent, and functional. When you consider a Git client for Windows, GitHub Desktop is often the first option that comes to mind. With this solution, you won't need to use the command line or enter any commands manually. You only need to log in to your GitHub account and use this GUI to manage code in your repositories.

With GitHub Desktop, you can:
- Create new local repositories easily
- Track all changes and their authors visually
- Collaborate with other developers
- Check out branches with ease
- Code safely with syntax highlighting

GitHub Desktop is supported by a vast community of developers who work continually to make both Git and this free client better for every user.

2.

GitKraken

[GitKraken](https://www.gitkraken.com/) is one of the best-known Git GUI tools for Windows, Linux, and macOS. Developers favor this software for its reliability and efficiency, and its stylish interface has also helped it become so popular. It simplifies all the basic tasks, making it possible to perform the necessary actions and fix errors with one click. It boasts an embedded editor where you can edit code, and you can also start new projects right away. Synchronization tasks are possible in real time, and its features make it ideal for teamwork. GitKraken is available for free, provided you use it for non-commercial purposes. There are also advanced, paid versions: Pro and Enterprise.

GitKraken offers:
- Syntax highlighting
- Drag-and-drop functionality
- Tracking of all issues
- Support for Gitflow and Git Hooks
- Integration with repository hosting services

GitKraken is one of the most functional and convenient Git clients for Windows, loved by millions of Git users worldwide.

3.
Sourcetree

[Sourcetree](https://www.sourcetreeapp.com/) is another famous free solution that provides access to Git on Windows and macOS. It allows you to connect to your repositories in GitHub, Bitbucket, Stash, or Microsoft TFS. Developed by Atlassian Corporation, this tool aims to make the life of Git users easier. It is simple and user-friendly, with transparent navigation and a bunch of useful features. You can easily perform all the necessary Git-related tasks, such as cloning repositories (including remote ones), pushing, pulling, committing, and merging changes. Both experienced users and beginners can work successfully with Sourcetree.

More features include:
- Support for large Git files
- Detailed branching diagrams
- Review of all changes (both outgoing and incoming)
- Easy maneuvering between branches
- Full support for Gitflow and Hgflow

This tool is bound to keep your repos cleaner and your development more productive.

4.

TortoiseGit

[TortoiseGit](https://tortoisegit.org/) is a dedicated solution for working with Git on Windows. It is, in essence, a Windows shell interface: a free and open-source Git GUI that allows any team to adjust the functionality or even build a personal version of TortoiseGit for their specific needs. It can work with any file and does not depend on any IDE. TortoiseGit is used to quickly perform all the standard tasks, such as cloning repositories, creating branches, handling changes, viewing logs, etc. Another helpful feature is integration with Windows Explorer, which lets you perform the required tasks in a familiar and convenient environment.

More features include:
- Autocompletion of keywords and paths
- High efficiency with large projects in non-linear development
- Easy handling of branches and tags
- Cryptographic authentication of history
- Ability to handle multiple tasks in teamwork

TortoiseGit is popular worldwide. There are 30 different language versions to help developers from various countries master it in the most convenient way.

5.
SmartGit

[SmartGit](https://www.syntevo.com/smartgit/) is another functional cross-platform Git client that works smoothly on Windows, macOS, and Linux. For many users, SmartGit is the easiest Git client, and it covers a good number of tasks. It lets you view and edit files side by side and resolve merge conflicts automatically. With the support for Gitflow, you can configure branches directly in the tool, with no need for any additional software.

The features of SmartGit include:
- A built-in SSH client
- Visual commit history
- The Conflict Solver feature for fixing issues when merging files
- The single-view Log window showing commands, index, and working tree at once
- Commit Debugger for verifying any commit, if necessary

SmartGit has both free and paid versions, the latter with more robust functionality and additional integration features. Its ease of use has made this tool a favorite among many developers, and its functionality is improved continually.

6.

GitForce

[GitForce](https://github.com/gdevic/GitForce) is a popular cross-platform Git GUI client, running on Linux (including Ubuntu) and Windows. This tool is simple, smart, and efficient. However, since the overall functionality of Git is versatile and there are so many options, GitForce doesn't cover everything. Instead, it provides a straightforward way to perform the most common commands in a graphical interface. Besides, it is a free GUI, available to everyone. Among the many helpful features of GitForce, you may note support for multiple repositories, the ability to scan local repositories, drag-and-drop functionality, access to history, etc.
Further features of GitForce include:
- No installation: you simply get a single file and run it
- Creation and management of multiple repositories and branches
- Clean and intuitive GUI
- Easy management of SSH keys
- Embedded command-line interface

Despite some functional limitations, GitForce is a very efficient free Git client for Windows, suitable for both beginners and experienced Git users. Its most valuable benefit is that it can reduce the need to use the command line to a minimum, or even eliminate it.

7.

Git Cola

[Git Cola](https://git-cola.github.io/) is a free and open-source Git desktop client. Initially developed for Linux, it can also be used as Git software for Windows, offering numerous efficient features in a customizable interface. There is a variety of tools at your service, and you can rearrange them for your convenience and hide those that are irrelevant to your specific needs. Git Cola compares commits, searches for data by message, author, filename, etc., and edits Git indexes. It also ensures proper execution of all the necessary Git-related commands in a visual mode. An interface with several panes allows users to view different project aspects and track activities.

More features of Git Cola include:
- A dedicated Git-Dag visualizer for branches and commands
- The Commit Message editor
- Keyboard shortcuts to accelerate performance
- Saving of layout modifications with recovery on the next launch
- A cool dark mode and stylish custom themes for the Window Manager

Git Cola is one of the most loved GUI Git tools for Windows, highly useful for developers of all levels.

8.

Aurees

[Aurees](https://aurees.com/) is a free Git client that runs on Windows, macOS, and Linux. It has a dedicated account on GitHub, and users should log into that account to use the client. The primary purpose of the tool is editing and publishing Git files with ease. Colored tags simplify navigation through remote repositories.
Like other tools providing a Git GUI for Windows, Aurees allows users to examine all changes, IDs, tags, and the authors who implemented those changes. With this tool, you can easily detect and analyze differences between data, handle branches, and revert changes to a previous working copy.

More features of Aurees include:
- A sleek and user-friendly interface
- Viewing of commit and merge data in side-by-side windows to quickly identify and resolve any issues
- Viewing of differences in an advanced built-in editor

This is a powerful free Git GUI tool that copes well with the majority of Git jobs.

9.

Magit

[Magit](https://magit.vc/) is not a separate Git desktop client; it is a free plugin with an original text-based interface. It is implemented as a GNU Emacs package for use on Windows, macOS, and Linux. This plugin allows developers to tackle version control directly in the Emacs window. The solution is very effective for high-level Git commands and adjusts the output for reading by human operators.

More features of Magit include:
- The possibility to clone repositories locally and pull changes
- Creation, merging, and rebasing of branches
- Compilation of commits and push operations
- Commit history
- Compilation and execution of patches
- Notes and tags

The functionality is not fully complete, yet in general, Magit lets developers get the majority of their daily tasks done from within Emacs.

10.

Fork

[Fork](https://git-fork.com/) is a relatively young, simple, and fast Git client for Mac and Windows. It is available free of charge, but there is also a paid version with more options. The distinctive feature of this tool is a tab-based interface that makes navigation and management much faster. You can open websites or applications directly in Fork.
More features of Fork include:
- An advanced view for examining and analyzing differences
- A dedicated repository manager
- A comprehensive file-tree repository structure
- Creation and deletion of remote repositories
- Support for all major Git commands

Fork is one of the newer solutions, and its functionality is continually enhanced.

dbForge Source Control: a powerful Git add-in for SSMS

If you work closely with databases and use SQL Server Management Studio for that purpose, we'd love to recommend a specialized add-in called [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/).

Its key features include:
- Version control of database schemas and static table data directly in SSMS
- Support for all the major version control systems, such as Git (including GitHub, GitLab, Azure DevOps, and Bitbucket), Apache Subversion (SVN), TFVC, Mercurial (Hg), Perforce (P4), and SourceGear Vault
- Version control of working folders
- Easy teamwork via dedicated and shared development models
- Fast comparison of database versions
- Examination and resolution of conflicts and inconsistencies
- Detailed history of changes

Advantages offered by dbForge Source Control

Source Control is specifically designed for database version control via SSMS. While the abovementioned Git clients primarily cater to version control in software development, Source Control focuses on databases. Here are some of its main advantages:

Seamless integration with SQL Server Management Studio (SSMS): Simply put, you can manage schema and data changes directly within SSMS: modify database objects, make and revert commits, view history, resolve conflicts, and so on. Needless to say, it's convenient.

Trouble-free collaborative database object management: Version control of databases and actual data has never been easier. All changes are quick, convenient, and transparent.
If anything happens to go wrong, you will be able to quickly identify the source of the issue and resolve it effectively. And besides Git, you have a lot of version control systems to choose from. So, if you are a power user of SSMS and part of a team that works on one or more databases simultaneously, consider this add-in to enhance your teamwork.

Source Control in dbForge Studio for SQL Server

But what if you are seeking to replace SSMS with something more powerful? In this case, we can suggest an integrated environment that's just as easy to master yet far more feature-rich: [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). It delivers integrated [Source Control](https://www.devart.com/dbforge/sql/studio/sql-source-control.html) that boasts the same capabilities as the abovementioned SSMS add-in, namely:

- Version control of database schemas and static table data
- Support for all the major version control systems, such as Git (including GitHub, GitLab, Azure DevOps, and Bitbucket), Apache Subversion (SVN), TFVC, Mercurial (Hg), Perforce (P4), and SourceGear Vault
- Version control of working folders
- Easy teamwork via dedicated and shared development models
- Fast comparison of database versions
- Examination and resolution of conflicts and inconsistencies
- Detailed history of changes

To get a clear picture of what it looks like, take a look at the following screenshot that shows conflicts (which can be resolved), remote changes (which can be pulled), and your local changes (which can be either committed to the repository or reverted). Looks interesting? Then simply [download dbForge Studio for SQL Server for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and give it a go today!
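The core idea behind database source control can be illustrated with plain Git: keep each database object as a script file in a repository, so every schema change becomes a tracked, reviewable commit. Here is a minimal sketch with a hypothetical table and file name; it is only an illustration of the concept, not how dbForge Source Control works internally:

```shell
# Keep schema scripts under version control (illustrative sketch)
mkdir db-schema && cd db-schema
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

# Each database object lives in its own script file
cat > dbo.Customers.sql <<'SQL'
CREATE TABLE dbo.Customers (
    CustomerID INT PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);
SQL
git add . && git commit -qm "Initial schema: Customers table"

# A schema change becomes a reviewable, revertible diff
printf 'ALTER TABLE dbo.Customers ADD Email NVARCHAR(255);\n' >> dbo.Customers.sql
git add . && git commit -qm "Add Email column to Customers"

git log --oneline   # full history of schema changes
```

Tools like the add-in above automate exactly this linking step from within SSMS, so you never manage the script files or commits by hand.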
Source Control in dbForge Studio for MySQL

And if you work with MySQL and MariaDB databases, you can opt for a twin Studio that's just as versatile feature-wise: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/). It has a roughly similar set of features, including [Source Control](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html), which offers the same advantages as the one for SQL Server. As you can see in the screenshot below, the interface is identical: clean and intuitive. And, of course, you can [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) as well to explore everything it's capable of.

Note that both Studios are available as part of [dbForge Edge](https://www.devart.com/dbforge/edge/), a multidatabase solution that covers a massive spectrum of tasks across a number of database systems and cloud services. It eliminates the need to assemble complex, consistent toolsets, yet gives you the firepower to be effective from day one. Just like a single Studio, dbForge Edge is available for a free 30-day trial, so we gladly invite you to [download it today](https://www.devart.com/dbforge/edge/download.html)!

Conclusion

Since you've been on a quest to find the best Git client that Windows can work with, you might have already tried some of the options from our list. However, there are still lots of options offering a Git GUI for Windows that you should consider before making a decision. In this article, we tried to list most of them, highlighting their pros and cons. To wrap it up, here's a comparison table with the Git tools for Windows mentioned above.
| Git client | Key features | Pricing | Collaboration features | Platform support |
| --- | --- | --- | --- | --- |
| GitHub Desktop | Syntax highlighting, branch checkout, visual change tracking | Free | Yes | Windows, macOS |
| GitKraken | Embedded editor, drag-and-drop, Gitflow support | Free & Paid | Yes | Windows, macOS, Linux |
| Sourcetree | Branch diagrams, Gitflow support, large file support | Free | Yes | Windows, macOS |
| TortoiseGit | Windows Explorer integration, cryptographic history authentication | Free | Yes | Windows |
| SmartGit | Conflict solver, visual commit history, SSH client | Free & Paid | Yes | Windows, macOS, Linux |
| GitForce | Drag-and-drop, multiple repository management, embedded CLI | Free | Yes | Windows, Linux |
| Git Cola | Customizable interface, Git-Dag visualizer, keyboard shortcuts | Free | Yes | Windows, Linux |
| Aurees | Side-by-side commit/merge view, sleek interface | Free | Yes | Windows, macOS, Linux |
| Magit | Text-based interface, Emacs integration, patch execution | Free | Yes | Windows, macOS, Linux |
| Fork | Tab-based navigation, repository manager, file-tree structure | Free & Paid | Yes | Windows, macOS |

Considering the variety of tools providing a Git GUI for Windows, you can make an informed choice and pick the best Git client for Windows based on your needs and expectations.
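All of the clients above are front ends for the same underlying Git commands. For comparison, here is the bare CLI counterpart of the branch, merge, and history views these GUIs visualize, as a generic sketch with made-up file and branch names:

```shell
# Create a repository with a branch, diverging work, and a merge
mkdir demo-repo && cd demo-repo
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

echo "v1" > app.txt
git add . && git commit -qm "Initial commit"

git checkout -qb feature          # create and switch to a feature branch
echo "feature work" >> app.txt
git commit -qam "Add feature work"

git checkout -q -                 # back to the starting branch
echo "main work" > other.txt
git add . && git commit -qm "Work on the main branch"

git merge -q feature -m "Merge feature"

# This is the history a GUI client renders as a commit graph
git log --graph --oneline --all
```

A GUI client replaces this session with clicks and a drawn graph, but knowing the underlying commands makes it easier to judge which client's abstractions fit your workflow.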
Choosing the Best GUI Client for Oracle Database

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) September 22, 2023

As of 2023, Oracle continues to be the most popular database management system worldwide. Although it faces tough competition from MySQL, Microsoft SQL Server, and PostgreSQL, Oracle has managed to hold its top position. It's the default choice for many industries, particularly for large-scale enterprise projects. Consequently, many database professionals often find themselves working with Oracle systems. Given all the convenience of using graphical interfaces, there is a rising demand for user-friendly tools that simplify database management, especially for working with [complex systems like Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/). In this article, we'll explore some of the most popular Oracle management tools, designed to help database experts effectively address their work challenges.

Contents
- The choice criteria
- dbForge Studio for Oracle
- Oracle SQL Developer
- Beekeeper Studio
- DbVisualizer
- RazorSQL
- TablePlus for Oracle
- DbSchema
- Conclusion

The choice criteria

In this article, we have highlighted 7 GUI-based client tools specifically designed for Oracle Database experts. We evaluated them based on the following key criteria essential for a high-quality Oracle client:

- Compatibility: The client should work seamlessly with various Oracle database versions, cloud platforms, and operating systems.
- Ease of use: A clean, intuitive, and customizable user interface is crucial for a smooth experience.
- Coding assistance: Any good Oracle client must offer features that help developers write quality code, build complex queries, analyze them, and optimize their performance.
- Database schema management: This is vital for database developers and administrators who are responsible for maintaining data integrity, organization, and optimal database performance.
- Performance monitoring: Database administrators, in particular, will appreciate real-time performance monitoring capabilities that help them quickly identify and resolve issues.
- User management: The ability to create and manage database users, along with their permissions and privileges, is essential for database security.
- Data import/export: The client should allow users to import and export data easily while supporting a variety of data formats.
- Vendor support: Fast and expert tech support from the vendor is a significant advantage when using complex, modern tools.

Lastly, the cost is often a major consideration for potential users. The chosen software should offer reasonable and flexible pricing, along with free trials, to help users make an informed decision. These are the main factors we considered while reviewing various Oracle management tools. While they're not the only criteria, we believe they are crucial for choosing a high-quality solution. Now, let's explore the Oracle GUI tools that meet these criteria.

dbForge Studio for Oracle

[dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) is a comprehensive integrated development environment (IDE) tailored for Oracle specialists. It offers a complete set of tools within a single platform, supporting all current Oracle versions, Oracle Cloud, Oracle on Amazon RDS, and Oracle Real Application Cluster. The primary aim of dbForge Studio for Oracle is to enhance developers' coding speed and overall productivity, minimize errors, and automate routine tasks. The suite of tools provided covers all critical aspects of database design, development, management, and administration, and the smart, intuitive GUI allows plenty of jobs to be performed visually.
Key features:
- Advanced coding assistance, including phrase completion, formatting, and debugging
- Library of code snippets enabling developers to reuse code fragments and organize them for quick retrieval
- Query Builder for visual construction of queries
- Database Designer for a more visual approach to database construction and refinement of diagrams
- Change management tools for comparing and synchronizing database schemas and table content
- Data import from 10 different formats and export to 14 formats
- Data Generator that swiftly produces high-quality, realistic test data
- Real-time server session monitoring and troubleshooting
- Dedicated user management tools for configuring and overseeing user accounts
- Database Documenter that generates detailed documentation
- Data analysis and reporting tools, including the Master-Detail Browser, pivot tables, charts, and automated report delivery
- Responsive support from the vendor, including the delivery of custom builds upon request

Disadvantages:
- The functionality is restricted to Oracle databases only
- The interface is only available in English
- No native support for macOS and Linux (available only through the CrossOver compatibility solution)

Pricing:
- Free Express Edition with basic functionality
- Subscription-based license – from $149.95/year
- Perpetual license – from $299.95
- Free trial – 30 days

[Download dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html)

To learn more about the features of the tool, watch this video.

Oracle SQL Developer

[Oracle SQL Developer](https://www.oracle.com/database/sqldeveloper/), provided by Oracle Corporation, stands as the official integrated development environment for Oracle Database. It comes at no extra charge with any Oracle Database license and provides a comprehensive array of tools for database developers and administrators, catering to on-premises and cloud platforms and offering various connection methods.
The primary role of Oracle SQL Developer is to streamline Oracle development and deployment. It boasts all the essential features for script creation and execution, database management, in-depth analysis, and reporting.

Key features:
- PL/SQL coding module equipped with code debugging, formatting, and profiling capabilities
- Visual Query Builder with an intuitive drag-and-drop interface for crafting complex queries
- SQL Worksheet enabling the execution of SQL queries and PL/SQL code, displaying results in the grid
- Data Modeling module enabling the creation and management of database models, supporting both reverse and forward engineering
- Data migration via import & export with support for various file formats and bulk data operations
- Database Administration module that helps manage users, monitor performance, diagnose and rectify issues, and handle backup and recovery tasks
- Unit Testing module with a specialized framework for testing PL/SQL entities like functions and procedures, with the convenient monitoring of results
- Migration functionality that streamlines the transfer of third-party databases to Oracle with a dedicated wizard
- Support for multiple languages

Disadvantages:
- Intensive resource consumption
- Sophisticated interface
- Lack of documentation and steep learning curve
- No query auto-completion

Pricing:
- Free solution provided with any Oracle Database license

[Download Oracle SQL Developer](https://www.oracle.com/database/sqldeveloper/technologies/download/)

Beekeeper Studio

[Beekeeper Studio](https://www.beekeeperstudio.io/db/oracle-client/) is a modern GUI-based database management solution that offers support for all major database management systems, including Oracle. Its user-friendly interface streamlines standard tasks while also providing options for traditional SQL query interactions. Beekeeper Studio is compatible with both on-premises and cloud platforms, and it’s readily available for Windows, macOS, and Linux users.
This powerful tool boasts a wide range of features, encompassing SQL coding, database and table design, data visualization, data migration, and basic session management. It’s one of the top choices among Oracle database developers and managers.

Key features:
- SQL editor with syntax highlighting, auto-completion, and shortcuts
- Multi-tabbed query editing to work on multiple queries simultaneously
- Visual query building with instant preview of results
- Data visualization with charts and graphs of different types, with various color schemes and data labels
- Data export into CSV, JSON, Markdown, and XLS with flexible export options
- Data browser that allows viewing, searching, and editing data
- Table designer and editor that help create and alter tables and their elements
- Secure SSL and SSH connections
- Suitability for teamwork across multiple devices

Disadvantages:
- Lack of advanced performance monitoring and database administration options
- A smaller user community and fewer resources specifically targeting Oracle database management

Pricing:
- Free Community Edition
- Subscription-based license – from $14/month per user
- Free trial – 14 days

[Download Beekeeper Studio](https://www.beekeeperstudio.io/get)

DbVisualizer

[DbVisualizer](https://www.dbvis.com/database/oracle/) is a cross-platform database IDE designed for database developers, administrators, and data analysts. This software is available on Windows, Linux, and macOS, and, due to the support for Java Database Connectivity (JDBC), can work with multiple database systems, including Oracle.

DbVisualizer equips users with a suite of tools designed to handle database design, management, and maintenance tasks. The software streamlines these processes through intuitive visualization features, ultimately enhancing overall efficiency.
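Because DbVisualizer reaches every database through JDBC, connecting it to Oracle comes down to supplying a JDBC URL in the standard Oracle thin-driver format. As a quick illustration (the host, port, and service name below are placeholders, not values from the article), such a URL can be assembled like this:

```python
def oracle_jdbc_url(host: str, service: str, port: int = 1521) -> str:
    """Build an Oracle thin-driver JDBC URL of the form
    jdbc:oracle:thin:@//host:port/service_name."""
    return f"jdbc:oracle:thin:@//{host}:{port}/{service}"

# Placeholder values -- substitute your own server details.
url = oracle_jdbc_url("db.example.com", "ORCLPDB1")
print(url)  # jdbc:oracle:thin:@//db.example.com:1521/ORCLPDB1
```

The same URL string works in any JDBC-based client, which is exactly why tools like DbVisualizer can cover so many database systems with one connection dialog.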
Key features:
- SQL Editor with context-aware code auto-completion and formatting
- Visual query builder with drag-and-drop options
- Visual explain plans that help make queries more efficient
- Data visualization that presents data in various formats, including lists, line charts, and staple graphs
- Direct data editing
- Data export to CSV, HTML, SQL, XML, XLS, XLSX, and JSON
- Data import from CSV and XLS
- GUI customization with options like Light and Dark themes and tab arrangement
- Query execution from the command line
- SSH encryption for data transmissions
- User permissions management

Disadvantages:
- Limited functionality in the Free edition
- Steep learning curve and lack of learning resources
- No advanced functionality to manage database schemas
- Lack of performance monitoring functionality

Pricing:
- Free edition with limited feature set
- Subscription-based license – $197 per user for the first year
- License renewal – $69.00 per user

[Download DbVisualizer](https://www.dbvis.com/download/)

RazorSQL

[RazorSQL](https://razorsql.com/features/oracle_gui_tools.html) stands out as a highly popular GUI tool for Oracle, empowering users to query, edit, and manage databases. This versatile solution is compatible across multiple platforms, including Windows, Linux, macOS, and Solaris. Additionally, it supports over 20 programming languages, such as SQL, Java, PHP, XML, and more. Its functionality encompasses an array of visual tools, including a user-friendly query builder designed to streamline the creation and management of various queries and database objects.
Key features:
- SQL Editor with support for multiple programming languages, enabling users to write and execute queries directly
- Database Browser for quick search and comprehensive viewing of various database objects
- Database Editor that helps users edit, insert, and delete data
- Data Comparison for the comparison of table data between different databases and query results
- Data export into various formats, including CSV, Excel, HTML, JSON, and SQL INSERT statements
- Data import from CSV, Excel spreadsheets, and fixed-width files
- Table Designer that enables visual manipulation of table data, including finding, adding, modifying, replacing, and deleting
- DDL Generator that automatically generates DDLs for tables, views, and indexes
- Robust backup features for databases and tables

Disadvantages:
- Limited functionality – primarily suited for small businesses and not well-suited for enterprise-level projects
- No features for monitoring databases and managing users

Pricing:
- Perpetual license – from $99.95 per user
- Free trial – 30 days

[Download RazorSQL](https://razorsql.com/download.html)

TablePlus for Oracle

[TablePlus for Oracle](https://tableplus.com/) is currently in beta and exclusively available on macOS. It offers a viable database management experience, boasting an intuitive and precise GUI that enables users to handle multiple databases simultaneously. Many users regard TablePlus as their ultimate choice due to its simplicity and support for essential database tasks, including PL/SQL coding.
Key features:
- SQL Editor equipped with advanced auto-completion and syntax highlighting
- Database Structure and Table Data Editor
- Direct query execution
- Simple configuration for hassle-free data backup and restoration
- Data import and export to and from CSV files
- Customizable UI with multiple tabs and various display modes
- Configurable keyboard shortcuts
- TLS and libssh features that ensure secure data management
- Plugin support to expand functionality
- Support channels – email and social media

Disadvantages:
- TablePlus for Oracle is currently in beta, and not all features may be fully supported
- Oracle Database support is limited to macOS

Pricing:
- Perpetual license – from $89.00
- Free trial – offers limited functionality

[Download TablePlus for Oracle](https://tableplus.com/download)

DbSchema

[DbSchema](https://dbschema.com/databases/Oracle.html) is a database management tool designed for Oracle with a comprehensive GUI ensuring easy interaction with databases in a visual mode. It offers a wide range of robust functionalities to handle essential database tasks. These include visual database design with the reverse engineering option, PL/SQL coding, database deployment, and comprehensive database documentation.
Key features:
- Secure connection through SSH
- Interactive diagrams for database schema management
- Schema synchronization and deployment of changes, either manually or automatically
- Visual Query Builder to construct sophisticated queries without coding
- SQL Editor with graphical query explain plans, syntax highlighting, and query/script execution
- Reports & charts with the option to combine charts, buttons, and master-detail tables
- Automated execution of Groovy and Java scripts through the API
- Generation of database documentation in HTML5 and PDF formats
- Dark Theme for interface personalization
- A command-line client that can connect to multiple databases and automate routine tasks

Disadvantages:
- Lack of troubleshooting functionality for high-performance operations
- Lack of user management functionality
- Possible failures because of Oracle’s JDBC driver issues

Pricing:
- Free Community edition
- Subscription-based license – from $19.6 per month
- Perpetual license – from $196
- Free trial – 14 days

[Download DbSchema](https://dbschema.com/download.html)

We found these solutions to be the best fit for Oracle specialists due to their advanced features, the exceptional user experience they deliver, and proven efficiency. All the above-mentioned multi-purpose tools effectively address key challenges when working with Oracle databases.

Conclusion

As we said before, numerous projects involve Oracle Database, so having the right tools to manage it can greatly enhance your productivity. Ideally, you’d want a single software suite that handles all your needs. Enter dbForge Studio for Oracle — a highly recommended all-in-one solution. It boasts a robust set of features that cater to every database task imaginable. From doubling the speed of code creation to adeptly managing both schema and data changes, generating test data, monitoring overall database performance, and even managing users — this solution has it all.
Furthermore, with dedicated professional support and a wealth of resources available, mastering dbForge Studio for Oracle becomes a breeze. Ready to give it a try? You can try dbForge Studio for Oracle in your own work environment with a [free, fully functional 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html). Experience firsthand how it can streamline your daily tasks and take your efficiency to new heights.

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams)
"} {"url": "https://blog.devart.com/best-ide-product-2010-devproconnections.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Studio for MySQL has been nominated in Best IDE Product of 2010 DevProConnections Community Choice Awards! By [dbForge Team](https://blog.devart.com/author/dbforge) September 13, 2010

Devart made it to the final of the 2010 DevProConnections Community Choice Awards. The awards recognize the best products and services in the industry by a community vote. Devart has been nominated in 4 categories of the 2010 DevProConnections Community Choice Awards!

Nominated products:
- Best Add-In Product – Entity Developer (page 1, category 1: Add-In)
- Best Component Set – dotConnect for Oracle (page 1, category 7: Component Set)
- [Best IDE Product – dbForge Studio for MySQL (page 2, category 14: IDE)](https://www.devart.com/dbforge/mysql/studio/)
- Best Free Tool – CodeCompare (page 2, category 26: Free Tool)
"} {"url": "https://blog.devart.com/best-mysql-client-for-mac.html", "product_name": "Unknown", "content_type": "Blog", "content": "16 Best MySQL GUI Clients for macOS By [dbForge Team](https://blog.devart.com/author/dbforge) February 3, 2025

Well, we can’t argue that [Windows](https://blog.devart.com/top-10-mysql-gui-tools-for-database-management-on-windows.html) is the key platform for database development and management software—but what if you are a Mac user? Who said you can’t have equal opportunities to set up easy daily work with, for instance, MySQL databases?
Simply take a closer look and you’ll see an abundance of top-tier MySQL tools for your Mac just around the corner. To make your search easier, we have prepared and reviewed a handy selection for you—most of the following tools definitely rank among the best of their kind and boast an appropriately convenient graphical user interface.

What is a MySQL GUI client?

But before we proceed to our selection, here’s a brief general overview of MySQL GUI tools (also called clients). These are software solutions that help you tackle a variety of database development, management, and [administration](https://www.devart.com/dbforge/mysql/studio/database-administration.html) tasks. The exact variety, however, depends on the capabilities of your client. It can be a rather simple data management solution, or it can be a feature-rich integrated environment that you can employ as an integral part of your DevOps cycle.

See also: [How to install MySQL on macOS](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-macos/)

As for the clients we have reviewed, they are diverse enough to make sure you will find at least something that perfectly matches your current needs. And, of course, you should pay attention to whether the client of your choice comes with detailed documentation, proper support, and maybe even some extra tutorials that will help you get effectively started right away.

Top 16 MySQL GUI clients for macOS in 2025

1. dbForge Studio for MySQL
2. MySQL Workbench
3. SQLPro
4. Sequel Pro
5. Valentina Studio
6. DBeaver
7. Querious
8. TablePlus
9. dbForge Edge
10. RazorSQL
11. Navicat
12. DataGrip
13. Beekeeper Studio
14. DbVisualizer
15. Azure Data Studio
16. DbGate

dbForge Studio for MySQL

[MySQL IDE: dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is an all-in-one integrated development environment (IDE), designed to simplify all routine jobs for database developers and administrators alike.
Although the Studio was created as a classic Windows application, it is available on macOS and Linux via a special compatibility solution called [CrossOver](https://www.codeweavers.com/crossover/).

Pros:
- Accelerated coding speed with the optimized [MySQL editor](https://www.devart.com/dbforge/mysql/studio/mysql-code-editor.html)
- Rich capabilities for [SQL development](https://www.devart.com/dbforge/mysql/studio/sql-coding.html), which include code completion, [SQL formatting](https://www.devart.com/dbforge/mysql/studio/sqlmanagement.html#Sophisticated-MySQL-Formatter), [refactoring](https://www.devart.com/dbforge/mysql/studio/database-refactoring.html), and [code debugging](https://www.devart.com/dbforge/mysql/studio/code-debugger.html)
- Options for coding-free database development and management with [Query Builder](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html) and [Database Designer](https://www.devart.com/dbforge/mysql/studio/database-designer.html) (based on visual diagrams)
- Essential features like [database backup and recovery](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html), [database comparison and synchronization](https://www.devart.com/dbforge/mysql/studio/database-synchronization.html), [database migration](https://www.devart.com/dbforge/mysql/studio/migrate-database.html), database [reverse engineering](https://www.devart.com/dbforge/mysql/studio/database-designer.html), [user management](https://www.devart.com/dbforge/mysql/studio/database-administration.html), and more
- [Source Control](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html) for version-controlling database schemas and static data, compatible with Git, Mercurial, Apache Subversion, TFVC, Perforce, and other major version control systems
- CLI-powered automation of nearly all tasks
- A clean and intuitive visual user interface (GUI) ensures that [even a beginner](https://www.devart.com/dbforge/mysql/studio/tutorial-for-beginners.html) won’t get lost in the [multitude of features](https://www.devart.com/dbforge/mysql/studio/features.html)
- Multi-channel support and lots of additional materials ([text guides](https://blog.devart.com/mysql-tutorial.html) and a free online [Devart Academy](https://www.devart.com/academy/mysql-studio/), a selection of detailed, well-structured video tutorials covering database-related topics from the fundamentals to specific scenarios)
- Comprehensive documentation with detailed guides on installing dbForge Studio for MySQL on macOS
- Free 30-day trial — quite enough to get acquainted with the vast capabilities of this IDE and see whether it is really what you need
- Alternatively, you can use the Studio’s free Express Edition, which delivers the basic functionality

Cons:
- Requires CrossOver to work with dbForge Studio for MySQL on macOS; however, the installation and configuration process [is rather easy](https://docs.devart.com/studio-for-mysql/getting-started/how-to-install-dbforge-studio-linux-mac.html)
- Advanced functionality is only available in paid editions

[Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html)

Looking for a way to use dbForge functionality with other DBMSs? Try [dbForge Edge](https://www.devart.com/dbforge/edge/), a multi-database solution that covers MySQL, MariaDB, SQL Server, Oracle, PostgreSQL, and many other cloud databases, storage, and search engines!

MySQL Workbench

MySQL Workbench is probably the default, if not the ultimate, GUI client for MySQL database developers, architects, and analysts. Being compatible with macOS, Windows, and Linux, it includes a good selection of database design and administration tools that will definitely simplify your daily work.
Pros:
- Similarly to the previous client, you get tools that help you build queries visually and design databases with ER diagrams of any complexity
- Intelligent code completion
- Advanced tools for data modeling
- All the administration basics are firmly in place, including user management, server configuration and monitoring, database backup and recovery, as well as data migration
- It’s a free and very popular product with a large community

Cons:
- The UI is more complex than an average user would like it to be; as a result, even simple tasks, such as data transfer procedures, can get rather sophisticated
- Excessive resource consumption

[Download MySQL Workbench](https://dev.mysql.com/downloads/workbench/)

SQLPro

SQLPro is a free (yet with several paid options) MySQL manager for macOS, which enables quick access to tables and views, delivers IntelliSense-like autocompletion, formatting, and syntax highlighting, supports multiple result sets and selection-based query execution, and includes a table designer for easy modification of columns, indexes, foreign keys, and triggers.

Pros:
- Native application
- Good-looking, intuitive interface
- Easy management of multiple result sets

Cons:
- Overall, the functionality is quite limited; however, if you don’t need an all-encompassing toolkit, you may consider this option
- No documentation

[Download SQLPro](https://www.mysqlui.com/)

Sequel Pro

Sequel Pro is a completely free and open-source MySQL database manager that delivers the basic functionality for data management. If you need a simple tool to handle queries in multiple MySQL databases, this might be it.
Pros:
- The simplicity of the interface makes it a nice option for beginners
- Easy installation
- Convenient import and export of databases

Cons:
- No autocompletion (which is not that good, considering that the product focuses on writing and executing queries)
- Like in the previous case, the product offers limited functionality, but if your routine activities don’t venture beyond it, Sequel Pro is a nice option
- Underdeveloped documentation

[Download Sequel Pro](https://sequelpro.com/download#auto-start)

Valentina Studio

Valentina Studio is a multiplatform GUI tool for easy management of MySQL, MariaDB, SQL Server, SQLite, PostgreSQL, and (their own) Valentina DB databases. Among the most demanded features, you could name visual query building and database modeling, a simple but useful data editor, a report designer, quick data transfers between source and target tables, schema comparison, and basic database administration.

Pros:
- Easy handling of multiple database management systems
- Convenient navigation and fast data search
- Robust report designer with rich capabilities
- The free edition of Valentina Studio is already rather capable, and the 3 paid editions improve on it even further

Cons:
- No tech support
- Rather underdeveloped documentation

[Download Valentina Studio](https://www.valentina-db.com/en/download-valentina-studio/current)

DBeaver

Now let’s get back to more familiar titles. DBeaver is a multiplatform IDE supporting multiple database management systems. It is highly functional, user-friendly, and its Community Edition is available free of charge. The most popular features of DBeaver are the SQL query editor, visual query builder, database comparison tools, test data generator, and ER diagrams. There’s much more to DBeaver—and its team helps it evolve rather actively. If you are a demanding user, you definitely should explore this option.
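Several tools in this roundup (DBeaver's dummy data generation, dbForge Studio's Data Generator) automate the creation of realistic-looking test rows. The underlying idea can be sketched in a few lines of Python; the column names and value pools below are invented purely for illustration and are not taken from any of these products:

```python
import random
import string

def dummy_rows(n: int, seed: int = 42) -> list[dict]:
    """Generate n fake customer rows -- the kind of test data a
    GUI client's data generator would bulk-insert for you."""
    rng = random.Random(seed)  # fixed seed -> reproducible test data
    cities = ["Prague", "Lisbon", "Austin", "Kyoto"]  # made-up value pool
    rows = []
    for i in range(1, n + 1):
        name = "".join(rng.choices(string.ascii_lowercase, k=8))
        rows.append({
            "id": i,
            "email": f"{name}@example.com",
            "city": rng.choice(cities),
            "balance": round(rng.uniform(0, 1000), 2),
        })
    return rows

for row in dummy_rows(3):
    print(row)
```

The GUI tools go much further (respecting column types, foreign keys, and uniqueness constraints), but the seeded-randomness trick shown here is what makes generated datasets reproducible across runs.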
Pros:
- Multi-user environment with flexible access management
- Advanced data editor
- Visual query builder is a proven solution for those who prefer to handle queries on diagrams
- Customizable management of result sets
- Flexible comparison of database objects with diverse presentation of results
- Dummy data generation
- Full documentation

Cons:
- No support service in the free edition
- Weak data visualization functionality
- Complicated data import and export procedures

[Download DBeaver](https://dbeaver.io/download/)

Querious

Querious is a macOS-only commercial GUI client for MySQL and compatible databases. Under a clean interface, you will find a solution with moderately strong querying capabilities and a fine selection of tools for database object editing, server management, and easily configurable database structure comparison.

Pros:
- Native macOS experience with good, stable performance
- The basics that make your querying convenient—such as autocompletion, formatting, and syntax highlighting—are all here
- Editors for all types of database objects
- Rich database administration and server management tools
- Support service
- It is a very affordable solution with a free 30-day trial

Cons:
- Like in a couple of cases described above, this is a moderately capable but not an all-in-one solution; the good news is that you have lots of time to explore it before buying
- No documentation

[Download Querious](https://www.araelium.com/querious)

TablePlus

TablePlus is a nice-looking multiplatform GUI tool that helps you work with data in numerous database systems. However, take note that the main killer feature of TablePlus is its smart query editor with syntax highlighting, instant autocompletion, SQL formatting, and data editing features. The rest depends on whether it is your focus as well.
Pros:
- Clean and simple user interface
- Smooth native experience on each platform
- Good documentation
- If the free version of TablePlus is not enough for you, the paid license is rather affordable nonetheless

Cons:
- The functionality of TablePlus is well-aligned to SQL queries and data editing, but rather limited in all other respects; check it out if you mostly need to browse, edit, and query your data

[Download TablePlus](https://tableplus.com/download)

dbForge Edge

[dbForge Edge](https://www.devart.com/dbforge/edge/) is a multi-database solution designed for full-stack database specialists whose daily duties involve handling diverse database management systems. dbForge Edge offers a plethora of features derived from specialized dbForge Studios (MySQL/MariaDB, SQL Server, Oracle, and PostgreSQL), combining their functionality into a unified solution. Edge allows users to efficiently manage databases on-premises and in the cloud, automate tasks, save time, and boost overall productivity.

Pros:
- Versatile coding assistance for all supported DBMSs with code completion, object suggestions, code debugging, formatting, analysis, syntax validation, snippets, etc.
- Visual database, table, and query design
- Database comparison and synchronization
- Source control integration supporting all popular version control systems
- Database administration with server/database monitoring and user management
- Data import and export with 14 supported data formats
- Test data generation with support for a variety of data types
- Comprehensive automated database documenting
- Task automation via the command-line interface
- In-depth product documentation and professional technical support from the vendor
- Free 30-day trial

Cons:
- Native compatibility with Windows only (requires compatibility solutions to work on macOS)
- Limited functionality in the free Express edition
- Available functionality varies depending on the DBMS

[Download dbForge Edge](https://www.devart.com/dbforge/edge/download.html)

RazorSQL

RazorSQL is an easy-to-use SQL query tool that has been tested on over 40 database management systems, including MySQL. Its key features comprise a handy database browser, visual database tools, SQL query builder, SQL editor, as well as data import, export, and comparison functionality.

Pros:
- One of the more full-featured entries on the list
- Huge database system coverage
- Well-designed user interface
- Visual tools that help create, alter, describe, execute, and drop database objects
- Multi-tabular display of queries with capabilities for filtering, sorting, searching, and further operations
- CLI support
- Detailed documentation
- Free 30-day trial

Cons:
- Although RazorSQL offers quite a few features, you must carefully check whether each of them is advanced enough for your needs and requirements

[Download RazorSQL](https://razorsql.com/download_mac.html)

Navicat

Navicat is a universal database development and administration solution that supports most of the popular database management systems and cloud platforms.
With its help, you can easily design and manage entire databases and specific database objects, migrate data, compare and synchronize your databases, build queries, and perform reverse engineering.

Pros:
- Well-designed GUI
- Convenient database object designer
- Good SQL editor
- Visual database design and modeling
- Robust database comparison functionality
- Handy task automation capabilities (comparable only to those of dbForge Studio for MySQL)

Cons:
- It’s a rather pricey solution
- The 14-day trial is rather short for an IDE
- The documentation could use some expansion

[Download Navicat](https://www.macupdate.com/app/mac/7119/navicat-for-mysql/download)

DataGrip

DataGrip is a smart subscription-based IDE for numerous database tasks. It equips database developers, administrators, and analysts with a multitude of integrated tools that help them work with queries and deliver flexible management of database objects.

Pros:
- Wide range of supported database management systems
- Intelligent suggestions and refactoring
- Version control integration
- Efficient navigation
- Integrated data connectors
- Extensive documentation with tutorials
- Free 30-day trial

Cons:
- Steep learning curve
- Excessive resource consumption

[Download DataGrip](https://www.jetbrains.com/datagrip/download/#section=mac)

Beekeeper Studio

Now let’s proceed to something more straightforward, but interesting nonetheless. Beekeeper Studio is a free and open-source GUI-based database manager and SQL code editor for MySQL, PostgreSQL, SQLite, and SQL Server databases. The creators of the Studio focused on making it as user-friendly and simple as possible. You can take a look at it if your primary work involves queries and doesn’t go far beyond them.
Pros:
- Simple user interface
- Good SQL editor features, which include autocompletion and syntax highlighting
- Searchable query history
- SSL encryption of connections

Cons:
- Limited functionality (not quite a drawback, because it is an editor and not an IDE for power users, and you should treat it as such)
- No support service
- No documentation

[Download Beekeeper Studio](https://www.beekeeperstudio.io/get)

DbVisualizer

DbVisualizer is a smart and well-focused SQL editor and database manager, marketed as the database client with the highest customer satisfaction rating on G2. It is indeed a quite useful solution that enables you to work with SQL code, access and explore your databases, and manipulate data. DbVisualizer is available in Free and Pro editions, the latter of which is activated with a license key.

Pros:
- Good-looking user interface
- Huge database system coverage
- Advanced SQL editor with automatic formatting and suggestions
- Focus on security (with SSH data encryption and secure access)
- Good customization
- CLI support

Cons:
- Sometimes it is hard to follow the flow or find the right option, which makes it not the best solution for beginners
- It can be slow at times
- It is also a bit expensive for the set of features it delivers

[Download DbVisualizer](https://www.dbvis.com/download/)

Azure Data Studio

The last truly big name on our list is Microsoft’s Azure Data Studio. It is a cross-platform tool for data professionals who use on-premises and cloud data platforms on Windows, macOS, and Linux. Although SQL Server is the key DBMS for Azure Data Studio, you can use a special extension to connect to MySQL databases as well. The Studio delivers a modern editor experience with IntelliSense completion, code snippets, source control integration, an integrated terminal, built-in charting of query result sets, and customizable dashboards.
Pros:
- Clean and intuitive interface that takes its cue from Microsoft Visual Studio
- Quite a hefty set of features for a free product
- Seamless integration with Azure data services
- Extensions that unlock new features and additional services
- Excellent documentation, support service, and a large community

Cons:
- Although Azure Data Studio is an advanced product, some IDEs take a more in-depth approach to visual database design, query building, table design, server management, and database administration

[Download Azure Data Studio](https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver16)

DbGate

Finally, let’s have an overview of DbGate, a free, cross-platform, cross-database GUI client that covers both SQL-based and NoSQL systems. It lets you connect to multiple databases, browse and edit table schemas and the actual data, write SQL queries with autocompletion, build queries visually, and create ER diagrams, charts, and maps based on your data. Treat it as an unambitious database manager for macOS users whose requirements stick to the basics.
Pros:
- A nice set of features for a free tool
- Support for multiple databases (including NoSQL)
- Import/export formats can be extended with custom plugins

Cons:
- Its features cannot compete with the more advanced entries on this list; then again, a basic tool might be just what you are looking for

[Download DbGate](https://dbgate.org/download/)

Best MySQL GUI Clients for macOS: Summary

| Client | Free version available | Multiplatform support | Customer support | User-friendly interface |
| --- | --- | --- | --- | --- |
| dbForge Studio for MySQL | + | + | + | + |
| dbForge Edge | + | + | + | + |
| MySQL Workbench | + | + | + | – |
| SQLPro | + | + | + | + |
| Sequel Pro | + | – | – | + |
| Valentina Studio | + | + | + | + |
| DBeaver | + | + | + | + |
| Querious | – | – | + | + |
| TablePlus | + | + | + | + |
| RazorSQL | + | + | + | + |
| Navicat | + | + | + | + |
| DataGrip | + | + | + | – |
| Beekeeper Studio | + | + | + | + |
| DbVisualizer | + | + | + | + |
| Azure Data Studio | + | + | – | + |

Conclusion

Still unsure which client is the most suitable for you? That’s all right. At least now you can outline your needs and requirements more precisely and take note of the pros and cons that can determine your final choice. Whether you are a MySQL newcomer or a seasoned expert, let us reiterate the importance of comprehensive documentation, an active community, reliable technical support, and the availability of extra materials that teach you how to handle everyday tasks most effectively.
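Whichever client you pick, it ultimately wraps the same basic round-trip: open a connection, send a statement, fetch the results into a grid. As a rough, tool-agnostic illustration, here is that round-trip in Python's DB-API style; the stdlib `sqlite3` module stands in for a MySQL driver such as `mysql-connector-python` or `PyMySQL`, both of which expose the same DB-API 2.0 interface (the table and data below are invented for the example):

```python
import sqlite3

# Open a connection; with a MySQL driver this would be
# connect(host=..., user=..., password=..., database=...) instead.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The kind of schema/data work a GUI client wraps in dialogs:
cur.execute("CREATE TABLE clients (name TEXT, free_version INTEGER)")
cur.executemany(
    "INSERT INTO clients VALUES (?, ?)",
    [("dbForge Studio for MySQL", 1), ("Querious", 0), ("DBeaver", 1)],
)
conn.commit()

# ...and the query/result-grid part:
cur.execute("SELECT name FROM clients WHERE free_version = 1 ORDER BY name")
free_clients = [row[0] for row in cur.fetchall()]
print(free_clients)  # ['DBeaver', 'dbForge Studio for MySQL']
conn.close()
```

A GUI client adds autocompletion, visual designers, and diffing on top, but every action it performs reduces to statements like these sent over one connection.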
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools)

Best MySQL GUI Tools for Linux

By [dbForge Team](https://blog.devart.com/author/dbforge) March 12, 2025
Linux has the reputation of an operating system for programmers. So, if you are a software developer who designs MySQL-based solutions, chances are high that you will do it on Linux. It would therefore be great to have a MySQL IDE for Linux to simplify the work. But the question arises: is there an appropriate Linux database tool for MySQL with a GUI?

Contents
- MySQL GUI tool: What is it and why is it helpful?
- Most popular GUI tools for MySQL on Linux: dbForge Studio for MySQL, MySQL Workbench, phpMyAdmin, Navicat, Valentina Studio, DBeaver, Beekeeper Studio, DataGrip, SQuirreL SQL, Adminer, SQLyog, HeidiSQL, TablePlus
- How to choose the best MySQL or MariaDB GUI tool?
- Conclusion

MySQL GUI tool: What is it and why is it helpful?

If you ask what the most popular database management system is, MySQL will be among the first answers that come to mind. It is reliable, convenient, fast, functional, and open-source, which also means free of charge. It offers plenty of advantages for all database users, especially those who develop software using LAMP, the famous software stack consisting of Linux (operating system), Apache (web server), MySQL (database management system), and PHP (scripting language). These four components provide everything you need to create highly functional web applications, and all of them are free of charge. However, to make your efforts more effective, it helps to use client software with a graphical interface that speeds up the work and eliminates the errors that inevitably creep into routine tasks. Such software clients are favored by users of all levels, so the demand for GUI-based MySQL clients was quickly met with supply.

Most popular GUI tools for MySQL on Linux

Modern GUI clients for MySQL are powerful, multi-featured solutions for all kinds of jobs on MySQL.
They allow users to work with databases without coding directly (though coding knowledge and skills still help). The market offers a wide variety of such software platforms. This article reviews the most functional and user-friendly MySQL GUI solutions for Linux. The leading software vendors have created Linux versions of their visual clients, which makes it much easier to pick the variant that suits you.

To select the best MySQL GUI client solutions for Linux for this overview, we used the following criteria:
- Functionality
- Price
- Technical support

Now, let us have a look at the developers’ favorites that make MySQL jobs on Linux easier for all users.

dbForge Studio for MySQL for Linux

Support: Ubuntu, Debian, Fedora, Red Hat Enterprise Linux (RHEL)
Price: 30-day free trial, free Express edition, paid editions from $9.95/month

[MySQL GUI Tool dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), designed as a powerful and easy-to-use MySQL and MariaDB GUI for Linux, offers a comprehensive toolset. Its graphical interface helps developers, database architects, DBAs, analysts, and DevOps professionals automate and accelerate standard database tasks. In particular, it assists greatly with writing SQL statements of any complexity. The visual query builder and numerous features such as code auto-completion, syntax check, formatting, and code snippets help developers cope with the most sophisticated queries. Also, with the [MySQL Source Control feature](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html), users can easily manage database versions and control schemas and static table data. Enhance your workflow and edit your queries in a visual GUI with the upgraded [MySQL editor](https://www.devart.com/dbforge/mysql/studio/mysql-code-editor.html) in dbForge Studio for MySQL.
Also, being one of the most functional [MySQL and MariaDB GUI tools for Linux](https://www.devart.com/dbforge/mysql/studio/), it helps you with many tasks. You can create and debug stored procedures, compare and synchronize database schemas and data, generate test data, migrate data, manage user accounts, configure backup and restore tasks, and generate detailed reports on performance and results.

Pros:
- Robust functionality with plenty of configuration options
- Quick professional support through several channels
- Comprehensive documentation and lots of learning materials

Cons:
- Advanced functionality is available only in paid editions
- Supports only MySQL and MariaDB
- No offline documentation

[Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html)

Looking for a way to use dbForge functionality with other DBMSs? Try [dbForge Edge](https://www.devart.com/dbforge/edge/), a multi-database solution that covers MySQL, MariaDB, SQL Server, Oracle, PostgreSQL, and many other cloud databases, storage, and search engines!

MySQL Workbench

Support: Debian, Red Hat, Fedora, Oracle Linux, and Ubuntu
Price: free

MySQL Workbench is the default Linux MySQL GUI client for database developers, architects, and analysts. It is a cross-platform solution, compatible with Windows, Linux, and macOS. A single IDE includes tools for any kind of database task, making it one of the best administration tools for MySQL and MariaDB environments on Linux. You can design databases with ER diagrams of any complexity and do forward and reverse engineering. SQL development becomes faster and easier with code auto-completion and visual tools for query building and optimization. Database admins can configure servers, create and manage user accounts, migrate and audit data, and set up backup and restore tasks.
Pros:
- Easy management of standard database connections
- Administration and monitoring modules
- A single platform covering database modeling, generation, and management

Cons:
- UI complexity
- Lack of documentation
- Complex data transfer procedures

[Download MySQL Workbench](https://dev.mysql.com/downloads/workbench/)

phpMyAdmin

Support: Debian, Ubuntu & Mint, Fedora, RHEL & CentOS, Rocky Linux, AlmaLinux, Arch Linux, openSUSE
Price: free

phpMyAdmin is an open-source web-based solution with a simple but functional GUI. The purpose of the tool is to help MySQL users handle database-related tasks online. Since it is a web application, phpMyAdmin is compatible with any OS, including Linux. With quality translations into 70+ languages, this Linux MySQL GUI tool is a favorite choice of specialists all over the world. phpMyAdmin supports a wide range of operations on MySQL. Users can create, delete, and manage entire databases, tables, and database objects. Administrators can configure user accounts and their permissions down to the table, database, and server levels. The data import/export mechanism is very simple: phpMyAdmin stores the data and schema in text files, which makes it possible to use the databases on other platforms. You can perform tasks via the graphical interface or by executing SQL statements directly.

Pros:
- Comprehensive documentation
- Free of charge
- Support for most common file formats

Cons:
- Some security issues
- Performs slower than its competitors
- The GUI may be complicated for beginners

[Download phpMyAdmin](https://www.phpmyadmin.net/downloads/)

Navicat

Support: Debian/Ubuntu, OpenSuse, CentOS, Fedora
Price: 14-day free trial, monthly subscription $69.99, perpetual license $1,399.00

Navicat is a database development and administration tool that supports most of the popular database management systems and cloud platforms, all from one application and simultaneously.
Navicat users can design and handle databases and database objects, migrate data between tables and different databases, compare and synchronize databases (both data and schemas), and deploy changes. The reverse-engineering module and a powerful query builder with a graphical interface and drag-and-drop functionality let you perform the most complicated tasks faster. In general, Navicat provides all the functionality needed to work with databases, servers, and user accounts efficiently.

Pros:
- Task automation capabilities
- An attractive and intuitive graphical interface
- The report creator

Cons:
- High price
- Short trial period
- Ambiguous documentation

[Download Navicat](https://www.navicat.com/en/download/navicat-for-mysql#lin)

Valentina Studio

Support: Ubuntu and its derivatives
Price: free, paid from $79.99

Valentina Studio is a free tool for database management compatible with MySQL and many other RDBMSs. It is a popular SQL GUI solution for Linux for handling multiple databases. Among its most sought-after features are the query builder for speedy coding and the reverse-engineering module. The GUI offers convenient navigation and fast data search. You can migrate data across tables and databases and perform other standard database tasks. Besides, it has a separate module for user management.

Pros:
- Multiple editing of object properties
- Warnings for DELETE statements
- High data security
- Shortcut editor to ease working with different databases

Cons:
- Lack of support and documentation
- A complicated installation process

[Download Valentina Studio](https://www.valentina-db.com/en/download-valentina-studio/current)

DBeaver

Support: Ubuntu, Debian, Mint, Arch Linux, RHEL-based systems
Price: free (Community edition), paid from $10/month with a free trial

DBeaver is a GUI-based software IDE with multi-database support. It is highly functional, user-friendly, and free of charge in its Community edition.
The most popular features of DBeaver are the SQL query editor, visual query builder, database comparison, test data generation, and ER diagrams. However, it offers other handy options for MySQL users as well. The team behind DBeaver keeps improving the software, keeping it up-to-date, fast, and stable.

Pros:
- Multi-user environment
- Database metadata storage with easy search
- Possibility to restrict user access at different levels

Cons:
- No support in the free edition
- Weaker data visualization functionality
- Complicated data import and export procedures

[Download DBeaver](https://dbeaver.io/download/)

Beekeeper Studio

Support: Debian and Ubuntu, Fedora, Arch Linux
Price: free (Community Edition at GitHub), paid from $79

Beekeeper Studio is an open-source GUI tool for handling relational databases. Its functionality covers all essential MySQL-related tasks. The Linux version includes a MySQL editor with auto-completion and syntax highlighting, the possibility to save queries for reuse, a data viewer, and other tools. The creators of Beekeeper Studio focus on making it as user-friendly and simple to use as possible. Worth noting are the interface with multiple tabs for multitasking and keyboard shortcuts for faster work.

Pros:
- Query history with a search option
- SSL encryption of connections
- Light and dark themes

Cons:
- The functionality is inferior to competitors
- Lack of support and documentation

[Download Beekeeper Studio](https://www.beekeeperstudio.io/get)

DataGrip

Support: latest 64-bit versions of Linux (e.g., Debian, Ubuntu, or RHEL)
Price: 30-day trial, paid from $9.90/month

DataGrip is a smart IDE for database tasks. It equips database developers, administrators, and analysts with many professional tools integrated into one platform. With DataGrip, users can easily work with large queries and stored procedures and code faster thanks to auto-completion, syntax checks, quick fixes, and more.
There are tools to view, import, and export data. Quick navigation through tables and easy access to all files and scripts also speed up tasks. Besides, DataGrip is very customizable, allowing database professionals to adjust it to their needs.

Pros:
- Multiple shortcuts
- All the required data connectors are present
- Suggestions for queries, schemas, tables, functions, etc.

Cons:
- Steep learning curve without onboarding help
- Can consume resources excessively

[Download DataGrip](https://www.jetbrains.com/datagrip/download/#section=linux)

SQuirreL SQL

Support: CentOS, Debian, Fedora, Mint, Ubuntu
Price: free

SQuirreL SQL is an open-source graphical SQL client that helps database users perform basic tasks on JDBC-compliant databases. As a Linux MySQL GUI manager, it provides the necessary functionality for data search and simplifies code writing with auto-completion, spell checking, and the reuse of common queries.

Pros:
- Localizations into several languages, including Spanish, French, German, and Chinese
- Support for Java plugins that enhance the functionality and user experience
- High flexibility

Cons:
- Can only run on computers with Java installed
- Complicated installation procedure
- Limited support options

[Download SQuirreL SQL](https://snapcraft.io/squirrelsql)

Adminer

Support: Ubuntu, Debian, Arch Linux
Price: free

Adminer is a common replacement for phpMyAdmin, as it is also a web-based GUI solution written in PHP. Many users find it more powerful and user-friendly for everyday MySQL database management. The tool provides smart code auto-completion, lets users create and edit tables, and manages user accounts. The Adminer project is active, and the team behind it adds more functionality regularly.
Pros:
- No need for installation
- One-page interface for all tasks

Cons:
- The UI looks dated and is not very intuitive
- A complicated process of moving database schemas across the workspace

[Download Adminer](https://www.adminer.org/)

SQLyog

Support: Ubuntu, Debian, CentOS
Price: 14-day free trial, paid from $299

SQLyog is a comprehensive GUI tool designed for managing MySQL and MariaDB databases across a wide range of environments, including physical, virtual, and cloud platforms. Its intuitive interface excels at simplifying complex database tasks, making it easy to use for beginners as well as experienced developers. SQLyog streamlines database development and administration with features such as a visual schema designer, a query builder, and data synchronization tools. Its powerful capabilities, such as scheduled backups and query profiling, boost efficiency and ensure data integrity.

Pros:
- User-friendly interface
- Visual schema designer
- Seamless data syncing

Cons:
- No stored procedure debugging: lacks an advanced debugger for stored procedure code
- Patchy documentation: the product documentation is neither comprehensive nor clear
- Some advanced features require a paid license

[Download SQLyog](https://webyog.com/product/sqlyog/)

HeidiSQL

Support: Debian, Ubuntu, Arch Linux
Price: free

HeidiSQL is a free and open-source administration tool for relational databases, supporting systems such as MariaDB, MySQL, Microsoft SQL Server, PostgreSQL, and SQLite. It provides an intuitive interface that allows users to browse and edit data, as well as create and modify tables, views, procedures, triggers, and scheduled events. The application offers features like multiple parallel sessions, SSH tunneling for secure connections, and the ability to export and import data in various formats. Its lightweight design and comprehensive functionality make it popular among developers and database administrators.
Pros:
- User-friendly interface
- Supports a wide range of database systems
- Ensures secure connections via SSH tunneling

Cons:
- Dependency on community support: troubleshooting relies primarily on the community, which may not always be timely
- Incomplete documentation: the documentation may lack detail for certain features
- Lacks advanced functionality

[Download HeidiSQL](https://www.heidisql.com/)

TablePlus

Support: Ubuntu, Debian, Fedora
Price: trial version available, paid from $99

TablePlus is a native tool for those looking to manage modern databases. It covers multiple relational databases, including MySQL, PostgreSQL, SQLite, and SQL Server. A rich set of features like inline editing, advanced filters, and a powerful SQL editor backs a clear, fully optimized interface. This helps developers and database administrators work more efficiently and make the most of their time. The application supports protected connections, safeguarding data during transfers via native libssh and TLS encryption. Customizing its appearance is easy, while support for multiple tabs and windows simplifies its usage. If the goal is to optimize database management, TablePlus can help fulfill it in accordance with individual preferences.

Pros:
- Intuitive, user-centric interface
- Support for multiple database systems
- Secure connections with SSH and TLS

Cons:
- Costly, yet lacks scalability
- Autocomplete features can be obtrusive
- Limited advanced features: efficient for common use, but lacks the enhanced capabilities of more specialized tools

[Download TablePlus](https://tableplus.com/)

How to choose the best MySQL or MariaDB GUI tool?

You can choose the best Linux tools for your work depending on your preferences. Even with basic knowledge of the system, you can quickly master the tasks with modern GUI-based clients, and you will learn to use them more effectively with experience.
Getting back to the LAMP stack, we stressed that it consists of free components. Thus, it seems logical to focus on free MySQL clients for Linux too. However, the catch with free software is that it often has functional limitations and lacks support. Therefore, in this article, we covered the capabilities of both free and paid solutions.

One more factor is always a big advantage when looking for the best SQL client for Linux: support. If the software comes with quality, comprehensive documentation and a forum where you can discuss issues and share tips, you will work with it more efficiently, whether you are a beginner or an established professional. When reviewing the different tools, we paid special attention to user support.

| Tool | Support | Key Features | Pricing |
| --- | --- | --- | --- |
| dbForge Studio for MySQL | Ubuntu, Debian, Fedora, RHEL | Visual query builder, schema comparison, backup & restore | 30-day trial, free Express, paid from $9.95/month |
| MySQL Workbench | Debian, Red Hat, Fedora, Oracle Linux, Ubuntu | ER diagrams, forward/reverse engineering, server configuration | Free |
| phpMyAdmin | Debian, Ubuntu, Mint, Fedora, RHEL, CentOS, Rocky Linux, AlmaLinux, Arch Linux, openSUSE | Web-based, supports most file formats, multilingual interface | Free |
| Navicat | Debian/Ubuntu, OpenSuse, CentOS, Fedora | Data migration, schema synchronization, query builder | 14-day trial, $69.99/month, $1,399 perpetual license |
| Valentina Studio | Ubuntu and its derivatives | Query builder, reverse engineering, user management | Free, paid from $79.99 |
| DBeaver | Ubuntu, Debian, Mint, Arch Linux, RHEL-based systems | Multi-database support, SQL query editor, ER diagrams | Free (Community), paid from $10/month |
| Beekeeper Studio | Debian, Ubuntu, Fedora, Arch Linux | Query history, SSL encryption, multi-tab interface | Free (Community), paid from $79 |
| DataGrip | Latest 64-bit versions of Linux (e.g., Debian, Ubuntu, RHEL) | Smart IDE, SQL auto-completion, customizable UI | 30-day trial, paid from $9.90/month |
| SQuirreL SQL | CentOS, Debian, Fedora, Mint, Ubuntu | Java plugin support, auto-completion, multi-language UI | Free |
| Adminer | Ubuntu, Debian, Arch Linux | Web-based, auto-completion, lightweight | Free |
| SQLyog | Ubuntu, Debian, CentOS | Visual schema designer, query profiling, scheduled backups | 14-day free trial, paid from $299 |
| HeidiSQL | Debian, Ubuntu, Arch Linux | SSH tunneling, multiple parallel sessions, export/import support | Free |
| TablePlus | Ubuntu, Debian, Fedora | Inline editing, advanced filters, secure connections | Trial version available, paid from $99 |

Conclusion

Professional tools make the lives of software developers much easier, and it is not a problem to find a solution that suits your database development and [administration](https://www.devart.com/dbforge/mysql/studio/database-administration.html) workflow. We have reviewed the most popular tools created by the leading companies, and we hope our research will help you choose the best software for MySQL databases on Linux. While our focus is on tools, it is also worth noting that [installing the MySQL client on Debian](https://www.devart.com/dbforge/mysql/install-mysql-on-debian/) is a straightforward process using standard package management tools. In addition to our tool reviews, you will also find a user-friendly guide on [how to install MySQL on Linux](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-linux/), simplifying your initiation into the world of database management.
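The comparison criteria used in this article (availability of a free edition, deployment model, and so on) can also be applied programmatically. The sketch below is purely illustrative: the tool names and attributes mirror a few rows of the summary table in this article, while the `shortlist` helper and its parameters are our own invention, not part of any product:

```python
# Illustrative sketch: shortlist tools from this article's summary table.
# "free" means a free edition exists; "web_based" follows the article's text.
tools = [
    {"name": "dbForge Studio for MySQL", "free": True,  "web_based": False},
    {"name": "MySQL Workbench",          "free": True,  "web_based": False},
    {"name": "phpMyAdmin",               "free": True,  "web_based": True},
    {"name": "Navicat",                  "free": False, "web_based": False},
    {"name": "SQLyog",                   "free": False, "web_based": False},
    {"name": "Adminer",                  "free": True,  "web_based": True},
]

def shortlist(tools, free_only=True, web_based=None):
    """Filter the tool list by the article's criteria.

    web_based=None means "don't care"; True/False filters on that attribute.
    """
    result = []
    for t in tools:
        if free_only and not t["free"]:
            continue
        if web_based is not None and t["web_based"] != web_based:
            continue
        result.append(t["name"])
    return result

# Free, browser-based options only:
print(shortlist(tools, free_only=True, web_based=True))  # ['phpMyAdmin', 'Adminer']
```

Encoding your requirements this way forces you to state them explicitly, which is half the work of choosing a tool.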
[Oracle Tools](https://blog.devart.com/category/products/oracle-tools)

Top 10 Oracle Tools

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) November 22, 2024

Oracle Database remains a global leader in database management, valued for its power and enterprise-level features.
It is often the default choice for managing large databases and handling complex transactions, despite its complexity and cost. Effective tools for working with Oracle Database can significantly ease the workload for specialists. Both Oracle Corporation and various third-party vendors have developed a range of solutions designed for diverse database-related tasks in Oracle environments. This article reviews the most popular and effective tools, from robust, multi-featured IDEs to specialized solutions for specific tasks. We have picked the following tools for our examination:

- Oracle SQL Developer – the only official Oracle Database client, created and supported by Oracle Corporation
- dbForge Studio for Oracle – a comprehensive integrated development environment for all tasks in Oracle Database
- Navicat for Oracle – a cross-platform solution for handling database development tasks in Oracle
- dbForge Data Generator for Oracle – a specialized tool for producing high-quality test data
- Toad for Oracle – an advanced IDE for Oracle Database with strong security and teamwork capabilities
- dbForge Documenter for Oracle – a professional tool for creating comprehensive database documentation
- PL/SQL Developer – a popular software solution for working with Oracle PL/SQL stored program units
- dbForge Schema Compare for Oracle – a tool for the most precise and flexible schema comparison in Oracle
- DataGrip for Oracle – a smart cross-platform IDE for database management in Oracle
- dbForge Data Compare for Oracle – a dedicated solution for table data comparison and syncing
- How to choose the right Oracle tool – some helpful tips on choosing the most suitable solution

Let’s start.

Oracle SQL Developer

Official Oracle Database client, created and supported by Oracle Corporation
[Website](https://www.oracle.com/database/sqldeveloper/)
Pricing: free
Rating: 4.5 out of 5

Oracle SQL Developer is the primary client software for Oracle Database, developed and supported directly by Oracle Corporation.
As the official tool, it provides full support for all Oracle versions, both on-premises and in the cloud, with regular updates and patches from the vendor. It is a cross-platform product available for Windows, Linux, and macOS, offering a wide range of database development and management features. These include PL/SQL coding, data migration, database administration, visual tools for database design and query building, and more.

Pros:
- Context-aware code auto-completion
- Data reporting
- Version control integration
- Unit testing for PL/SQL entities
- Reverse and forward engineering
- User management and performance monitoring
- Database backup and recovery
- Migration of third-party databases to Oracle
- Multi-language graphical user interface

Cons:
- High resource consumption
- Steep learning curve with limited documentation

[Download Oracle SQL Developer](https://www.oracle.com/database/sqldeveloper/technologies/download/)

dbForge Studio for Oracle

Comprehensive integrated development environment for all tasks related to Oracle Database
[Website](https://www.devart.com/dbforge/oracle/studio/)
Trial: 30 days (full functionality)
Pricing: subscription-based license from $149.95/year, perpetual license from $299.95
Rating: 4.5 out of 5

dbForge Studio for Oracle is a powerful IDE designed specifically for Oracle Database, offering a multitude of features often not found in other solutions. It supports all modern Oracle versions, both on-premises and in the cloud, and provides complete database development, management, and administration functionality. The primary goal of dbForge Studio for Oracle is to minimize manual effort by automating routine tasks, speeding up processes through its visual interface, and ensuring faster and more precise task execution. Although developed as a Windows-native application, it runs on Linux and macOS via Wine or CrossOver.
Pros:
- Code auto-completion, formatting, debugging, syntax validation, and more
- Database schema and data comparison and synchronization
- Visual Query Builder for easy, no-code query creation
- Data migration with support for up to 14 file formats
- Unlimited test data generation
- Automated database documentation
- User management tools
- Task automation via CLI
- Comprehensive product documentation
- Extensive learning resources, including video tutorials
- Free Express edition with basic functionality

Cons:
- Limited features in the free Express edition
- No native cross-platform compatibility
- GUI is available only in English

[Download dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html)

Navicat for Oracle

Cross-platform solution for handling database development tasks in Oracle
[Website](https://www.navicat.com/en/products/navicat-for-oracle)
Trial: 14 days
Pricing: subscription-based license from $20.99/month, perpetual license from $399
Rating: 4.4 out of 5

Navicat for Oracle is one of the most popular GUI-based solutions for database development and administration in Oracle Database. It is compatible with both on-premises systems and cloud platforms, helping users handle data management, database design, data migration, database comparison and synchronization, query building, and many other tasks. Overall, Navicat provides Oracle specialists with all the features needed to efficiently handle databases, servers, and user accounts.
Pros:
- Robust SQL coding and editing
- Advanced connection management
- Collaborative tools for teamwork
- Data analysis with charts and dashboards
- Server security management
- Connection settings and queries synchronization
- Intuitive data modeling
- Data transfer across databases
- Test data generator
- Automated task scheduling

Cons:
- Limited functionality in Query Builder
- Limited features in lower editions
- Lack of learning resources and documentation
- Short trial period

[Download Navicat for Oracle](https://www.navicat.com/en/download/navicat-for-oracle)

dbForge Data Generator for Oracle

A specialized tool for producing high-quality test data.

[Website](https://www.devart.com/dbforge/oracle/data-generator/)
Trial: 30 days
Pricing: subscription-based license from $169.95/year, perpetual license from $329.95

dbForge Data Generator for Oracle is a standalone tool for filling Oracle tables with realistic test data. With a variety of basic and predefined generators, this solution provides high-quality test data of any required type and volume. Data Generator supports essential column data types while maintaining data integrity through foreign key, trigger, and check constraint support. Its user-friendly graphical interface and flexible configuration options enable easy and precise task configuration for each column. Additionally, all tasks can be automated and scheduled via the command line.
Pros:
- Over 200 predefined data generators
- Custom generators for diverse categories
- Inter-column dependency support
- Editable data generation templates
- Command-line automation for tasks

Cons:
- Limited to data generation functionality
- GUI is available exclusively in English

[Download dbForge Data Generator for Oracle](https://www.devart.com/dbforge/oracle/data-generator/download.html)

Toad for Oracle

An advanced IDE for Oracle Database with strong security and teamwork capabilities.

[Website](https://www.quest.com/products/toad-for-oracle/)
Trial: 30 days
Pricing: from $625/year
Rating: 4.4 out of 5

Created by Quest Software, renowned for its secure, professional data tools, Toad for Oracle is highly valued by users for its ability to simplify and automate database management, boosting overall productivity. Toad for Oracle is a robust database management tool designed for developers, administrators, and analysts. It offers a versatile suite of features, from coding support to DevOps, aimed at optimizing workflows, reducing code defects, enhancing output quality, improving security, and enabling team collaboration.
Pros:
- Coding assistance, including code analysis, debugging, and snippets
- Visual query building, analysis, and optimization
- Code reviewing
- Database schema and data comparison and synchronization
- Data modeling with reverse engineering
- Data import and export
- Test data generation
- Source control integration
- Database administration
- Automated PL/SQL unit testing
- Task automation

Cons:
- High cost
- Complex, less intuitive UI
- High resource usage in query optimization

[Download Toad for Oracle](https://www.quest.com/register/141779)

dbForge Documenter for Oracle

A professional tool for creating comprehensive database documentation.

[Website](https://www.devart.com/dbforge/oracle/documenter/)
Trial: 30 days
Pricing: subscription-based license from $89.95/year, perpetual license from $169.95

dbForge Documenter for Oracle is a visual tool designed to simplify the creation of comprehensive database documentation. It efficiently manages databases of any size or complexity, automatically generating documentation in multiple formats (HTML, PDF, and Markdown) for easy navigation and object search. This tool provides a detailed view of the entire database structure, covering all Oracle schema objects along with their properties, inter-object dependencies, and scripts. The schema view is organized in a tree-like format for clarity, while style templates allow users to customize the layout to fit their preferences.
Pros:
- Flexible settings to include or exclude specific objects or object groups
- Option to add or edit descriptions for Oracle schema objects
- Quick object search capability
- Automated generation via the command-line interface

Cons:
- Functionality is limited to documentation tasks only
- GUI is available exclusively in English

[Download dbForge Documenter for Oracle](https://www.devart.com/dbforge/oracle/documenter/download.html)

PL/SQL Developer

A popular software solution for working with Oracle PL/SQL stored program units.

[Website](https://www.allroundautomations.com/products/pl-sql-developer/)
Trial: 30 days
Pricing: permanent license from $243 per user
Rating: 4.5 out of 5

PL/SQL Developer is a professional IDE for Oracle that comprises a full set of features for developing, testing, and debugging stored program units (applications, triggers, packages, etc.). The software delivers robust functionality in a user-friendly interface. With advanced coding assistance, integration with version control systems, data analysis and reporting, and more, PL/SQL Developer stands out as a versatile and reliable tool that helps its users boost performance significantly.
Pros:
- Unicode-compliant PL/SQL editor with powerful functionality
- Integrated debugger and code formatter
- Test Manager for efficient regression testing
- Tree-view Object Browser for accessing object info
- Connection management for multiple connections
- Graphical Query Builder for creating and editing statements
- SQL and PL/SQL performance optimization tools
- Built-in data reporting capabilities
- Database schema comparison and synchronization

Cons:
- Requires an additional plugin for version control
- Complex and non-intuitive interface
- Limited documentation and learning resources

[Download PL/SQL Developer](https://www.allroundautomations.com/products/pl-sql-developer/free-trial/)

dbForge Schema Compare for Oracle

A handy tool for the most precise and flexible schema comparison in Oracle.

[Website](https://www.devart.com/dbforge/oracle/schemacompare/)
Trial: 30 days
Pricing: subscription-based license from $119.95/year, perpetual license from $229.95
Rating: 4.0 out of 5 (5 out of 5 when it is a part of [dbForge Compare Bundle for Oracle](https://www.devart.com/dbforge/oracle/compare-bundle/))

dbForge Schema Compare for Oracle is a standalone tool for comparing and synchronizing Oracle database schemas. It supports live databases, script folders, snapshots, and backups, efficiently detects all differences between databases, and provides detailed reports. The tool enables easy synchronization, allowing users to deploy changes across production and staging environments. With its intuitive visual UI, convenient wizards, and flexible configuration options, users can easily set up and run comparison and synchronization tasks. Additionally, it integrates well into DevOps workflows.
Pros:
- In-depth database analysis for accurate comparison
- Comparison of Oracle table structures
- Manual or automated schema synchronization
- Flexible synchronization options (single differences, groups, or all discrepancies)
- Reusable synchronization scripts
- Detailed discrepancy reports in HTML and Excel
- Automation of comparison and sync tasks via the command line

Cons:
- Limited to schema comparison and synchronization (data comparison and synchronization are available in [Compare Bundle for Oracle](https://www.devart.com/dbforge/oracle/compare-bundle/))
- Requires a paid license

[Download dbForge Schema Compare for Oracle](https://www.devart.com/dbforge/oracle/schemacompare/download.html)

DataGrip

A smart cross-platform IDE for Oracle database management.

[Website](https://www.jetbrains.com/datagrip/)
Trial: 30 days
Pricing: subscription-based, from $9.90 per month
Rating: 4.6 out of 5

DataGrip is a versatile, subscription-based IDE designed to support a range of database tasks. It equips Oracle developers, administrators, and analysts with numerous integrated tools that help them work with PL/SQL queries and efficiently manage database objects. This software is popular for its highly customizable interface with multiple UI skins, enabling users to tailor the look to their preferences, hide unnecessary elements, and arrange features and options for easy access. DataGrip also offers intelligent PL/SQL coding assistance, code editing and debugging tools, visual database design capabilities, database connection management, and many other features.
Pros:
- Advanced SQL auto-completion and code editing
- Visual database designer for tables and views
- Visual query analysis and optimization
- Data viewing and editing
- User management
- A variety of plugins for extended functionality
- Comprehensive documentation with tutorials

Cons:
- Limited features compared to some advanced IDEs
- High resource consumption
- Lacks database administration features

[Download DataGrip](https://www.jetbrains.com/datagrip/download/)

dbForge Data Compare for Oracle

A dedicated solution for table data comparison and synchronization.

[Website](https://www.devart.com/dbforge/oracle/datacompare/)
Trial: 30 days
Pricing: subscription-based license from $119.95/year, perpetual license from $229.95
Rating: 4.0 out of 5 (5 out of 5 when it is a part of [dbForge Compare Bundle for Oracle](https://www.devart.com/dbforge/oracle/compare-bundle/))

dbForge Data Compare for Oracle is a standalone tool for comparing and synchronizing data between Oracle tables. It helps users quickly identify data differences and deploy changes across supported platforms. The tool supports all Oracle versions from 9i to 23c and allows data comparison across tables, views, materialized views, and custom queries, even in databases with different structures. After comparison, users can synchronize data with flexible options, seamlessly applying changes from development to staging or production.
Pros:
- Flexible comparison settings, with options to exclude objects individually or by mask
- Detailed analysis of data differences
- Customizable synchronization settings to deploy individual changes, grouped changes, or all at once
- Automated generation and reusable storage of synchronization scripts
- CLI-based automation for data comparison tasks
- Comprehensive reporting of data differences in HTML and Excel formats

Cons:
- Functionality limited to data comparison and synchronization only (schema comparison and synchronization are available in [Compare Bundle for Oracle](https://www.devart.com/dbforge/oracle/compare-bundle/))
- No monthly subscription option

[Download dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/download.html)

How to choose the right Oracle tool

To help Oracle Database specialists select the right tools, several key factors should be considered:

Functionality
Clarify your objectives and daily tasks, and ensure the tools you're considering can fully support these needs. And of course, it's always better to find a single solution that has everything you need in one place.

Compatibility
Ensure compatibility with your current setup and flexibility to adapt to future environments, including various Oracle versions, whether on-premises or in the cloud, across different operating systems.

Simplicity
Consider software with an intuitive, customizable interface that simplifies the configuration and execution of tasks.

Vendor support
Opt for tools backed by reliable vendor support, including comprehensive documentation and personalized help across multiple channels.

Price
Pricing flexibility is crucial, with options like subscriptions or perpetual licenses and trials allowing you to evaluate before committing.

Conclusion

Given the prominence of Oracle in the database management landscape, there is a strong demand for effective tools.
Fortunately, a range of high-quality solutions is available to meet the needs of Oracle specialists. In this article, we have reviewed some of the most popular and highly functional software tools for Oracle Database. One notable option is dbForge Studio for Oracle, which offers a wide range of features for standard Oracle tasks. It can save you significant time and costs and comes with a [fully functional 30-day free trial](https://www.devart.com/dbforge/oracle/studio/download.html). Download the trial to experience how it can improve your workflows.

[Julia Lutsenko](https://blog.devart.com/author/jane-williams) is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
Top-Rated PostgreSQL GUI Tools [2024]
By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), May 9, 2023

Graphical User Interface (GUI) tools are omnipresent. They are how we interact with technologies in our daily lives without profound technical knowledge. However, intelligent GUI-based tools are equally essential for technical professionals, as they help enhance their users' efficiency and productivity.
Speaking of database development, management, and administration, we can't omit the role of GUI tools either. Plenty of solutions are tailored to perform all kinds of database tasks. In this article, we are going to review the best GUI tools designed for PostgreSQL, one of the most popular database management systems in the world.

- Most useful Windows PostgreSQL GUI clients
  - dbForge Studio for PostgreSQL
  - pgAdmin
  - DBeaver
  - DataGrip
  - Navicat for PostgreSQL
  - OmniDB
  - HeidiSQL
  - Beekeeper Studio
  - TablePlus
  - SQLGate
- Most popular features and functionalities of PostgreSQL GUI tools
- How to make the right choice
- Conclusion

Most useful Windows PostgreSQL GUI clients

PostgreSQL GUI tools provide their users with quick access to the necessary areas and methods to perform work tasks and resolve professional challenges. Further, the visual representation of the data also helps process results faster, spot essential trends, and detect issues to resolve. Modern PostgreSQL GUI solutions aim to provide users with all the tools they need for work in one piece of software. Still, the functionality of the various tools available on the market differs significantly. Let us have a look at the most widely used options that Postgres devotees prefer in their work.

dbForge Studio for PostgreSQL

dbForge Studio for PostgreSQL is a robust IDE that enables efficient PostgreSQL development, management, and administration. Its extensive set of advanced features is specifically designed to streamline all database-related tasks. With its user-friendly and fast graphical user interface (GUI), dbForge Studio for PostgreSQL is favored by both novice and expert users.
Pros:
- Schema and data comparison and synchronization
- SQL code prompt and formatter
- Robust data analysis and reporting capabilities
- Suitable for small teams and enterprise companies
- Powerful Data Generator for test data

Cons:
- Advanced features only available in the paid licenses
- Compatibility is limited to PostgreSQL databases only
- Potentially high cost for individual users

Price: Free Express edition; paid licenses start from $79.95 per year (subscription-based) and $149.95 (perpetual). A free trial is available for 30 days.

[Download dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/download.html)

pgAdmin

pgAdmin is the sole native GUI client for PostgreSQL. It provides comprehensive support for all operations on this RDBMS, offers a rich set of features for its users, and is free and open source. As a web application, pgAdmin is compatible with all major operating systems and can be configured and run on cloud servers.

Pros:
- Offers a stable, responsive, and customizable platform
- Provides visual tools for quick task configuration
- Includes a dashboard for monitoring all server activities
- Includes an SQL editor and code debugger
- Offers detailed documentation and community support

Cons:
- Installation requires the use of the command line
- The GUI is non-intuitive
- Handling multiple databases requires advanced skills

Price: Free

[Download pgAdmin](https://www.pgadmin.org/)

DBeaver

DBeaver is a GUI-based tool for database developers and administrators that supports PostgreSQL as well as all other major DBMSs. With DBeaver, users can easily create and edit databases and database objects, manage database users and connections, and perform other database tasks. Its robust functionality and flexible pricing options have made it one of the most popular choices for PostgreSQL specialists of all skill levels.
Pros:
- Allows constructing SQL queries in a visual mode
- Provides various data views in the SQL editor
- Can import and export data in all popular formats
- Grants advanced security for databases
- Includes an ER diagram generator

Cons:
- Intensive resource consumption
- Disconnection from the database after a period of inactivity
- Personal support available only for users with paid licenses

Price: Free (Community edition); paid licenses start from $110. A free trial is available.

[Download DBeaver](https://dbeaver.io/)

DataGrip

DataGrip is a widely used PostgreSQL GUI tool. As a desktop application, it empowers users to create and execute queries, keep track of their history, and gain insights into the effectiveness of those queries and the overall behavior of the database engine. Additionally, DataGrip offers advanced code debugging and code refactoring features to guarantee the high quality of the output.

Pros:
- Extensive customization options, including dark themes
- Advanced SQL code autocompletion and editing capabilities
- Diverse plugin collection for various databases
- DDL and DML automation tools
- Comprehensive reports featuring graphical charts

Cons:
- High cost (only a subscription model is available)
- Excessive resource consumption
- Lack of database administration capabilities
- No support for concurrent multi-database management

Price: $229 per year per user (for organizations). A free trial is available.

[Download DataGrip](https://www.jetbrains.com/datagrip/)

Navicat for PostgreSQL

Navicat for PostgreSQL is a comprehensive graphical tool for database development on Postgres. Its user-friendly interface, rich feature set, and fast performance make it great for both experienced developers and beginners. Users can easily connect to local or remote PostgreSQL servers, as well as major cloud platforms, and synchronize connection settings and queries with the cloud to access essential functionality from anywhere.
Pros:
- High level of security
- Job Scheduler to improve team productivity
- Navicat Cloud for team collaboration on its own cloud platform
- Data Modeling tool for easy database visualization, creation, and editing
- Import/export of data in various formats

Cons:
- Slow performance issues
- Manual refresh required for new rows
- High cost, especially for smaller teams
- Data modeling and charts only available in the highest-tier plan

Price: Paid perpetual licenses start from $229; subscription-based licenses start from $119.99 per year. A free trial is available for 14 days.

[Download Navicat for PostgreSQL](https://www.navicat.com/en/products/navicat-for-postgresql)

OmniDB

OmniDB is an open-source web tool designed to simplify database development tasks with a particular focus on PostgreSQL. Despite being open source, it provides a variety of professional features for SQL development typically found in premium IDEs, such as code auto-completion tools, charts, and debugging features. The lightweight and intuitive interface makes it a valuable asset for database developers.

Pros:
- User-friendly connection management
- Smart SQL editor with code auto-completion
- Efficient debugger for functions and procedures
- Customizable charts displaying real-time database metrics
- High-level data encryption

Cons:
- Lack of community support and insufficient documentation
- Better suited for individual developers than for teams
- Reduced functionality compared to other IDEs
- No data import/export capabilities

Price: Free

[Download OmniDB](https://github.com/OmniDB/OmniDB)

HeidiSQL

HeidiSQL is a user-friendly, free, and open-source solution with a convenient graphical interface for managing databases on PostgreSQL and other popular database management systems. It is lightweight and straightforward to operate. Although it may not have all the advanced features of paid IDEs, it provides essential tools for editing queries, managing data and tables, troubleshooting, and managing database users.
Pros:
- Simple installation process
- Multiple connections in a single window
- Data export to XLS, HTML, JSON, and PHP
- SSH support
- Encrypted server-client connection

Cons:
- Limited functionality compared to advanced IDEs
- Unstable performance reported
- Insufficient features for complex tasks

Price: Free (open source)

[Download HeidiSQL](https://www.heidisql.com/)

Beekeeper Studio

Beekeeper Studio has gained popularity as a software solution for managing database tasks on PostgreSQL. Its user-friendly graphical interface offers all the necessary functionality without requiring the user to manually write code or memorize keyboard shortcuts. With Beekeeper Studio, performing standard database tasks such as executing SQL queries or updating a table is effortless and quick.

Pros:
- Strong privacy and security measures
- Tab-based interface for easy database management
- Separate tabs for table DDL and data views
- Ability to save and organize frequently used queries
- Flexible data export options, including support for CSV, JSON, JSONL, and SQL formats

Cons:
- Not recommended for database administration
- Absence of database performance monitoring

Price: Starts from $7/month per user. A free trial is available for 14 days.

[Download Beekeeper Studio](https://www.beekeeperstudio.io/)

TablePlus

TablePlus is a widely used GUI-based software designed for database developers. It allows them to manage multiple databases quickly and securely on PostgreSQL and other database systems. An intuitive and user-friendly GUI simplifies the tasks of browsing, querying, and editing data. Moreover, users can customize the UI, configure shortcuts, and install Java-based plugins as needed, enhancing the software's versatility and utility.
Pros:
- Multiple security features
- Built-in support for SSH and TLS
- Smart SQL editor with diverse output splitting
- Dark theme option
- High performance and speed

Cons:
- Insufficient support and documentation
- Limited free features
- Potential inconsistency due to UI customization

Price: Starts from $89.00 per user, one-time purchase.

[Download TablePlus](https://tableplus.com/)

SQLGate

SQLGate is a robust yet user-friendly IDE designed for PostgreSQL and other SQL databases. It simplifies the process of building and managing databases, as well as all tasks related to handling vast amounts of data. While SQLGate is primarily used by database managers and analysts, its versatile functionality also makes it beneficial for developers.

Pros:
- Direct database connections
- Effortless export of massive amounts of data to Excel
- Support for multiple languages
- Four different themes to customize the interface
- Ability to view, filter, sort, and group results in the grid

Cons:
- Expensive licensing
- Limited advanced features for the price
- Restricted functionality in the free edition

Price: $500 one-time purchase (Enterprise edition).

[Download SQLGate](https://www.sqlgate.com/)

Most popular functionalities of PostgreSQL GUI tools

The primary objective of every PostgreSQL GUI client is to provide its users with the necessary functionality and simplify their daily work processes. Since a significant portion of daily tasks are standard, they should be delegated to software tools. The most popular GUI solutions for Postgres allow you to manage such tasks efficiently.

SQL editor
This feature aims to help database developers accelerate their coding process and improve the quality of their output. SQL editors generally offer code auto-completion, syntax checking, code formatting, and debugging functionality. While not every tool in our review has all of these options available, the most robust solutions, such as dbForge Studio for PostgreSQL, offer all of them.
Query Profiler
With this feature, users can fine-tune query performance on PostgreSQL. The functionality includes query analysis, profiling, optimization, and visual comparison of query results. Consequently, database administrators can reduce the time spent on diagnostics and debugging and streamline queries before executing them.

Data import/export
Daily operations frequently require the relocation of data between different storages. Using an advanced PostgreSQL GUI tool can make these data migration tasks effortless. Task templates for import and export, as well as the advanced mapping capabilities available in dbForge Studio for PostgreSQL, significantly simplify such jobs.

Data Generator
Software testing demands a supply of high-quality, realistic test data. To meet this need, powerful data generators have been developed to provide database specialists with the required volumes and types of data. However, such advanced functionality is typically found only in the most robust integrated development environments (IDEs).

Backup and restore
Performing regular backups and database restoration are fundamental tasks for every database specialist, regardless of the database system they work with. A PostgreSQL GUI client enables users to configure and schedule these routine tasks conveniently.

Connection management
One of the most essential features for many users is the ability to establish, edit, and manage multiple connections to servers and databases. The top PostgreSQL GUI tools offer this feature along with other benefits, such as color markers for distinct connections, grouping functionality, and more.

Data visualization and reporting
Effective reporting is essential for assessing current performance and devising future strategies. A proficient PostgreSQL GUI client must provide data visualization capabilities, including customizable reports with charts, pivot tables, diagrams, and the ability to save reports in various formats.
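To make the Query Profiler idea concrete, here is a toy sketch of the mechanism such tools wrap in a visual UI: asking the engine for a query plan before and after an optimization. It uses Python's standard-library sqlite3 module as a stand-in (real profilers target live PostgreSQL servers and add timing, plan diffing, and charts); the table and index names are invented for the illustration.

```python
import sqlite3

# A toy version of what a GUI query profiler automates: compare the
# execution plan of a query before and after adding an index.
# (Table and index names are invented for this illustration.)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN returns one row per step; the last column
    # holds the human-readable step description.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # plan reports a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # plan now reports an index search
```

A GUI profiler performs the same comparison against live database plans and renders the difference visually, so a missing index can be spotted without reading plan text by hand.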
Other features are also in high demand among PostgreSQL developers and administrators. For instance, schema and data compare functionality is essential for database development, but it is present only in the most powerful IDEs.

How to make the right choice

If you are working with PostgreSQL databases and need a GUI solution to enhance your work, you will certainly have a wide variety of options. How do you choose the best PostgreSQL GUI for your needs? It is the first question that arises when you consider which GUI tool for Postgres to adopt. Consider the following factors:

Functionality
Check whether the software has all the tools and features required for your work. Note that the same software solution may have different editions with different functionality.

Cost
There are both free and paid tools, and they vary in functionality significantly. Paid tools often offer different models, like subscriptions or one-time payments. Check all these options.

Support and documentation
Make sure that comprehensive documentation is available for the software, and check the support options; there might be community forums, regular support from professional managers, and so on.

Compatibility
Make sure that the PostgreSQL GUI client you are considering is compatible with your Postgres version. Also, check for server and cloud platform support.

Security
Databases often store a lot of sensitive data, and you need to protect it. Ensure that the software you target provides the necessary security features (encryption, secure connections, user permissions, etc.).

Performance
Check how the Postgres GUI consumes system resources and how it processes large amounts of data and complex queries. The software should not slow your system down.

Your first step is to define what you want to achieve with the software and what your environment restrictions are. Then check the available options and decide whether they match your capacities and requirements.
Conclusion

Even the best PostgreSQL GUI tools can't substitute for proper technical knowledge. Still, decent GUI solutions enhance the work experience. Whether you are a seasoned Postgres pro or a newcomer, these cutting-edge tools will help you master your tasks and raise your effectiveness. [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is one of the most favored solutions among Postgres specialists: a fully functional IDE that helps you develop, manage, and administer databases, with plenty of options and powerful reporting capacities. Moreover, a [free trial of this tool is available for 30 days](https://www.devart.com/dbforge/postgresql/studio/download.html), so you can test the IDE under full load. And if your work involves other RDBMSs besides PostgreSQL, you might consider the [multi-database dbForge Edge](https://www.devart.com/dbforge/edge/), the latest solution presented by Devart, with support for all major systems (SQL Server, MySQL, Oracle, and PostgreSQL).
9 Best ORM Solutions for .NET: Comparison Guide
By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), January 2, 2025

Building a .NET application that interacts with a database? You'll need a way to manage data efficiently.
While you can still write raw SQL queries manually, that approach can get you into all kinds of trouble: slower development because of repetitive code, performance issues caused by clumsy queries, and higher maintenance costs whenever your database schema changes. That's where ORM (Object-Relational Mapping) comes in, making database interactions faster, safer, and easier to maintain, as it:

- Eliminates repetitive SQL writing. With an ORM, instead of writing raw SQL, you work with C# objects and methods to fetch, update, and delete data.
- Ensures security against SQL injection. ORMs use parameterized queries, preventing common injection attacks.
- Handles relationships and transactions. ORMs manage foreign keys, joins, and transactions, reducing the risk of broken queries or inconsistent data.
- Assists in performance optimization. ORMs optimize queries with lazy loading, eager loading, and caching to improve speed. Moreover, some ORMs allow fine-tuning to prevent unnecessary database calls.
- Provides schema flexibility. If you change the database structure, you don't need to rewrite every query, since ORMs can adapt using migrations.

It goes without saying that ORMs can become perfect allies in building .NET applications, saving you a great deal of time and effort. However, when you check which options are available on the market, you can easily be overwhelmed by the volume of information, some of it outdated or irrelevant to your project. For one, if your project requires simple CRUD operations and quick development, Entity Framework Core is the best choice; but there are also faster options that reportedly provide better performance. In this article, we'll overview the top 9 .NET ORMs and help you pick the best ORM for .NET Core for your future projects.
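The "work with C# objects and methods" point can be illustrated with plain LINQ to Objects. The `Customer` class, the collection, and the data below are made up for illustration; with a real ORM such as EF Core, the same `Where`/`Select` expression would be written against a `DbSet<Customer>` and translated into a parameterized SQL query:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string City { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        // In an ORM this would be a DbSet<Customer> backed by a table;
        // here it is an in-memory list so the sketch runs on its own.
        var customers = new List<Customer>
        {
            new Customer { Id = 1, Name = "Alice", City = "Berlin" },
            new Customer { Id = 2, Name = "Bob",   City = "Prague" },
            new Customer { Id = 3, Name = "Carol", City = "Berlin" },
        };

        // Instead of "SELECT Name FROM Customers WHERE City = @city",
        // you express the query against objects:
        var berliners = customers
            .Where(c => c.City == "Berlin")
            .Select(c => c.Name)
            .ToList();

        Console.WriteLine(string.Join(",", berliners)); // Alice,Carol
    }
}
```

The same expression tree approach is what lets an ORM parameterize values for you, which is where the SQL-injection protection mentioned above comes from.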
Table of contents: Criteria for selecting the best .NET ORM · Top 9 .NET ORM solutions · Comparison · Best practices for using ORMs in .NET · Enhancing ORM with Devart tools · Conclusion

Criteria for selecting the best .NET ORM

Choosing the right .NET ORM isn't about picking the "best" one; it's about finding the right fit for your project. Consider your application's needs, database structure, and performance requirements before making a decision. Here are some criteria you should weigh before committing to any of the options.

1. Use case. Your choice should align with your project's specific requirements. Some ORMs are better suited for high-performance applications, while others prioritize flexibility. Define your priorities before deciding.
2. Expected number of users. The choice of a .NET ORM depends a lot on how your app is going to be used, and especially on the number of users it's going to have. In a single-user scenario, most ORMs will work. In multi-user apps, however, you should consider how the ORM behaves when several users modify data at the same time.
3. Locking and caching. Some ORMs cache data to boost performance but risk showing outdated information. Others work directly with the database for real-time accuracy.
4. Application type. Particular .NET ORMs are a better fit for web apps, since they can handle high request loads and deliver top performance, while for desktop apps you can opt for more feature-rich ORM solutions.
5. Support of database features. Most ORMs use generic SQL to stay compatible with multiple databases, but that can mean losing access to database-specific features (full-text search, advanced indexing).
6. Handling of data model updates. Some .NET ORMs automate database migrations, saving time, while others require manual updates, offering more control but adding maintenance overhead. Choose an ORM based on your team's workflow and the project's complexity.
7. Dependencies in ORM-specific data objects. Some ORMs tie your application logic to ORM-specific objects, making it harder to switch databases later. Others allow POCOs (Plain Old CLR Objects), keeping data models separate from ORM logic, but require more setup.
8. Ability to fetch data from a remote server. If your app fetches data from a remote server, some ORMs handle this more efficiently than others. Look into how well each one manages lazy loading, batching, and serialization.
9. Learning curve. Comprehensive documentation and community support can save you when issues arise. Check forums, GitHub issues, and commercial support options before committing.

Picking the right .NET ORM can save you development time, reduce maintenance headaches, and improve performance. Define your project's needs first, then find the ORM that fits you best; you don't have to settle for a one-size-fits-all solution.

Top 9 .NET ORM solutions

Let's explore the top proven ORM solutions for .NET in 2025:

1. Entity Framework Core (EF Core)

EF Core is a modern, cross-platform .NET ORM that simplifies database access in applications. It supports multiple databases, LINQ queries, automatic schema migrations, and change tracking, making it a powerful choice for developers who want to reduce manual SQL writing. Key features:

- Productivity. Eliminates the need for hard-coded SQL by enabling LINQ-based queries, lowering the learning curve.
- Maintainability and security. Built-in SQL query management reduces the risk of SQL injection.
- Cross-platform support. Runs on .NET Core, making it compatible with Linux and Windows, and supports multiple databases such as SQL Server, PostgreSQL, MySQL, and SQLite.
- Performance optimization. Supports eager and lazy loading to optimize data retrieval, preventing unnecessary queries.
- Automated change tracking. Keeps entity state synchronized with the database, reducing manual coding efforts.
Learn more about [EF Core](https://www.devart.com/dotconnect/what-is-entity-framework-core.html) strengths and architecture specifics, and check the best practices for using it in your project.

2. Dapper

Dapper is a lightweight, high-performance micro-ORM designed for developers who need direct SQL control while benefiting from object mapping. Listed as the best ORM for C# by many developers, this solution provides simple yet powerful data access capabilities without the overhead of a full-fledged ORM. Key features:

- Performance-focused. One of the fastest ORMs, ideal for high-performance applications with low latency needs.
- Minimal overhead. Lightweight and requires only a few lines of code to execute queries.
- Raw SQL control. Allows direct execution of SQL queries and stored procedures, offering complete database flexibility.
- Advanced query handling. Supports multi-mapping, bulk inserts, and batching multiple queries in a single call.
- Asynchronous support. Enables efficient execution of database operations with async/await patterns.

3. NHibernate

NHibernate is a mature and highly flexible .NET ORM that is widely used in enterprise applications. It offers advanced mapping capabilities, extensive customization options, and support for multiple databases. However, it has a steep learning curve and comparatively sparse documentation. Key features:

- Advanced mapping capabilities. Supports complex object-relational mappings with fine-grained control.
- Database flexibility. Works with multiple databases and supports various ID generation strategies (Identity, Sequence, etc.).
- Caching mechanisms. Includes first- and second-level caching for optimized performance.
- Unit of Work implementation. Ensures efficient transaction handling and data consistency.
- Mature ecosystem. Well-established with a strong enterprise presence, offering stability and reliability.
Explore the detailed [comparison between NHibernate and EF Core](https://blog.devart.com/nhibernate-vs-entity-framework-core-a-comprehensive-comparison-for-c-net-developers.html), two leading object-relational mapping frameworks.

4. Marten

Marten is a powerful .NET ORM built specifically for PostgreSQL, offering a hybrid approach that combines traditional relational database features with NoSQL-style document storage. It simplifies schema management and is ideal for applications that require JSON document storage, event sourcing, and full-text search. Key features:

- JSON document storage. Leverages PostgreSQL's advanced JSON capabilities for flexible data modeling.
- Event sourcing support. Provides built-in event store functionality for CQRS-style architectures.
- Hybrid persistence. Supports relational, document, and key/value storage within the same database.
- Optimized querying. Compiled queries offer superior performance compared to traditional ORM LINQ translation.
- Flexible schema management. Reduces the need for complex mapping and database migrations.

5. LinqConnect

LinqConnect by Devart is a powerful ORM for .NET designed to extend the functionality of LINQ to SQL. It offers comprehensive ORM capabilities with native support for SQL Server, Oracle, MySQL, PostgreSQL, and SQLite. Key features:

- LINQ provider. Unlike most ORMs, LinqConnect is designed as a LINQ provider, which ensures higher performance and better LINQ support without extra layers between LINQ and SQL.
- Flexible development approaches. Supports code-first, database-first, model-first, and combinations of database-first and model-first.
- Wide integration. Seamlessly integrates with various Microsoft technologies, including Windows Forms, ASP.NET WebForms, WPF, and WCF RIA Services. It also supports LinqDataSource for ASP.NET web applications and data binding for Windows Forms and WPF.
- Mapping support. Supports complex types, many-to-many associations, and inheritance (TPT, TPH).
- Rich runtime features. Smart change tracking for better data consistency, advanced error handling for easier troubleshooting, extensibility for customizing CRUD operations, POCO support for a clean and flexible domain model, and lazy and eager loading that can be customized on a per-query basis.
- Performance optimizations. Offers compiled query caching, batch updates, and other performance improvements that ensure faster execution times with minimal code changes.

6. RepoDB

RepoDB is a hybrid ORM for .NET that combines Dapper's flexibility with the automation of Entity Framework Core. It offers both raw SQL execution for performance and simplified CRUD operations to reduce boilerplate code. Key features:

- Hybrid approach. Mixes micro-ORM efficiency with full-ORM capabilities.
- Optimized performance. Lightweight, fast, and memory-efficient for high-speed data processing.
- Asynchronous operations. Supports async CRUD, queries, and transactions.
- Flexible querying. Provides both method-based operations and direct SQL execution.
- Advanced mapping. Includes type mapping, field mapping, and multiple result set mapping.
- Caching and tracing. Features built-in MemoryCache for second-level caching and query tracing.
- SQL statement builder. Simplifies query generation for better maintainability.

7. OrmLite

OrmLite is a lightweight, high-performance ORM designed to simplify data access while maintaining flexibility. It focuses on minimal configuration, safety, and high efficiency, making it a great choice for developers who want control over SQL while avoiding excessive boilerplate code. Key features:

- Bulk inserts. Efficiently insert large datasets with optimized performance.
- Reference support. Easily define relationships between entities.
- Typed join expressions. Enables strongly typed LINQ-style joins.
- Dynamic result sets. Retrieve flexible, dynamic data structures.
- Database transactions. Supports ACID-compliant transaction management.
- Type converters. Custom serialization for complex data types.

8. PetaPoco

PetaPoco is a lightweight, high-performance micro-ORM for .NET designed for simplicity, speed, and flexibility. It takes the best from multiple ORMs while maintaining minimal overhead and zero dependencies. Key features:

- Tiny and fast. Uses dynamic method generation (MSIL) for quick property assignments.
- Async and sync support. Perform database operations asynchronously or synchronously.
- Pure POCOs. Works with undecorated or attributed POCOs for flexibility.
- Fluent configuration. Easy setup without extensive boilerplate.
- CRUD helper methods. Built-in Insert, Delete, Update, Save, and IsNew methods.
- SQL-first approach. Directly write SQL queries without ORM overhead.
- SQL builder. Low-friction SQL helper for inline query writing.
- Logging and customization. Hooks for exception logging, value converters, and custom mappings.
- Wide database compatibility. Supports SQL Server, SQLite, MySQL, PostgreSQL, Firebird, MariaDB, MS Access, and more.
- Cross-platform. Works with .NET Standard 2.0, .NET 4.0/4.5+, and Mono 2.8+.

9. XPO

XPO is a blazing-fast ORM for .NET designed for code-centric development with minimal database complexity. It supports Code-First, Model-First, and Database-First workflows, making it a versatile choice for .NET developers. Key features:

- High performance. Optimized for speed, allowing efficient data retrieval and persistence.
- Transparent object-relational mapping. Uses .NET Reflection and attributes to map business objects to relational databases.
- CRUD and query support. Supports LINQ queries, object queries, and calculated conditions for flexible data retrieval.
- Built-in caching. Uses MemoryCache for enhanced performance.
- Transaction and concurrency support. Implements Unit of Work, optimistic locking, and nested transactions for data consistency.
- Binding and pagination. Simplifies UI development by binding paginated object collections to controls.
- Multi-database support. Compatible with SQL Server, MySQL, PostgreSQL, Oracle, SQLite, Firebird, and more.
- Custom constraints and indexes. Allows developers to define custom constraints and database indexes using metadata attributes.

Comparison

.NET ORM frameworks have different strengths and specifics that can make one a perfect fit for one project and use case, and inapplicable to another. Since they can be difficult to compare side by side, check the table below to find the best ORM for the .NET project you are handling.

| Product | Type | Support for SQL | Database Support | Model-First or Code-First | Performance Optimization |
|---|---|---|---|---|---|
| Entity Framework Core (EF Core) | Full-ORM | LINQ + SQL | SQL Server, PostgreSQL, MySQL, SQLite, etc. | Both (Code-First, Database-First, Model-First) | Second-level cache, lazy loading, compiled queries |
| Dapper | Micro-ORM | Raw SQL | SQL Server, MySQL, PostgreSQL, etc. | Code-First | Minimal overhead, high performance for raw SQL |
| NHibernate | Full-ORM | HQL, LINQ | SQL Server, MySQL, PostgreSQL, etc. | Both (Code-First, Database-First, Model-First) | Second-level cache, lazy loading, optimized for large datasets |
| Marten | Full-ORM | LINQ + SQL | PostgreSQL | Code-First | Optimized for event sourcing, eventual consistency |
| LinqConnect | Full-ORM | LINQ-based | PostgreSQL, SQL Server, MySQL, SQLite | Both (Code-First, Database-First, Model-First) | Compiled query cache, batch updates |
| RepoDB | Hybrid-ORM | Raw SQL | SQL Server, MySQL, PostgreSQL, etc. | Code-First | Batch updates, high performance, optimized SQL execution |
| OrmLite | Micro-ORM | Raw SQL + POCO support | SQL Server, PostgreSQL, MySQL, SQLite | Code-First | Memory caching, fast CRUD operations |
| PetaPoco | Micro-ORM | Raw SQL + POCO support | SQL Server, PostgreSQL, MySQL, SQLite | Code-First | Memory caching, high performance, optimized for raw SQL |
| XPO | Full-ORM | LINQ + SQL | SQL Server, MySQL, PostgreSQL, SQLite, Oracle, etc. | Both (Code-First, Database-First, Model-First) | Compiled queries, batch updates, caching |

Best practices for using ORMs in .NET

On your quest to find the best ORM for .NET Core, you might already have some options in mind. However, having a perfect ORM doesn't mean you will build a scalable and sustainable app at the end of the day: there are pitfalls in relying on ORMs too heavily or using them where they aren't necessary. The focal point of using an ORM for .NET is that it should help you keep things simple, efficient, and flexible as you map objects to relational data. However, sometimes this isn't the case. Here's how you can get the most out of your ORM solution, ensuring clean, performant, and easy-to-maintain code.

1. Keep things simple. Complex configurations or additional steps (like source code generation) can add overhead, increase maintenance, and make the ORM harder to use. A good ORM should be intuitive and not require complex configurations, preprocessing, or code generation. Minimize unnecessary steps, such as creating many additional files for mappings or configurations. This ensures the ORM actually improves productivity instead of creating new headaches.

2. Non-intrusive object model. Many ORMs impose certain rules or require special classes (e.g., base classes or proxy objects) that make it harder to integrate the ORM into your existing object model. Use POCOs (Plain Old CLR Objects) for your business logic and avoid requiring superclasses or proxies in your object models. This keeps your objects clean and focused solely on business logic, without introducing unnecessary dependencies on the ORM.

3. Flexibility in object modeling. Some ORMs impose limitations on how you model your domain, such as restricting class relationships or requiring you to flatten models.
Your ORM should allow you to model your domain as you see fit, supporting class hierarchies, relationships (one-to-one, one-to-many), and operations such as lazy loading and deep fetching. The ORM should not impose restrictions on your business logic.

4. Declarative and simple mapping. Mapping between the object model and the database schema can get messy if not organized properly, and maintenance overhead grows when mappings are embedded in code. Keep your mappings clear and easy to manage. Avoid putting mappings directly in your source files, as that clutters your code and requires constant recompilation; instead, store mapping specifications externally in a simple, declarative format such as XML or annotations, without overloading the system.

5. Control object modifications. Automatically persisting changes to the database can lead to unintended data loss or corruption, as you might not want all changes to be saved. Avoid making assumptions about the persistence of an object: developers should explicitly control when an object is updated, inserted, or deleted. This ensures the ORM doesn't unintentionally modify data that wasn't meant to be persisted.

6. Avoid overly complex query languages and APIs. Some ORMs introduce complex query languages or APIs that can be cumbersome and difficult to understand. SQL is powerful and widely understood, so leverage its capabilities through LINQ or raw SQL queries to keep things easy to work with.

7. Use direct database access when needed. While the ORM should handle most tasks, you may still need direct access to the database. For such cases, make sure there are simple mechanisms for executing raw SQL queries without complicating the ORM interface.

Enhancing ORM with Devart tools

Devart offers a suite of tools that significantly enhance the capabilities of ORM in .NET, particularly for PostgreSQL databases.
Entity Developer [Entity Developer](https://www.devart.com/entitydeveloper/) is a powerful ORM designer for ADO.NET Entity Framework, NHibernate, LinqConnect, Telerik Data Access, and LINQ to SQL. It simplifies the design, development, and maintenance of ORM models through its visual designer and code generation capabilities. For PostgreSQL, Entity Developer ensures that developers can use the full power of their ORM with enhanced productivity and reduced development time. If you’re evaluating different ORM tools for .NET, you might find this comparison between [NHibernate vs Dapper](https://blog.devart.com/nhibernate-vs-dapper-which-one-should-you-choose-for-net-development.html) helpful to better understand their strengths and use cases. dotConnect [dotConnect](https://www.devart.com/dotconnect/postgresql/) is an enhanced ORM-enabled data provider that builds on the [ADO.NET technology](https://www.devart.com/dotconnect/what-is-ado-net.html) . It supports a vast range of ORM solutions, including Entity Framework, NHibernate, and LinqConnect. dotConnect enhances ORM performance with your databases by providing advanced features such as secure SSL connections, PostgreSQL notifications, PostgreSQL bulk data loading, GEOMETRY, PostgreSQL ARRAY types, and others. Conclusion When choosing the best ORM for .NET, remember that there’s no one-size-fits-all solution. Each ORM brings its own set of strengths and trade-offs, whether it’s code-first, model-first, or database-first approaches. What’s most important is finding the one that aligns with your development philosophy and project needs. Using an ORM for .NET projects, when done right, reduces bugs, enhances security, and simplifies maintenance, but the choice of the ORM is only the first step to successful app development. Make this choice wisely based on your project’s specifics, and use the best practices of object-relational mapping for top results and balanced performance. 
Also, having an ORM designer to complete your toolset might help you with faster, mistake-free development. Check [Entity Developer](https://www.devart.com/entitydeveloper/) from Devart to learn how to combine it with EF Core, NHibernate, and other popular ORMs to design ORM models faster. By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), a connectivity enthusiast always on the lookout for smarter ways to link platforms and systems.
"} {"url": "https://blog.devart.com/best-practices-in-using-the-dbcontext-in-ef-core.html", "product_name": "Unknown", "content_type": "Blog", "content": "Best Practices in Using the DbContext in EF Core By [dotConnect Team](https://blog.devart.com/author/dotconnect) October 19, 2022 Entity Framework Core is an open-source, popular, lightweight, and flexible cross-platform ORM. In Entity Framework Core (also called EF Core), a DbContext is an object that coordinates queries, updates, and deletes against your database, taking care of performing CRUD operations for you. The DbContext, the central object in Entity Framework Core, is a gateway to your database. It provides an abstraction layer between the domain model and EF Core. This abstraction helps you keep your models independent from EF Core and lets you easily switch persistence providers if needed. This article covers DbContext concepts and the best practices you can follow to improve the performance and scalability of your applications built with ASP.NET 6. We'll use a PostgreSQL database with the Devart data provider for PostgreSQL to store and retrieve data.

Getting Started

First off, we need a database against which the queries will be executed. For the sake of simplicity, instead of creating our own database, we will use the Northwind database.
If you don't have the Northwind database available, you can get the scripts here: [https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/northwind-pubs](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/northwind-pubs).

Next, create a new .NET Core Console Application project and add the Devart.Data.PostgreSql NuGet package to it. It can be installed either from the NuGet Package Manager tool within Visual Studio or from the NuGet Package Manager Console using the following command:

PM> Install-Package Devart.Data.PostgreSql

Next, create a custom DbContext class named MyDbContext in a file of the same name with a ".cs" extension, and replace the default generated code with the listing below:

```csharp
public class MyDbContext : DbContext
{
    public MyDbContext()
    {
    }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
    }

    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
}
```

We'll use this class to perform CRUD operations in the subsequent sections of this article.

What is DbContext? Why do we need it?

The concept of a DbContext has been around since the first version of Entity Framework. It is one of the fundamental building blocks of Entity Framework and Entity Framework Core, used for creating, querying, and updating data in a database. A DbContext represents a collection of entity sets or tables in your database. It allows us to write strongly typed code against our database without having to deal with low-level SQL statements or manually map classes to database tables. The DbContext is not just another object: it is one of the core objects in Entity Framework Core, and it has a lot of responsibilities to manage.
You'll use it for things like:

- Accessing the database
- Managing connections to the database
- Managing transactions with the database (including setting up new ones, rolling back existing ones, or even canceling them)
- Change tracking
- Reading and persisting data
- Querying for entity types using LINQ syntax

There Should Be Only One DbContext Instance per Request/Unit of Work/Transaction

A DbContext represents a single unit of work and serves as the gateway to all data access, so you should not instantiate more than one context for the same unit of work. If you need multiple database connections or have multiple active contexts operating in parallel, use the DbContextFactory class instead.

Avoid Disposing DbContext Instances

In most cases, you should not dispose of DbContext objects yourself. Although DbContext implements IDisposable, it usually should not be manually disposed of or wrapped in a using statement: the DbContext controls its own lifespan, and after your data access request is complete, it will close the database connection for you.

Don't Use Dependency Injection for Your DbContext Instances

You should not inject a DbContext instance into domain model objects. This rule aims to keep the code clean and DRY and make it easier to test. Hence, inject the actual instance of the class that has a dependency if you want to use dependency injection: instead of injecting an instance of DbContext, inject an instance of type IUnitOfWork or another class that contains the dependencies the specified object type can use. Injecting a DbContext directly into your business logic classes makes their methods hard to unit test, as they would require a connection string and possibly some configuration settings to work.
So, avoid code such as this in your controllers or business logic classes:

```csharp
public class DemoController : ControllerBase
{
    private MyCustomDbContext _dbContext;

    public DemoController(MyCustomDbContext customDbContext)
    {
        _dbContext = customDbContext;
    }

    // Other methods
}
```

Keep Your Domain Objects Ignorant of the Persistence Mechanism

Persistence ignorance may be defined as the ability to persist and retrieve standard .NET objects without knowing the intricacies of how the data is stored and retrieved in the data store. Your domain model objects should not be aware of the persistence mechanism, i.e., how the data is persisted in the underlying data store. In other words, if your entities are persistence-ignorant, they shouldn't be concerned with how they're persisted, created, retrieved, updated, or deleted. This helps you focus on modeling your business domain. You can refer to the Customer or Order class given later in this article; both are POCO (an acronym for Plain Old CLR Objects) classes. The persistence logic should be encapsulated inside the DbContext: only the data access layer in your application should have access to the DbContext, and only the DbContext should be allowed to access the database directly. That said, you should apply persistence ignorance with caution. Here's what the Microsoft documentation says: "Even when it is important to follow the Persistence Ignorance principle for your Domain model, you should not ignore persistence concerns. It is still very important to understand the physical data model and how it maps to your entity object model. Otherwise, you can create impossible designs."

Split a Large DbContext into Multiple DbContext Instances

Since the DbContext represents your database, you may wonder whether the application should have only one DbContext.
In practice, this is not recommended. With a large DbContext, EF Core takes noticeably longer to get started, i.e., its startup time increases significantly. Therefore, rather than using one extensive database context, break it into several smaller ones, for example one DbContext per module or unit.

Disable Lazy Loading and Use Eager Loading for Improved Performance

Entity Framework Core uses any of the following three approaches to load related entities in your application:

- Eager loading: related data is loaded at the time the query is executed, using the Include() method.
- Explicit loading: related data is loaded explicitly at a later point in time, using the Load() method.
- Lazy loading: the default behavior, meant for delayed loading of related entities.

You should turn off lazy loading to improve performance using the LazyLoadingEnabled property, as shown in the code snippet below:

```csharp
ChangeTracker.LazyLoadingEnabled = false;
```

Instead, use eager loading to load the entities you need at once. The following code snippet illustrates eager loading with your custom DbContext instance:

```csharp
using (MyDbContext dbContext = new MyDbContext())
{
    var result = dbContext.Customers.Include("Orders")
        .Where(o => o.ShipCity == "New Jersey")
        .FirstOrDefault();
}
```

Disable Object Tracking Unless It Is Required

An ORM can manage changes between in-memory objects and the database; this feature is known as object tracking. Note that it is turned on by default for all entities, so you can modify those entities and then persist the changes to the database. However, you should be aware of the associated performance cost: unless necessary, you should disable object tracking. The following code sample shows how to use the AsNoTracking method to minimize memory use and increase speed.
```csharp
var order = dbContext.Orders.Where(o => o.ShipCountry == "India").AsNoTracking().ToList();
```

You may also deactivate query-tracking behavior at the database context level. This deactivates change tracking for all entities associated with that database context.

```csharp
this.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
```

Here's the updated source code of our custom DbContext class named MyDbContext, with query tracking and lazy loading disabled:

```csharp
public class MyDbContext : DbContext
{
    public MyDbContext()
    {
        ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
        ChangeTracker.LazyLoadingEnabled = false;
    }

    public MyDbContext(DbContextOptions<MyDbContext> options) : base(options)
    {
        ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
        ChangeTracker.LazyLoadingEnabled = false;
    }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
    }

    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
}
```

The source code of the Customer and Order classes is given below. (The ShipCity and ShipCountry properties used in the earlier queries are included so that the examples compile.)

```csharp
public class Customer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string ShipCity { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public DateTime OrderDate { get; set; } = DateTime.Now;
    public string ShipCountry { get; set; }
}
```

Use DbContext Pooling

A DbContext is usually a lightweight object: creating and disposing of one does not require a database round trip, and most applications can do so with little to no performance effect. However, each context instance sets up various internal services and objects required for executing its functions, and the overhead of doing so repeatedly can be detrimental to the application's performance. Here's exactly where DbContext pooling helps.
When a new instance is requested, it is returned from the pool rather than created from scratch; when you dispose of an instance, EF Core resets its state and places it back in the pool. With context pooling, context setup costs are incurred only once at application startup rather than on every use. You can leverage this built-in support for DbContext pooling in EF Core to enhance performance: it lets you reuse previously created DbContext instances rather than building them repeatedly. The following piece of code shows how you can enable this feature (UsePostgreSql is the extension method of the Devart EF Core provider for PostgreSQL):

```csharp
services.AddDbContextPool<MyDbContext>(
    options => options.UsePostgreSql(dbConnectionString));
```

Other Best Practices

- If you're dealing with vast amounts of data, i.e., large datasets, you should not return the entire result set. Instead, implement paging to return one page of data at a time.
- The DbContext instance is not thread-safe. Hence, you should not use multiple threads to access the same DbContext instance simultaneously.
- You can batch queries in EF Core to minimize round trips and improve performance.
- Another way to improve query performance in EF Core is by using compiled queries.
- Take advantage of the execution plans of your queries to fine-tune performance and understand bottlenecks. An execution plan comprises the series of operations performed by the database to fulfill a request. Check the execution plans of your queries and their costs (CPU, elapsed time, etc.); you can then choose the best strategy and modify a query before using it again.

Summary

From a conceptual perspective, the DbContext is similar to the ObjectContext and represents a hybrid of the unit of work and repository design patterns.
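Two of the practices listed above, paging and compiled queries, can be sketched as follows. This is a minimal illustration, not part of the article's original code: QueryHelpers, GetOrdersPage, and OrdersByCountry are hypothetical names, and the sketch assumes the MyDbContext and Order types shown earlier (including the ShipCountry property used in the no-tracking example):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public static class QueryHelpers
{
    // Paging: return a single page of orders instead of the entire result set.
    public static List<Order> GetOrdersPage(MyDbContext dbContext, int pageIndex, int pageSize)
    {
        return dbContext.Orders
            .AsNoTracking()
            .OrderBy(o => o.Id)          // paging requires a stable sort order
            .Skip(pageIndex * pageSize)  // skip the rows of the previous pages
            .Take(pageSize)              // fetch only one page of rows
            .ToList();
    }

    // Compiled query: EF.CompileQuery translates the LINQ expression once
    // and caches the result, avoiding re-translation on every execution.
    public static readonly Func<MyDbContext, string, IEnumerable<Order>> OrdersByCountry =
        EF.CompileQuery((MyDbContext context, string country) =>
            context.Orders.Where(o => o.ShipCountry == country));
}
```

A call such as QueryHelpers.OrdersByCountry(dbContext, "India") then reuses the cached query translation instead of recompiling the LINQ expression. In all of these snippets, the DbContext remains the single gateway to the database.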
You can use it for any database interaction in an application that leverages EF Core in its data access layer. When using DbContext in EF Core, there are a few best practices to follow in order to maximize the efficiency and effectiveness of working with EF Core in your applications. Tags [ADO.NET](https://blog.devart.com/tag/ado-net) [ASP.NET](https://blog.devart.com/tag/asp-net) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [PostgreSQL](https://blog.devart.com/tag/postgresql) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fbest-practices-in-using-the-dbcontext-in-ef-core.html) [Twitter](https://twitter.com/intent/tweet?text=Best+Practices+in+Using+the+DbContext+in+EF+Core&url=https%3A%2F%2Fblog.devart.com%2Fbest-practices-in-using-the-dbcontext-in-ef-core.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/best-practices-in-using-the-dbcontext-in-ef-core.html&title=Best+Practices+in+Using+the+DbContext+in+EF+Core) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/best-practices-in-using-the-dbcontext-in-ef-core.html&title=Best+Practices+in+Using+the+DbContext+in+EF+Core) [Copy URL](https://blog.devart.com/best-practices-in-using-the-dbcontext-in-ef-core.html) RELATED ARTICLES [How To](https://blog.devart.com/category/how-to) [Database Protection Guide: Best Practices for Ensuring Database Security](https://blog.devart.com/database-security.html) May 5, 2025 [How To](https://blog.devart.com/category/how-to) [Database Normalization 
in SQL: Key Steps, Benefits, and Examples](https://blog.devart.com/database-normalization.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Check Constraints for Data Integrity](https://blog.devart.com/how-to-use-sql-check-constraints.html) May 5, 2025

[Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) 7 Best Tools to Compare Two SQL Server Databases – Free and Paid By [Dereck Mushingairi](https://blog.devart.com/author/dereckm) February 27, 2025

As someone who’s worked with SQL Server databases, you know there’s no margin for error. A single discrepancy between environments can cascade into a failed deployment, broken workflows, or inaccurate reports. And let’s face it, manually comparing schemas and data isn’t just tedious; it’s risky. This is where industry-leading database comparison tools come to the rescue, turning a frustrating, time-consuming process into a smooth and efficient one. This guide highlights the best SQL Server database comparison tools, selected for their ability to handle even the most complex environments. Whether you’re synchronizing schemas across regions or fine-tuning your production pipeline, these tools will help you work smarter, not harder. Read on!

Table of contents

- Tool comparison table
- Why compare SQL Server databases?
- Manual vs. automated comparison
- Tools for comparing SQL Server databases
- How to choose the right tool

Tool comparison table

Below is a quick preview of the top tools we’ll cover, their strengths, pricing models, and trial availability to help you make a quick comparison at a glance:

| Tool Name | Best For | Pricing | Free Trial | Key Features | Link to Website |
|---|---|---|---|---|---|
| dbForge Compare Bundle | Advanced data & schema comparison and synchronization | Starting at $269.95 | Yes | Schema and data comparison, integration with version control, backup and recovery | [Visit dbForge Compare Bundle](https://www.devart.com/dbforge/sql/compare-bundle/) |
| Redgate SQL Compare & SQL Data Compare | Developers and CI/CD workflows | Pricing upon request | Yes | Schema/data comparison, CI/CD pipeline integration, detailed comparison reports | [Visit Redgate SQL Compare](https://www.red-gate.com/products/sql-compare/) |
| XSQL Software | Comprehensive schema and data comparisons | Paid plans available | Yes | Side-by-side schema comparison, automated synchronization | [Visit XSQL Software](https://www.xsql.com/) |
| SQL Admin Studio | Database administrators managing backups and syncs | Paid plans available | Yes | Schema comparison, data synchronization, backup management | [Visit SQL Admin Studio](https://www.simego.com/Products/SQL-Admin-Studio) |
| SQL Server Database Comparison Tool | Budget-conscious users | Free | No | Basic schema and data comparison | [Visit SQL Server Database Comparison Tool](https://www.codeproject.com/Articles/205011/SQL-Server-Database-Comparison-Tool) |
| DB Comparer | Quick and lightweight comparisons | Free | No | Visual schema difference display | [Visit DB Comparer](https://dbcomparer.com/) |
| OpenDBDiff | Open-source enthusiasts | Free | No | Command-line schema comparison, customizable | [Visit OpenDBDiff](https://github.com/OpenDBDiff/OpenDBDiff) |

Now that you have a quick overview, let’s explore why comparing SQL Server databases is essential and dive deeper into the tools that can make it effortless.

Why compare SQL Server databases?
Comparing SQL Server databases is critical for keeping systems running smoothly and avoiding costly errors. Here are a few key scenarios where database comparison tools prove invaluable:

- Syncing development, staging, and production: Ensures consistency across environments, automating schema and data alignment to prevent deployment issues.
- Upgrades or migrations: Verifies schemas and data remain intact during transitions, avoiding data loss or misaligned configurations.
- Examining discrepancies: Quickly identifies and resolves mismatches in schema or data to maintain performance and accurate reporting.
- Disaster recovery validation: Confirms restored backups match the original database, safeguarding against missing or corrupted data.

These tools save time, reduce errors, and keep your systems reliable without the hassle of manual comparisons.

Manual vs. automated comparison

Manual methods: time-consuming and error-prone

While it’s possible to compare databases manually using SQL queries and scripts, this method comes with significant drawbacks:

- Complexity: Writing and maintaining scripts to compare large databases with intricate schemas is challenging, especially for teams with limited resources or expertise.
- Human error: Manual processes are prone to oversight, especially when handling extensive datasets or complex relationships between tables.
- Time investment: Comparing even moderately sized databases manually can take hours or even days, delaying critical tasks and impacting productivity.

Automated tools: efficient, accurate, and scalable

Database comparison tools eliminate the inefficiencies of manual methods by automating the process. Here’s why they’re the superior choice:

- Speed: Automated tools can compare entire databases within minutes, saving valuable time for teams working under tight deadlines.
- Accuracy: These tools are designed to detect even the smallest discrepancies in schema or data, ensuring precision and reliability.
- Scalability: Automated tools can handle large, complex databases with ease, making them ideal for enterprise environments and high-volume operations.
- User-friendly features: Many tools offer visual interfaces, intuitive workflows, and detailed reports that make database comparison accessible to users with varying levels of technical expertise.

Tools for comparing SQL Server databases

With automated tools, you can save time, reduce errors, and focus on high-priority tasks. Below is a list of the top tools for SQL Server database comparison, broken down into their features, pricing, and key benefits to help you find the perfect fit for your needs.

dbForge Compare Bundle for SQL Server

[dbForge Compare Bundle](https://www.devart.com/dbforge/sql/compare-bundle/) offers advanced schema and data comparison tools, making it a top choice for developers and DBAs. With integration options for Git and other version control systems, it’s perfect for maintaining consistent databases across multiple environments.

- Trial availability: Yes, free trial available.
- Pricing: Paid plans start at $269.95.
- Rating: G2: 4.7/5.

Key features

- Schema and data comparison with clear side-by-side views.
- Integration with Git, SVN, and other version control systems.
- Backup and recovery functionality for added security.
- Automation through CLI (Command Line Interface) support for scheduled database synchronization.
- Built-in DevOps and CI/CD support, allowing seamless integration into automated workflows.
- Custom scripting and report generation for tracking and documenting database changes.
Pros and cons

| Pros | Cons |
|---|---|
| Advanced features for schema and data comparison | Higher price point for small teams |
| Integration with version control systems like Git | Slight learning curve for beginners |
| Backup and recovery tools for enhanced security | Trial is time-limited, no free version beyond that |
| Supports automation and cloud platforms, ensuring smooth and efficient workflows | |

[Download dbForge Compare Bundle >](https://www.devart.com/dbforge/sql/compare-bundle/download.html)

Redgate SQL Compare & SQL Data Compare

Redgate SQL Compare & SQL Data Compare is tailored for DevOps workflows, with seamless CI/CD integration and detailed reports that simplify schema and data synchronization.

- Trial availability: Yes, free trial available.
- Pricing: Pricing available upon request.
- Rating: G2: 4.6/5.

Key features

- Schema and data comparison tools with rollback capabilities.
- DevOps integration with Azure DevOps, GitHub, and Jenkins.
- Clear and actionable comparison reports.

| Pros | Cons |
|---|---|
| Seamless CI/CD pipeline integration | Smaller teams may face higher rates |
| Rollback features prevent deployment errors | Some advanced features require extra Redgate tools |
| Clear reports for quick team communication | Initial setup may take time |
| Trusted and reliable for enterprise-level workflows | |

XSQL Software

XSQL Software provides user-friendly schema and data comparison with robust synchronization options. It’s a great choice for teams looking for a straightforward yet effective tool.

- Trial availability: Yes, free trial available.
- Pricing: Paid plans available.
- Rating: G2: 4.5/5.

Key features

- Visual schema comparison for easy identification of differences.
- Automated synchronization with minimal manual effort.
- Batch processing for managing multiple tasks.
| Pros | Cons |
|---|---|
| Simple and intuitive interface | Limited analytics integration |
| Reliable synchronization features for schemas and data | Not ideal for large-scale enterprise needs |
| Affordable for small to medium-sized businesses | Minimal customization |
| Batch processing for efficient multitasking | |

SQL Admin Studio

SQL Admin Studio simplifies schema comparison, data syncing, and backup management, making it an ideal tool for database administrators managing complex systems.

- Trial availability: Yes, free trial available.
- Pricing: Paid plans available.
- Rating: G2: 4.4/5.

Key features

- Schema comparison and data synchronization for consistency.
- Backup management tools to safeguard databases.
- Task automation for syncing and backups.

| Pros | Cons |
|---|---|
| Intuitive interface suitable for most users | Limited analytics tools |
| Automates routine tasks like backups and synchronization | Not designed for DevOps workflows |
| Reliable for managing database consistency | |

SQL Server Database Comparison Tool

A no-frills, budget-friendly option for small-scale projects. This free tool provides basic schema and data comparison functionality.

- Trial availability: Not applicable – Free tool.
- Pricing: Free.
- Rating: G2: 4.0/5.

Key features

- Basic schema and data comparison.
- Lightweight and efficient for smaller databases.
- Quick setup for fast results.

| Pros | Cons |
|---|---|
| Free and accessible | No automation or integration options |
| Lightweight and quick for simple comparisons | Not scalable for large projects |
| Works well for small, low-complexity tasks | Lacks advanced features like reporting and rollback |

DB Comparer

DB Comparer is a lightweight tool built for quick schema comparisons. Its simple, visual interface makes it a good choice for smaller projects.

- Trial availability: Not applicable – Free tool.
- Pricing: Free.
- Rating: G2: 4.3/5.

Key features

- Visual schema comparison for easy review.
- Lightweight design for minimal resource use.
- Quick setup and operation.
| Pros | Cons |
|---|---|
| Free and lightweight | Limited to schema comparison only |
| Intuitive visual interface | Cannot handle large or complex databases |
| Quick setup and operation | Lacks automation and data comparison |

OpenDBDiff

OpenDBDiff is an open-source, command-line tool for schema comparisons. It’s perfect for developers who want flexibility and customization.

- Trial availability: Not applicable – Free tool.
- Pricing: Free.
- Rating: G2: 4.1/5.

Key features

- Command-line interface for direct schema comparisons.
- Fully customizable as an open-source solution.
- Lightweight and resource-efficient.

| Pros | Cons |
|---|---|
| Open-source and completely free | Requires command-line expertise |
| Customizable for specific use cases | No graphical interface |
| Lightweight and efficient | Minimal community support |

These tools cover a wide range of needs, from quick and free solutions to feature-rich enterprise options. Choose the one that fits your workflow, and get those databases in sync!

How to choose the right tool

Selecting the right SQL Server database comparison tool depends on your specific requirements, budget, and workflow preferences. Here’s a simple guide to help you make an informed decision.

Define your needs

Understanding your role and priorities is crucial when choosing a tool. Different users have different requirements:

For developers: Developers often work in fast-paced environments where automation and seamless integration with DevOps pipelines are essential. Look for tools that offer:

- Integration with CI/CD workflows (e.g., Redgate SQL Compare & SQL Data Compare).
- Detailed comparison reports for schema and data differences.
- Easy rollback and deployment features.

For Database Administrators (DBAs): DBAs prioritize database integrity, synchronization, and backups. Tools that provide advanced administrative capabilities will be the best fit:

- Schema and data synchronization tools (e.g., SQL Admin Studio or dbForge Compare Bundle).
- Backup and recovery functionalities.
- Options to automate routine tasks like synchronization and backups.

Consider your budget

Your budget plays a significant role in selecting a tool. Here’s how to align your choice with your financial resources:

- Free tools for simple tasks: If you’re working on small projects or have minimal comparison needs, free tools like SQL Server Database Comparison Tool, DB Comparer, or OpenDBDiff can get the job done without any cost.
- Paid tools for enterprise needs: For more complex use cases, such as handling large-scale databases, integrating with cloud platforms, or managing enterprise workflows, paid tools like dbForge Compare Bundle, Redgate SQL Compare & SQL Data Compare, or SQL Admin Studio offer advanced features worth the investment.

Use trials

Before committing to a purchase, leverage free trials to evaluate a tool’s usability, performance, and features:

- Explore features: Test the schema and data comparison capabilities, as well as reporting and synchronization options.
- Assess ease of use: Ensure the interface is user-friendly and suits your level of expertise.
- Validate performance: Run comparisons on your existing databases to ensure the tool meets your speed and accuracy expectations.
- Compatibility: Check if the tool integrates seamlessly with your existing DevOps pipelines, cloud services, or version control systems.

By carefully considering your needs, budget, and hands-on experience during trials, you can confidently select a tool that aligns with your requirements. Whether you’re a developer, a DBA, or managing enterprise databases, there’s an option out there to help you streamline your workflow.

The takeaway

Whether you’re a developer working on CI/CD pipelines, a DBA managing critical backups, or a small business owner handling simple database tasks, having the right comparison tool can transform your workflows.
From syncing schemas across environments to validating disaster recovery efforts, these tools handle it all. Start by trying out the free trials, assess your unique needs, and choose the tool that fits your workflow. With the right solution, you’ll save time, avoid errors, and confidently manage your SQL Server databases. Find out more by visiting the [Devart Database Comparison Tools](https://www.devart.com/dbforge/compare-tools.html) page. Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Dereck Mushingairi](https://blog.devart.com/author/dereckm) I’m a technical content writer who loves turning complex topics—think SQL, connectors, and backend chaos—into content that actually makes sense (and maybe even makes you smile). I write for devs, data folks, and curious minds who want less fluff and more clarity. When I’m not wrangling words, you’ll find me dancing salsa, or hopping between cities. 
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) Best VCL UI Components for Delphi Developers By [Victoria Shyrokova](https://blog.devart.com/author/victorias) December 27, 2024 Working with Delphi UI components is essential to make your
development life easier and more efficient. The Visual Component Library (VCL) is the backbone of Delphi’s RAD environment for Windows, providing many prebuilt UI components like buttons, tables, and text boxes. It also serves as a foundational framework for C++Builder. Plus, it inspired the Lazarus Component Library (LCL), which supports cross-platform development. Now, lots of Delphi component libraries extend the VCL framework with advanced visual controls and data access modules that work across different environments. You can just drag and drop these elements to build feature-rich user interfaces and connect your apps to databases and cloud services. This speeds up your workflow and keeps things consistent in all your projects.

Table of contents

- Key features to look for in VCL UI components
- Most popular VCL UI components for Delphi developers
- Enhanced connectivity with Devart’s UniDAC
- Enable Delphi workflow with VCL components

Key features to look for in VCL UI components

There are many [Delphi libraries and tools](https://blog.devart.com/delphi-programming-software-libraries-and-components-to-use.html#visual), but not just any VCL collection will do: some are extensive but lack flexibility for complex applications, while others offer considerable customization but poor performance. Here are a few things to look for as you evaluate your choices:

- Ease of integration: Opt for libraries that are easy to install in Delphi environments and integrate smoothly with other database and visual components.
- Flexibility: Choose Delphi UI components that let you tweak their behavior and look to fit different devices and your app’s evolving needs.
- Performance: Prioritize components that are optimized for performance, meaning fast rendering, smooth user interactions, and efficient use of resources.
- Scalability: If you are developing enterprise applications or plan on growing, consider components that can handle increasing data loads and user interactions.
Server-aware providers

Want your app to manage and display data in the UI with minimal retrieval delays? You may want to look into server-aware providers. They directly connect your Delphi application to databases and cloud services, using server-specific features to optimize data access, manipulation, and synchronization. This way, you can manage data transactions better and use native tools like direct SQL execution, which speeds up queries, lowers latency, and maintains the responsiveness of your app.

Cross-platform compatibility

Not all VCL components are available in Lazarus, so make sure to check the library’s documentation for compatibility. Those that work in Delphi, C++Builder, and Lazarus save a great deal of development and maintenance time. They let you reuse the same components and code across these environments without needing separate libraries or drivers. Most of them also support deployment on multiple operating systems, ensuring a consistent user experience without the hassle of managing multiple codebases.

Server-independent SQL

If you’re building an app that needs to connect to different databases, consider server-independent SQL components. They allow you to write your SQL queries just once and use them on different database systems. This means you can easily adapt your app when client requirements change, simplifying maintenance and reducing the risk of errors. Plus, it keeps your app portable, so it’s easier to deploy in Delphi environments.

Most popular VCL UI components for Delphi developers

DevExpress VCL components

DevExpress VCL is a set of UI components for building rich Windows apps with Delphi and C++Builder. One of its most useful controls is the Data Grid, which helps you visualize complex data. You can group, filter, and sort information and then display it dynamically without additional coding. Need to implement complex scheduling features but want to save time?
The Scheduler component comes with drag-and-drop functionality, recurring events, and customizable views. There’s also a Ribbon Control to make your app more intuitive, featuring a user-friendly interface similar to what you find in Microsoft Office UIs.

TMS VCL UI Pack

TMS VCL UI Pack isn’t your average collection of text boxes and labels. With over 600 VCL components, it lets you create cohesive and feature-rich UIs for Windows-only apps. You’ll get anything from advanced grid controls to versatile planners with plenty of configuration options. For instance, you can use the TAdvStringGrid to handle and visualize large datasets, and the TPlanner for organizing schedules. Not to mention flexible editors like the TAdvEdit component, which allows users to easily input and manipulate data within your application.

EhLib

EhLib focuses on grid functionality, so it’s a very good choice for data-intensive apps like accounting and medical software. It’s a native VCL component library, compatible with Delphi, C++Builder, and Lazarus. EhLib components are all about handling large amounts of data to keep applications responsive and stable. Some of their main features include automatic column resizing, export options to formats like Excel and HTML, and advanced filtering, sorting, and grouping for easy data management.

FastReport for VCL

Unlike EhLib, FastReport VCL specializes in simplifying report generation for Delphi and C++Builder applications. It’s specifically designed to integrate well with the VCL framework, so you can easily work with VCL Styles and report templates. Its visual designer lets you create a wide variety of reports, from basic lists to complex master-detail formats. You can pull in data through access libraries like FireDAC and ADO and then export the results to over 30 formats, such as PDF, XLSX, XML, and HTML. If you want to share your reports, you can deploy them to cloud services, email them, or just print them out.
LMD VCL Components

LMD’s toolkit is far more than just a bunch of controls for Delphi UI design. You can use more than 750 native components to create engaging user experiences, handle complex tasks quickly, and enable richer interactions. For building interactive and informative displays, you get components like the TElXTree, which supports advanced tree structures and customizable item cells. There are also powerful system programming tools that make file management simple. You can even use multimedia components to integrate audio and video playback into your apps.

BergSoft NextSuite

If you’ve developed CRMs, financial tools, or other apps involving extensive data handling directly in the UI, you’ve probably heard of BergSoft’s NextSuite. It’s pretty popular because of how well it streamlines user interaction in data-driven C++Builder and Delphi apps, especially with its NextGrid and NextDBGrid components. NextGrid is a flexible grid tool that allows you to personalize how your data looks, making it easier for users to find and understand the information they need. NextDBGrid integrates with database support, so you can connect columns to dataset fields and do real-time data updates. Both have many advanced features, including multiple column types, sorting options, and efficient data manipulation capabilities.

Enhanced connectivity with Devart’s UniDAC

Setting up database connections can be a real headache, especially when you work with several database servers or try to make everything work on different platforms. To help you out, [UniDAC](https://www.devart.com/unidac/) provides a unified interface that lets you connect and switch between major databases and cloud services with just a few clicks. You can work with many popular databases, such as Oracle, PostgreSQL, and MySQL, as well as most cloud services, including QuickBooks, Netsuite, and Salesforce. Besides, it supports Delphi, C++Builder, and Lazarus on all major operating systems.
This means you can interact with your data without the hassle of reconfiguring your app each time you change your data source.

Server-aware data providers

UniDAC uses server-specific connectivity to make your data operations run as efficiently as possible. How? Unlike other server-aware data providers, it automatically optimizes most data management tasks based on your selected server. With MySQL, for instance, UniDAC directly uses native protocol support for fast data retrieval. If you’re working with PostgreSQL, it implements the database’s advanced data types to enhance performance and simplify data manipulation.

Database independence with UniDAC

UniDAC lets you create dynamic SQL, swapping out portions of your queries with your own code. These aren’t tied to any specific database server, so it’s easy to switch between databases. You might still have to make some adjustments for changes in SQL syntax or database-specific features, but you don’t need to start from scratch with everything. As a result, you can work faster and avoid getting bogged down in maintenance as your needs change.

Enable Delphi workflow with VCL components

Many standard Delphi UI components are rigid, and custom coding takes too much time. The right VCL components can help you get things done faster and build consistent and functional user interfaces without the hassle. But to save additional time on complex connectivity issues, you need a data access component that simplifies interactions across databases. For fast integration with most Delphi IDEs, [try UniDAC](https://www.devart.com/unidac/). It can help you improve your app’s performance and ensure seamless data management across multiple platforms.

Tags [delphi components](https://blog.devart.com/tag/delphi-components) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler.
Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.

Better Together: Experience Snowflake With Certified Devart Connectivity Solutions
By [Victoria Shyrokova](https://blog.devart.com/author/victorias), June 25, 2024

Devart has joined the Snowflake Partner Network at the Technology Select Tier, a milestone for everyone who works with Snowflake and is looking for extended connectivity solutions that sync perfectly with the platform. There are many connectivity solutions on the market, but only some of them provide full compatibility with Snowflake data types, support Snowflake standard SQL expressions, or are fit for ETL. Snowflake, meanwhile, is one of the most popular AI data cloud platforms, valued by application developers, data engineers, and data architects for its processing capabilities, scalability, and predictive analytics potential. The platform truly shines when paired with an ETL pipeline that handles multi-source data extraction, a series of transformations, and fast loading into a storage system, where the data can be used to train ML algorithms for advanced analytics or processed for data-driven insights. So, how can Devart tools help you integrate your data sources and applications with Snowflake while executing SQL queries, ensuring optimal performance, and handling accurate data mapping? Check out the list of our connectivity tools to explore the use cases and the benefits our connectors and components provide when used with Snowflake.
ODBC Driver for Snowflake

Even though Snowflake provides regular connectors for popular platforms like Power BI or Tableau, these native connectivity solutions cannot handle parameterized queries and lack flexibility. They also often need extra fine-tuning, and developers have to study API parameters to use them. Luckily, the [ODBC Driver for Snowflake](https://www.devart.com/odbc/snowflake/) from Devart already offers over a dozen proven integration options, simplifying the process. With it, you can use Snowflake standard SQL expressions to work with different functions and modify data with DML operators, with full support for Snowflake data types and the Snowflake API. The ODBC driver from Devart helps you set up a secure connection with IDEs, access cloud data from your app, and integrate Snowflake data with data visualization and business analysis platforms. [Try ODBC Driver for Snowflake >](https://www.devart.com/odbc/snowflake/download.html)

SSIS Data Flow Components for Snowflake

Integrating Snowflake with other databases and cloud applications using SSIS is fairly common when your ultimate goal is establishing an ETL connection. Unfortunately, using SSIS as is requires additional effort: extra changes are needed to use the Snowflake COPY command, and performance must often be fine-tuned. The [Devart SSIS Data Flow Components for Snowflake](https://www.devart.com/ssis/snowflake/), on the other hand, let you accelerate data import to Snowflake with the COPY command, help you automate integration with SSIS Data Flow tasks, and become invaluable when you need to keep Snowflake in sync with other data sources, such as Azure SQL Data Warehouse, Google BigQuery, or Amazon Redshift. With SSIS Data Flow Components for Snowflake, you get the highest performance possible with minimum effort.
You can also easily use the Devart Snowflake Source Editor to build queries without code by dragging and dropping elements, and preview the results to ensure they work as expected. [Try SSIS Data Flow Components for Snowflake >](https://www.devart.com/ssis/snowflake/download.html)

Python Connector for Snowflake

Many developers and architects use Python as one of the main languages for building complex applications that connect to a database to work with datasets. Snowflake is one of the most popular platforms used as a data source, and it becomes twice as powerful when connected to a Python application. That's why the Devart team created Python Connector for Snowflake, letting you access the Snowflake data cloud from your application. The [Devart Python Connector for Snowflake](https://www.devart.com/python/snowflake/) supports both Snowflake and Python data types. It performs connection pooling and local data caching to accelerate performance and lets you use standard SQL syntax to manage data. [Try Python Connector for Snowflake >](https://www.devart.com/python/snowflake/download.html)

The Benefits of Devart Technology Partnership with Snowflake

The Snowflake data cloud is becoming increasingly popular among businesses for its scalability, superb performance, and concurrency. It is fit for advanced tasks and, paired with the right connectivity solution, can serve as a powerful data source for your application or platform. The Devart connectivity tools, in turn, have everything you need to integrate Snowflake data with any third-party platform. They were tested with Snowflake to ensure maximum compatibility and have additional features to align with your specific needs while you work with your favorite data cloud. If you don't have experience with these connectivity tools, the Devart team provides comprehensive documentation and support to assist you in your tasks and ensure your success.

What's Next?
Devart products are made with high quality standards and efficiency in mind, and we look forward to partnering with platforms that share our values so we can provide our clients with more functionality. By extending our partnerships, we strive to improve your experience with connectivity tools and level up the overall performance of your solutions. Explore what you can do with Snowflake using Devart connectivity tools, and rest assured that we always have your back.

Tags [odbc](https://blog.devart.com/tag/odbc) [odbc driver](https://blog.devart.com/tag/odbc-driver) [odbc for snowflake](https://blog.devart.com/tag/odbc-for-snowflake) [python connectors](https://blog.devart.com/tag/python-connectors) [SSIS Data Flow](https://blog.devart.com/tag/ssis-data-flow) [Victoria Shyrokova](https://blog.devart.com/author/victorias)
BigCommerce API v3 Support in dotConnect for BigCommerce 1.9 and SSIS Data Flow Components for BigCommerce
By [dotConnect Team](https://blog.devart.com/author/dotconnect), September 13, 2019

We are pleased to announce support for BigCommerce API v3 in [dotConnect for BigCommerce 1.9](https://www.devart.com/dotconnect/bigcommerce/) and [SSIS Data Flow Components for BigCommerce](https://www.devart.com/ssis/bigcommerce.html). The new BigCommerce API was introduced to simplify product management, reduce the number of API calls, and improve API performance. Note that the schema of some BigCommerce objects (especially product-related ones) differs between API v2 and API v3. BigCommerce API v3 is not yet complete and does not cover all BigCommerce features. With an API v3 connection, the BigCommerce objects supported in API v3 are accessed via API v3, while the remaining objects are accessed via API v2. We don't limit users to API v3 only: you can create both API v2 and API v3 connections. Unlike API v2, BigCommerce API v3 supports only OAuth authentication and does not support Basic authentication. You can find more information about OAuth authentication in BigCommerce in [their documentation](https://developer.bigcommerce.com/api-docs/getting-started/authentication). Feel free to download the new versions of [SSIS Data Flow Components](https://www.devart.com/ssis/download.html) and [dotConnect for BigCommerce](https://www.devart.com/dotconnect/bigcommerce/download.html), try the new functionality, and [leave feedback](https://forums.devart.com/)!
Tags [BigCommerce](https://blog.devart.com/tag/bigcommerce) [SSIS](https://blog.devart.com/tag/ssis) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

BigCommerce API v3 Support in Excel Add-ins 2.2
By [dotConnect Team](https://blog.devart.com/author/dotconnect), December 11, 2019

Devart is glad to announce the release of [Excel Add-ins 2.2](https://www.devart.com/excel-addins/) with support for BigCommerce API v3 and improvements for Zendesk. The new version of Excel Add-in for BigCommerce lets users select the BigCommerce API version to use when working with BigCommerce data. Note that the lists of objects and their structure differ between API versions. The main differences concern the objects that store data about products, their variants, options, prices, brands, and so on. Some objects that are not supported in BigCommerce API v3 are accessed via API v2 regardless of the selected API version. New API v3 connections use OAuth authentication (with Store Credentials) and require a different set of connection parameters than API v2 connections. You can obtain the required connection parameters when creating a BigCommerce API account.
In addition, the new version of Excel Add-in for Zendesk offers improvements for working with Zendesk tickets via the Zendesk Search API and allows disabling Search API use completely. Feel free to [download the new versions of Devart Excel Add-ins](https://www.devart.com/excel-addins/universal-pack/download.html), try the new functionality, and [leave feedback](https://www.devart.com/excel-addins/universal-pack/feedback.html?pn=Devart%20Excel%20Add-ins)! Tags [excel addins](https://blog.devart.com/tag/excel-addins) [what's new excel addins](https://blog.devart.com/tag/whats-new-in-excel-addins) [dotConnect Team](https://blog.devart.com/author/dotconnect)
Unlocking the Power of Bit Manipulation Functions
By [Nataly Smith](https://blog.devart.com/author/nataly-smith)
September 4, 2023

Like everything in the modern world, RDBMSs undergo upgrades and changes, and database developers and administrators must catch these changes on the fly to keep up with the rapid pace. Fortunately, the recent releases of tools such as [dbForge Studio](https://www.devart.com/dbforge/sql/studio/) and [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) can help make the work faster and more precise than before. In this article, we delve into the intricacies of the bit manipulation functions introduced in SQL Server 2022, both in theory and in practical application. More precisely, we will explore how to use bit manipulation functions in [the new release version of Devart products](https://blog.devart.com/here-comes-a-new-update-of-dbforge-tools-for-sql-server.html).

Contents
- Understanding Bit Manipulation
- New Bit Manipulation Functions in SQL Server 2022
- Understanding BIT_COUNT Function
- Exploring GET_BIT Function
- Diving into SET_BIT Function
- Unpacking LEFT_SHIFT and RIGHT_SHIFT Functions
- Bit Manipulation Functions in Practice
- Incorporating SQL Complete in Bit Manipulation
- Conclusion

Understanding Bit Manipulation

By definition, bit manipulation involves working with individual bits within binary data representations. By directly altering these bits, intricate operations can be performed with remarkable precision. For example, bitwise AND, OR, XOR, and shift operations are common in bit manipulation. In cases where a set of flags is represented as bits, the AND operation helps determine whether a specific flag is activated. Additionally, bitwise shifts can multiply or divide integers by powers of two. Through these techniques, bit manipulation proves to be a fundamental and powerful tool in programming and data manipulation.
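The flag check and the shift-as-multiply/divide tricks just described can be sketched in a few lines of plain Python (the flag names here are hypothetical, chosen only for illustration):

```python
# One bit per flag: hypothetical permission flags.
READ, WRITE, DELETE = 0b001, 0b010, 0b100

permissions = READ | WRITE  # 0b011: READ and WRITE are activated

# Bitwise AND determines whether a specific flag is activated.
can_write = (permissions & WRITE) != 0   # True
can_delete = (permissions & DELETE) != 0  # False

# Shifts multiply or divide integers by powers of two.
doubled = 6 << 1  # 12, i.e. 6 * 2**1
halved = 6 >> 1   # 3,  i.e. 6 // 2**1
```

The same semantics carry over directly to the SQL Server 2022 functions discussed below.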
New Bit Manipulation Functions in SQL Server 2022

In SQL Server 2022, several new functions have been introduced to enhance your capabilities when working with binary data. These additions let you manipulate and navigate binary information with newfound ease and efficiency:

- BIT_COUNT
- GET_BIT
- SET_BIT
- LEFT_SHIFT
- RIGHT_SHIFT

Each of these functions serves a specific purpose, which we will examine in more detail later in this article.

Understanding BIT_COUNT Function

The BIT_COUNT function is the first of the new functions to be unfolded today. It provides a straightforward way to count the number of set bits (bits with a value of 1) within a given value. This can be particularly useful for tasks that involve evaluating the density or distribution of specific binary patterns within your data. Let us see the BIT_COUNT syntax in action. The following query returns the number of bits set to 1 in the number 10:

SELECT BIT_COUNT(10) as result;

The displayed result is 2. To understand why, let us take a look under the hood of this query: in binary, 10 is represented as 1010, which contains two bits set to 1. To introduce some variety into our examples, let us observe how the BIT_COUNT function operates on a hexadecimal number, such as 0x1305A:

SELECT BIT_COUNT(0x1305A) as result;

On executing the query, we see that the result is 7, since 0x1305A in binary is 00010011000001011010, which contains seven bits set to 1.

Exploring GET_BIT Function

The next stop on our bit manipulation journey is the GET_BIT function. It enables you to retrieve the value of a specific bit at a given position within a binary value.
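The two BIT_COUNT results above are easy to cross-check with ordinary integer arithmetic; here is a quick sketch in plain Python (not T-SQL) that reproduces the same counts:

```python
def bit_count(n: int) -> int:
    """Count the bits set to 1, mirroring SQL Server's BIT_COUNT."""
    return bin(n).count("1")

print(bit_count(10))       # 10 = 0b1010 -> 2
print(bit_count(0x1305A))  # 0x1305A = 0b00010011000001011010 -> 7
```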
This function allows you to query individual bits within binary data, which can be valuable for tasks like conditional checks or pattern recognition. For a better understanding, let us take a look at an example:

SELECT GET_BIT(10, 2) as result;

Note: In dbForge Studio for SQL Server, bit values are typically displayed as True or False for easy understanding. When you query a database table containing a BIT data type column, the result set shows True if the bit value is 1 and False if it is 0. This representation makes it clear whether a particular bit is turned on (1) or off (0) without having to interpret binary digits.

The value returned by the function is 0. This is because 10 in binary is 1010, and the bit at position 2 (positions are counted from right to left, starting at 0) is 0. However, if we replace the argument 2 with 3, the result will be different:

SELECT GET_BIT(10, 3) as result;

In this scenario, the query returns 1. With the argument 3, we are targeting position 3 from the right within the binary number 1010, and that position holds a value of 1.

Diving into SET_BIT Function

Moving on to the SET_BIT function, which empowers you to modify a specific bit at a designated position within a binary value. This functionality is useful when you need to update or manipulate specific bits without altering the entire value. The following example sets the bit at position 0 (again counted right to left, starting at 0) to 1. Setting the bit to 1 is the default unless a third argument specifies otherwise.

SELECT SET_BIT(14, 0) as result;

The value returned by the function is 15: 14 in binary is 1110, so changing the bit at position 0 from 0 to 1 gives 1111, which is 15 in decimal. To deepen our understanding of the SET_BIT function, let us see how it works with hexadecimal values.
The following example clears the bit at position 3 of the given number:

SELECT SET_BIT(0x23AEF, 3, 0) as result;

The value returned by the query is 0x023AE7. If we convert 0x23AEF to binary, we get 00100011101011101111. Changing the bit at position 3 to 0 yields 00100011101011100111, which converted back to hexadecimal is 0x023AE7.

Unpacking LEFT_SHIFT and RIGHT_SHIFT Functions

As you may recall from the beginning of this article, in addition to the functions we have already discussed, SQL Server 2022 introduces two more: LEFT_SHIFT and RIGHT_SHIFT. Let us examine them to ensure a comprehensive grasp of all the innovations at hand.

LEFT_SHIFT Function

As its name suggests, LEFT_SHIFT performs a left-shift operation: it moves the bits of a binary value to the left by a specified number of positions. This effectively multiplies the value by 2 raised to the power of the shift amount. For example, if you take the decimal value 12 (1100 in binary) and apply a left shift of 2 positions, the result is 110000, which is equivalent to the decimal value 48:

SELECT LEFT_SHIFT(12, 2) as result;

RIGHT_SHIFT Function

Similarly, RIGHT_SHIFT, also introduced in SQL Server 2022, performs a right-shift operation: it moves the bits of a binary value to the right by a specified number of positions. This effectively divides the value by 2 raised to the power of the shift amount, keeping the integer quotient. For instance, take the decimal value 42 (101010 in binary) and apply a right shift of 2 positions:

SELECT RIGHT_SHIFT(42, 2) as result;

The result is 1010, which is equivalent to the decimal value 10.
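If you want to sanity-check the GET_BIT, SET_BIT, and shift results above outside SQL Server, the same arithmetic can be reproduced in plain Python with small helper functions (these mirrors are illustrative, not part of any SQL Server API):

```python
def get_bit(n: int, pos: int) -> int:
    """Mirror of GET_BIT: the bit at position pos, counted from the right, 0-based."""
    return (n >> pos) & 1

def set_bit(n: int, pos: int, value: int = 1) -> int:
    """Mirror of SET_BIT: set the bit at pos to 1 by default, or clear it if value is 0."""
    return n | (1 << pos) if value else n & ~(1 << pos)

print(get_bit(10, 2))               # 0
print(get_bit(10, 3))               # 1
print(set_bit(14, 0))               # 15
print(hex(set_bit(0x23AEF, 3, 0)))  # 0x23ae7
print(12 << 2)                      # 48, like LEFT_SHIFT(12, 2)
print(42 >> 2)                      # 10, like RIGHT_SHIFT(42, 2)
```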
Bit Manipulation Functions in Practice

Having covered the theoretical aspects of the recently introduced bit manipulation functions in SQL Server 2022, we are ready to move into the practical dimension. Let us explore real-life scenarios that call for these functions and see how bit manipulation can effectively address various challenges.

- Flag management: Bit manipulation functions are often used to manage flags or status indicators efficiently. For example, a single integer column can represent multiple boolean flags, saving space and simplifying queries.
- Permission systems: Bit manipulation is useful for handling permissions or access control. Each bit can correspond to a specific permission, making it easy to check or modify access rights.
- Data compression: Bit manipulation is a key technique in data compression algorithms, where bits are rearranged to reduce the overall data size without losing information.
- Error detection: In networking and data transmission, error detection and correction codes use bit manipulation to add redundancy and detect errors in the received data.
- Cryptographic operations: Bitwise operations are fundamental in encryption and decryption algorithms, ensuring secure data transmission.
- Resource optimization: In embedded systems or low-level programming, bit manipulation can optimize memory usage or improve processing efficiency.

Benefits of Using Bit Manipulation Functions in SQL Server

- Efficiency: Bit manipulation functions can handle complex operations using minimal resources, improving query performance and reducing processing time.
- Compact storage: Storing multiple boolean flags as bits in a single column conserves space compared to using separate columns, which is particularly beneficial for large datasets.
- Reduced complexity: Bit manipulation simplifies complex operations, making code more concise and easier to maintain.
- Performance: When handling large datasets, bitwise operations are often faster than equivalent logical or arithmetic operations, contributing to better overall system performance.
- Improved query speed: Utilizing bit manipulation functions can lead to faster query execution, which is crucial in time-sensitive applications.
- Simplified logic: Bit manipulation can simplify complex conditional checks and calculations, making the logic more understandable and less error-prone.
- Compatibility: Bit manipulation is a common technique across various programming languages and platforms, ensuring consistent behavior regardless of the context.
- Feature-rich expressions: Bit manipulation functions enable you to create expressive expressions for a wide range of tasks, enhancing the flexibility of your SQL statements.

In essence, bit manipulation functions in SQL Server offer a powerful set of tools to optimize data storage, streamline operations, and enhance overall system performance. Their applications span from database design and optimization to solving complex real-world problems efficiently.

Incorporating SQL Complete in Bit Manipulation

When it comes to optimizing bit manipulation tasks in SQL Server, SQL Complete emerges as an invaluable tool. By seamlessly integrating with both established and newly introduced SQL Server functionality, SQL Complete streamlines the process of working with binary data, offering a range of features designed to enhance efficiency and accuracy.

Overview of SQL Complete

SQL Complete is a comprehensive SQL coding assistance tool that significantly enhances your productivity. With its intelligent code completion, formatting, and analysis capabilities, SQL Complete accelerates query writing and improves code quality. It offers real-time suggestions, syntax highlighting, and customizable code snippets, facilitating a smoother coding experience.
The tool's integration with dbForge Studio for SQL Server empowers developers to navigate databases, design queries, and manage data with greater ease and precision.

How SQL Complete Can Aid in Bit Manipulation

[Intelligent Code Completion](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html)
SQL Complete provides intelligent code completion, suggesting SQL keywords, object names, and even column names as you type. It saves time and reduces syntax errors by offering relevant suggestions based on the context.

[SQL Snippets](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html#sql_snippets)
The tool includes a collection of code snippets for common bit manipulation operations, such as creating tables, stored procedures, and queries. These snippets can be quickly inserted into your code, boosting productivity and reducing repetitive typing.

[Code Formatting](https://www.devart.com/dbforge/sql/sqlcomplete/sql-code-formatter.html)
The solution offers powerful SQL formatting options that automatically format your SQL code according to default or customized formatting rules. It ensures consistency and readability, making your code more manageable and professional.

[Code Refactoring](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html)
SQL Complete lets you easily refactor your SQL code. It provides functionality to rename objects, extract SQL code into a separate stored procedure, and perform other refactoring tasks, helping you maintain a clean and organized codebase.

[Productivity Extension](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html)
SQL Complete's productivity features include code snippets, intelligent code formatting, quick object search, SQL code highlighting, navigation of large bit manipulation queries, code refactoring, and advanced code suggestion, all of which facilitate faster and more efficient SQL development.
[Code Highlighting and Analysis](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html#highlight_identifier_occurrences) The add-in provides advanced code highlighting and analysis capabilities. It identifies syntax errors, unresolved references, and potential performance issues in your SQL code, helping you catch and fix problems early in the development process. These are just a few of the features and benefits of using SQL Complete. It enhances the development experience, improves productivity, and helps maintain high-quality SQL code. Conclusion In summary, the realm of relational database management is ever-evolving, forcing swift adaptation to maintain relevance. The introduction of dbForge Studio and SQL Complete addresses this need, empowering professionals to navigate the shifting landscape with enhanced efficiency and precision. This article has explored bit manipulation functions within SQL Server 2022, both in theory and practical application. These functions hold the potential to revolutionize data manipulation, simplifying complex tasks and unlocking new possibilities. To experience the transformative capabilities of dbForge Studio and SQL Complete firsthand, a fully functional [30-day trial for dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html) and a [14-day trial for SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) offer a gateway to embracing the future of data management, ensuring professionals remain ahead in a world of constant change. 
Tags [Bit Manipulation Functions](https://blog.devart.com/tag/bit-manipulation-functions) [BIT_COUNT](https://blog.devart.com/tag/bit_count) [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [GET_BIT](https://blog.devart.com/tag/get_bit) [LEFT_SHIFT](https://blog.devart.com/tag/left_shift) [RIGHT_SHIFT](https://blog.devart.com/tag/right_shift) [SET_BIT](https://blog.devart.com/tag/set_bit) [SQL Server](https://blog.devart.com/tag/sql-server) By [Nataly Smith](https://blog.devart.com/author/nataly-smith), dbForge Team "} {"url":
"https://blog.devart.com/build-credit-card-analytics-with-oracle-autonomous-database-using-devart-odbc-for-oracle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Build Credit Card Analytics with Oracle Autonomous Database using Devart ODBC for Oracle By [Max Remskyi](https://blog.devart.com/author/max-remskyi) January 16, 2023 [0](https://blog.devart.com/build-credit-card-analytics-with-oracle-autonomous-database-using-devart-odbc-for-oracle.html#respond) 2546 Contents 1. Abbreviations 2. Objective 3. Technology Background 3.1. Power BI Desktop 3.2. Oracle Autonomous Data Warehouse Database 3.3. Devart ODBC for Oracle Driver 4. Prerequisites 5. Architecture 6. Install Devart ODBC for Oracle 7. Driver Configuration 7.1. Wallet Configuration 7.2. Windows DSN Configuration 8. Create Data 9. Create ODBC Connection in Power BI Desktop 10. Build Credit Analytics 11. Conclusion 1. Abbreviations Oracle Cloud Infrastructure Console – OCI; Oracle Autonomous Data Warehouse – ADW 2. Objective This tutorial builds Credit Card Analytics by leveraging Power BI Desktop, the Devart ODBC for Oracle driver, and the Oracle Autonomous Data Warehouse Database. It is not an instruction on how to work with Power BI Desktop or the ADW engine; instead, we demonstrate how to work with the Devart ODBC for Oracle driver. 3. Technology Background 1. Power BI Desktop Microsoft Power BI Desktop is built for the analyst. It combines state-of-the-art interactive visualizations with industry-leading data query and modelling built in. Create and publish your reports to Power BI. Power BI Desktop helps to empower others with timely critical insights, anytime, anywhere. 2. Oracle Autonomous Data Warehouse Database Oracle Autonomous Data Warehouse (ADW) is fully managed and offers high performance.
It includes all of the performance of the Oracle Database in a fully managed environment that is tuned and optimized for Data Warehouse workloads. This means you don’t need to spend extra effort and resources (a DBA role) to manage the database and optimize the workload. Self-Driving: a user defines service levels, and the database makes them happen. Self-Securing: protection from both external attacks and malicious internal users. Self-Repairing: automated protection from all downtime. 3. Devart ODBC for Oracle Driver [ODBC Driver for Oracle](https://www.devart.com/odbc/oracle/) is a high-performance connectivity solution with enterprise-level [features](https://docs.devart.com/odbc/oracle/features.htm) for accessing Oracle databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux. Our ODBC driver fully supports standard ODBC API functions and data types and enables easy and secure access to live Oracle data from anywhere. A distinctive feature of this driver is the ease of establishing a connection – just specify the Host and Port of the Oracle server; there is no need to install and configure the Oracle Client. 4. Prerequisites Before connecting to the Oracle database with Devart ODBC for Oracle and building Credit Card Analytics, make sure to download the necessary tools and drivers: Click [here](https://www.microsoft.com/en-us/download/details.aspx?id=58494) to download Power BI Desktop Click [here](https://www.devart.com/odbc/oracle/) to download the Devart ODBC for Oracle Driver Click [here](https://objectstorage.us-ashburn-1.oraclecloud.com/n/id66dobbdxlj/b/Data/o/german_credit_data.csv) to download the Credit Card data Click [here](https://codingsight.com/provisioning-oracle-autonomous-database/) to understand how to provision ADW instances on Oracle Cloud 5.
Architecture Power BI Desktop uses the Devart ODBC for Oracle driver to connect to ADW hosted in Oracle Cloud and query data for building analytics. 6. Install Devart ODBC for Oracle Download and run the installer file Follow the instructions: Select Destination Location Select Components -> Full Installation Click Next Click Next and enter the Activation Key or choose the Trial option -> Next Click Next to start the installation After the installation is completed, we continue with configuring the driver 7. Driver Configuration In this step, we need to configure the Oracle wallet and a Windows DSN 1. Wallet Configuration Download the client credentials and store the file in a secure folder on your client computer by following the link [https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/connect-download-wallet.html#GUID-DED75E69-C303-409D-9128-5E10ADD47A35](https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/connect-download-wallet.html#GUID-DED75E69-C303-409D-9128-5E10ADD47A35) Unzip the credentials file into a secure folder on your client computer, such as C:\\instantclient-basic-19-3\\ADW_DEMO Create the TNS_ADMIN environment variable and set it to the location of the credentials file: Click Start, Run Type rundll32.exe sysdm.cpl,EditEnvironmentVariables The Environment Variables window appears -> Click New… Enter Variable name: TNS_ADMIN Enter Variable value: C:\\instantclient-basic-19-3\\ADW_DEMO (the folder where the wallet file is uncompressed) Click OK -> Restart your computer to ensure the variable is applied 2. Windows DSN Configuration Click Start, Run Type C:\\WINDOWS\\SysWOW64\\odbcad32.exe (on a 64-bit system) to open the ODBC Data Source Administrator Click on the Drivers tab and make sure Devart ODBC Driver for Oracle is in the list of drivers Select the User DSN or System DSN tab. Click Add. The Create New Data Source dialogue will appear.
Select Devart ODBC Driver for Oracle and click Finish. The driver setup dialogue will open. Enter the connection information in the appropriate fields. Check the Direct checkbox, so the Oracle Client does not need to be downloaded and installed in our environment. Open the tnsnames.ora file in the folder where the wallet file is uncompressed. The file contains the connection information, such as Host Name, Port, and Service Name. We will select one of the three services to establish the DSN configuration for Devart ODBC. Host: adb.us-ashburn-1.oraclecloud.com Port: 1522 Service Name: ghkzzgdddmcvkfb_adwdemo_medium.adb.oraclecloud.com User ID: < database user schema created in the ADW instance > Password: < enter password > The above information will differ depending on your provisioned ADW instance. By default, Autonomous Data Warehouse supports Mutual TLS (mTLS) connections, where clients connect through a TCPS (Secure TCP) database connection using standard TLS 1.2 with a trusted client certificate authority (CA) certificate. Certificate authentication with Mutual TLS uses an encrypted key stored in a wallet on both the client (where the application is running) and the server (where your database service on the Autonomous Data Warehouse is running). This means we need to provide the wallet file to Devart ODBC to ensure a secured connection Go to Security Settings -> SSL Options Wallet Path: path to the cwallet.sso file.
This file is located in the folder where the wallet file is uncompressed: C:\\instantclient-basic-19-3\\ADW_DEMO\\cwallet.sso Server Certificate DN: CN=adwc.uscom-east-1.oraclecloud.com,OU=Oracle BMCS US,O=Oracle Corporation,L=Redwood City,ST=California,C=US The Server Certificate DN is extracted from the tnsnames.ora file; it is the value of ssl_server_cert_dn: adwdemo_medium = (description= (retry_count=20)(retry_delay=3)(address=(protocol=tcps)(port=1522)(host=adb.us-ashburn-1.oraclecloud.com))(connect_data=(service_name=ghkzzgdddmcvkfb_adwdemo_medium.adb.oraclecloud.com))(security=(ssl_server_cert_dn="CN=adwc.uscom-east-1.oraclecloud.com,OU=Oracle BMCS US,O=Oracle Corporation,L=Redwood City,ST=California,C=US"))) Click on Test Connection and make sure the connection is correct. 8. Create Data Now, we will load the Credit Card data into the ADW user schema: Create a new table DW_CREDIT_DATA_F Import the data Metadata Definition:

| COLUMN | DESCRIPTION | DATA TYPE |
| --- | --- | --- |
| AGE | Age of person | NUMBER(3,0) |
| SEX | Sex/gender: male, female | VARCHAR2(20) |
| JOB | 0 – unskilled and non-resident, 1 – unskilled and resident, 2 – skilled, 3 – highly skilled | NUMBER(1,0) |
| HOUSING | Whether a person owns a house: own, rent, or free | VARCHAR2(50) |
| SAVING_ACCOUNTS | Kind of saving accounts: little, moderate, quite rich, rich | VARCHAR2(50) |
| CHECKING_ACCOUNT | | NUMBER(22,7) |
| CREDIT_AMOUNT | | NUMBER(22,7) |
| DURATION | Identify how long for cre | NUMBER(4,0) |
| PURPOSE | The purpose of usage: car, furniture/equipment, radio/TV, domestic appliances, repairs, education, business, vacation/others | VARCHAR2(150) |
| RISK | Target value: good, bad | VARCHAR2(50) |

SQL script to create the table:

```sql
CREATE TABLE DW_CREDIT_DATA_F(
    ROW_ID              NUMBER(10, 0),
    AGE                 NUMBER(3, 0),
    SEX                 VARCHAR2(20),
    JOB                 NUMBER(1, 0),
    HOUSING             VARCHAR2(50),
    SAVING_ACCOUNTS     VARCHAR2(50),
    CHECKING_ACCOUNT    VARCHAR2(50),
    CREDIT_AMOUNT       NUMBER(22, 7),
    DURATION            NUMBER(4, 0),
    PURPOSE             VARCHAR2(150),
    RISK                VARCHAR2(50)
)
```

9. Create ODBC Connection in Power BI Desktop In this step, we will use the Devart ODBC for Oracle driver to establish the connection bridge between Power BI Desktop and the Oracle Database Engine. Make sure that Power BI Desktop is installed. Open Power BI Desktop -> Click on Get Data -> The Get Data window appears -> Choose Other. On the left panel, there are many connectors that Power BI supports Choose ODBC -> Connect From the Data Source dropdown list -> Choose the DSN Devart_ODBC_ADW which was created in 7. Driver Configuration Click OK. Then enter the username and password if they are required Click Connect. A Navigator window appears, and we add the DW_CREDIT_DATA table to Power BI Desktop Click Transform Data. In this step, we rename the columns to make them easier to understand. After transforming, the data set looks as below Click on Close & Apply. Now we are ready to start building Credit Analytics in Power BI Desktop. Right-click on the dataset in the Fields panel and rename it to Credit Data 10. Build Credit Analytics Before building any reports, we need to create some measures in Power BI. Click on the dataset -> Create new measures with the following formulas: Total Bad Risk = SUMX('Credit Data', IF('Credit Data'[Risk]="Bad",1,0)) Total Good Risk = SUMX('Credit Data', IF('Credit Data'[Risk]="Good",1,0)) Right-click on Age -> create a new Group and set Bin size = 5 We will create two visualizations that show a Histogram by Age Group, using the Clustered Column chart type Power BI Desktop does not support the Box Plot chart type directly.
In case we would like to analyze the distribution of Credit Amount by Age Categories, such as Student, Young, Adult, or Senior, we can use a Python script run inside Power BI Desktop to create a Box Plot chart. Select the Table chart type and then drag and drop the Age, Credit Amount, and Risk fields into the chart. On the Visualization panel, click on Python visual to open the Python script editor panel, where we build the script to create the chart:

```python
# The following code to create a dataframe and remove duplicated rows
# is always executed and acts as a preamble for your script:
# dataset = dataset.drop_duplicates()

# Paste or type your script code here:
import pandas as pd
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt

# List of Age Category
interval = (18, 25, 35, 60, 120)
cats = ['Student', 'Young', 'Adult', 'Senior']
dataset["Age_cat"] = pd.cut(dataset.Age, interval, labels=cats)

# Create Box-plot visualization
fig, ax = plt.subplots(figsize=(20, 10))
box = sns.boxplot(x=dataset["Age_cat"], y=dataset["Credit Amount"], hue=dataset["Risk"], data=dataset)
box.set_xticklabels(box.get_xticklabels(), rotation=45)
fig.subplots_adjust(bottom=0.5)
plt.tight_layout()
plt.show()
```

The Age Distribution canvas will look as below. We can also download Power BI custom visuals from [https://appsource.microsoft.com/en-us/marketplace/apps?page=1&product=power-bi-visuals](https://appsource.microsoft.com/en-us/marketplace/apps?page=1&product=power-bi-visuals) and import them into Power BI Desktop. 11. Conclusion Devart ODBC Driver for Oracle is a high-performance connectivity solution with enterprise-level features for accessing Oracle databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux. It also supports connectivity to both Oracle on-premises databases and Oracle Cloud databases.
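As a closing aside, the two DAX measures defined in section 10 can be cross-checked outside Power BI with pandas. This is a hedged sketch: the small DataFrame below is hypothetical sample data standing in for the renamed Credit Data dataset, and the column names are assumed to match the renamed columns:

```python
import pandas as pd

# Hypothetical sample mirroring the renamed 'Credit Data' dataset;
# in the article the real data comes from the DW_CREDIT_DATA_F table in ADW.
credit_data = pd.DataFrame({
    "Age": [23, 45, 31, 67],
    "Credit Amount": [1200.0, 5400.0, 980.0, 3100.0],
    "Risk": ["Good", "Bad", "Good", "Bad"],
})

# Equivalent of: Total Bad Risk = SUMX('Credit Data', IF('Credit Data'[Risk]="Bad",1,0))
total_bad_risk = int((credit_data["Risk"] == "Bad").sum())
# Equivalent of: Total Good Risk = SUMX('Credit Data', IF('Credit Data'[Risk]="Good",1,0))
total_good_risk = int((credit_data["Risk"] == "Good").sum())

print(total_bad_risk, total_good_risk)  # 2 2
```

If the pandas counts and the Power BI card visuals disagree, the usual culprit is a case-sensitivity mismatch in the Risk values ("Bad" vs. "bad").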
Tags [odbc](https://blog.devart.com/tag/odbc) [Oracle](https://blog.devart.com/tag/oracle) [Powerbi](https://blog.devart.com/tag/powerbi) [sql](https://blog.devart.com/tag/sql) By [Max Remskyi](https://blog.devart.com/author/max-remskyi), DAC Team "} {"url":
"https://blog.devart.com/build-credit-card-analytics-with-python-using-devart-odbc-for-oracle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Build Credit Card Analytics with Python using Devart ODBC for Oracle By [Max Remskyi](https://blog.devart.com/author/max-remskyi) April 28, 2023 [0](https://blog.devart.com/build-credit-card-analytics-with-python-using-devart-odbc-for-oracle.html#respond) 2123 Contents 1. Abbreviations 2. Objective 3. Technology Background 3.1. Python with the Anaconda platform 3.2. Oracle Autonomous Data Warehouse Database 3.3. Devart ODBC for Oracle Driver 4. Prerequisites 5. Architecture 6. Install Devart ODBC for Oracle 7. Driver Configuration 7.1. Wallet Configuration 7.2. Windows DSN Configuration 8. Create Data 9. Create ODBC Connection with the pyodbc library 10. Build Credit Analytics 11. Conclusion 1. Abbreviations Oracle Cloud Infrastructure Console – OCI; Oracle Autonomous Data Warehouse – ADW 2. Objective This tutorial builds Credit Card Analytics by leveraging Python, the Devart ODBC for Oracle driver, and the Oracle Autonomous Data Warehouse Database. You will execute your Python code with Anaconda, which is a free and open-source platform. This article is not intended to teach you Python for building Credit Card Analytics or Prediction Analytics. 3. Technology Background 1. Python with the Anaconda platform Anaconda is a distribution of the Python and R programming languages for scientific computing; it simplifies package management and deployment. It allows you to write and execute Python code easily and quickly, and it also comes with a large number of libraries/packages that you can use in your projects. 2. Oracle Autonomous Data Warehouse Database Oracle Autonomous Data Warehouse (ADW) is fully managed and offers high performance.
It includes all of the performance of Oracle Database in a fully managed environment that is tuned and optimized for Data Warehouse workloads. This means you don’t need to spend extra effort and resources (a DBA role) to manage the database and optimize the workload. Self-Driving: a user defines service levels, and the database makes them happen. Self-Securing: protection from both external attacks and malicious internal users. Self-Repairing: automated protection from all downtime. 3. Devart ODBC for Oracle Driver [ODBC Driver for Oracle](https://www.devart.com/odbc/oracle/) is a high-performance connectivity solution with enterprise-level [features](https://docs.devart.com/odbc/oracle/features.htm) for accessing Oracle databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux. Our ODBC driver fully supports standard ODBC API functions and data types and enables easy and secure access to live Oracle data from anywhere. A distinctive feature of this driver is the ease of establishing a connection – just specify the Host and Port of the Oracle server; there is no need to install and configure the Oracle Client. 4. Prerequisites Before connecting to the Oracle database with Devart ODBC for Oracle and building Credit Card Analytics, make sure to download the necessary tools and drivers: Click [here](https://www.anaconda.com/products/distribution) to download the Anaconda application for Windows OS Click [here](https://www.devart.com/odbc/oracle/) to download the Devart ODBC for Oracle Driver Click [here](https://objectstorage.us-ashburn-1.oraclecloud.com/n/id66dobbdxlj/b/Data/o/german_credit_data.csv) to download the Credit Card data Click [here](https://codingsight.com/provisioning-oracle-autonomous-database/) to understand how to provision ADW instances on Oracle Cloud 5.
Architecture Anaconda Python uses the Devart ODBC for Oracle driver to connect to ADW hosted in Oracle Cloud and query data for building analytics. 6. Install Devart ODBC for Oracle Download and run the installer file Follow the instructions: Select Destination Location Select Components -> Full Installation Click Next Click Next and enter the Activation Key or choose the Trial option -> Next Click Next to start the installation After the installation is completed, we continue with configuring the driver 7. Driver Configuration In this step, we need to configure the Oracle wallet and a Windows DSN 1. Wallet Configuration Download the client credentials and store the file in a secure folder on your client computer by following the link [https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/connect-download-wallet.html#GUID-DED75E69-C303-409D-9128-5E10ADD47A35](https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/connect-download-wallet.html#GUID-DED75E69-C303-409D-9128-5E10ADD47A35) Unzip the credentials file into a secure folder on your client computer, such as C:\\instantclient-basic-19-3\\ADW_DEMO Create the TNS_ADMIN environment variable and set it to the location of the credentials file: Click Start, Run Type rundll32.exe sysdm.cpl,EditEnvironmentVariables The Environment Variables window appears -> Click New… Enter Variable name: TNS_ADMIN Enter Variable value: C:\\instantclient-basic-19-3\\ADW_DEMO (the folder where the wallet file is uncompressed) Click OK -> Restart your computer to ensure the variable is applied 2. Windows DSN Configuration Click Start, Run Type C:\\WINDOWS\\SysWOW64\\odbcad32.exe (on a 64-bit system) to open the ODBC Data Source Administrator Click on the Drivers tab and make sure Devart ODBC Driver for Oracle is in the list of drivers Select the User DSN or System DSN tab. Click Add. The Create New Data Source dialogue will appear.
Select Devart ODBC Driver for Oracle and click Finish. The driver setup dialogue will open. Enter the connection information in the appropriate fields. Check the Direct checkbox, so the Oracle Client does not need to be downloaded and installed in our environment. Open the tnsnames.ora file in the folder where the wallet file is uncompressed. The file contains the connection information, such as Host Name, Port, and Service Name. We will select one of the three services to establish the DSN configuration for Devart ODBC. Host: adb.us-ashburn-1.oraclecloud.com Port: 1522 Service Name: ghkzzgdddmcvkfb_adwdemo_medium.adb.oraclecloud.com User ID: < database user schema created in the ADW instance > Password: < enter password > The above information will differ depending on your provisioned ADW instance. By default, Autonomous Data Warehouse supports Mutual TLS (mTLS) connections, where clients connect through a TCPS (Secure TCP) database connection using standard TLS 1.2 with a trusted client certificate authority (CA) certificate. Certificate authentication with Mutual TLS uses an encrypted key stored in a wallet on both the client (where the application is running) and the server (where your database service on the Autonomous Data Warehouse is running). This means we need to provide the wallet file to Devart ODBC to ensure a secured connection Go to Security Settings -> SSL Options Wallet Path: path to the cwallet.sso file.
This file is located in the folder where the wallet file is uncompressed: C:\\instantclient-basic-19-3\\ADW_DEMO\\cwallet.sso Server Certificate DN: CN=adwc.uscom-east-1.oraclecloud.com,OU=Oracle BMCS US,O=Oracle Corporation,L=Redwood City,ST=California,C=US The Server Certificate DN is extracted from the tnsnames.ora file; it is the value of ssl_server_cert_dn: adwdemo_medium = (description= (retry_count=20)(retry_delay=3)(address=(protocol=tcps)(port=1522)(host=adb.us-ashburn-1.oraclecloud.com))(connect_data=(service_name=ghkzzgdddmcvkfb_adwdemo_medium.adb.oraclecloud.com))(security=(ssl_server_cert_dn="CN=adwc.uscom-east-1.oraclecloud.com,OU=Oracle BMCS US,O=Oracle Corporation,L=Redwood City,ST=California,C=US"))) Click on Test Connection and make sure the connection is correct. 8. Create Data Now, we will load the Credit Card data into the ADW user schema: Create a new table DW_CREDIT_DATA_F Import the data Metadata Definition:

| COLUMN | DESCRIPTION | DATA TYPE |
| --- | --- | --- |
| AGE | Age of person | NUMBER(3,0) |
| SEX | Sex/gender: male, female | VARCHAR2(20) |
| JOB | 0 – unskilled and non-resident, 1 – unskilled and resident, 2 – skilled, 3 – highly skilled | NUMBER(1,0) |
| HOUSING | Whether a person owns a house: own, rent, or free | VARCHAR2(50) |
| SAVING_ACCOUNTS | Kind of saving accounts: little, moderate, quite rich, rich | VARCHAR2(50) |
| CHECKING_ACCOUNT | | NUMBER(22,7) |
| CREDIT_AMOUNT | | NUMBER(22,7) |
| DURATION | Identify how long for cre | NUMBER(4,0) |
| PURPOSE | The purpose of usage: car, furniture/equipment, radio/TV, domestic appliances, repairs, education, business, vacation/others | VARCHAR2(150) |
| RISK | Target value: good, bad | VARCHAR2(50) |

SQL script to create the table:

```sql
CREATE TABLE DW_CREDIT_DATA_F(
    ROW_ID              NUMBER(10, 0),
    AGE                 NUMBER(3, 0),
    SEX                 VARCHAR2(20),
    JOB                 NUMBER(1, 0),
    HOUSING             VARCHAR2(50),
    SAVING_ACCOUNTS     VARCHAR2(50),
    CHECKING_ACCOUNT    VARCHAR2(50),
    CREDIT_AMOUNT       NUMBER(22, 7),
    DURATION            NUMBER(4, 0),
    PURPOSE             VARCHAR2(150),
    RISK                VARCHAR2(50)
)
```

9. Create ODBC Connection with the pyodbc library In this step, we will use the Devart ODBC for Oracle driver to establish the connection bridge between Python and the Oracle Database Engine using the pyodbc library. Open Anaconda Navigator -> Click on Jupyter Notebook to create a new session You are redirected to a local web page -> Click on New -> Python 3 To connect to the Autonomous Data Warehouse (ADW), you need the pyodbc library. The connection string parameters for pyodbc are: DRIVER=Devart ODBC Driver for Oracle; Direct=True; Host=adb.us-ashburn-1.oraclecloud.com; Port=1522; Service Name=ghkzzgdddmcvkfb_adwdemo_low.adb.oraclecloud.com; User ID=; Password=; Use SSL=True; Wallet Path=, e.g. C:\\instantclient-basic-19-3\\ADW_DEMO\\cwallet.sso Note: You must unzip the Wallet file after downloading it from Oracle ADW. Below is a full connection string:

```python
import pyodbc
cnxn = pyodbc.connect("DRIVER=Devart ODBC Driver for Oracle;Direct=True;Host=adb.us-ashburn-1.oraclecloud.com;Port=1522;Service Name=ghkzzgdddmcvkfb_adwdemo_low.adb.oraclecloud.com;User ID=DEMO;Password=xxxxxxxxx;Use SSL=True;Wallet Path=C:\\instantclient-basic-19-3\\ADW_DEMO\\cwallet.sso")
```

Run it to verify the connection. With the Devart ODBC Driver, the pyodbc library supports the functions cursor.execute() and cursor.fetchone() to retrieve data from a database table:

```python
cursor = cnxn.cursor()
cursor.execute("SELECT ROW_ID,AGE,SEX,JOB,HOUSING,SAVING_ACCOUNTS,CHECKING_ACCOUNT,CREDIT_AMOUNT,DURATION,PURPOSE,RISK FROM DW_CREDIT_DATA_F")
row = cursor.fetchone()
while row:
    print(row)
    row = cursor.fetchone()
```

The connection is established successfully. You are ready to start building Credit Card Analytics with Python and the data hosted in ADW. 10.
Build Credit Analytics For building Credit Analytics, you need to load some libraries:

```python
# Load the libraries
import pandas as pd                 # To work with the dataset
import numpy as np                  # Math library
import seaborn as sns               # Graph library that uses matplotlib in the background
import matplotlib.pyplot as plt     # To plot some parameters in seaborn
# Libraries for working with plotly
import plotly.offline as py
py.init_notebook_mode(connected=True)  # Allows us to work with the offline plotly version
import plotly.graph_objs as go      # It's like "plt" of matplotlib
import plotly.tools as tls          # Useful for getting some tools of plotly
import warnings                     # This library will be used to ignore some warnings
from collections import Counter     # To count some features
```

If you encounter the issue "no module named 'plotly'" in Jupyter Notebook, close the current Jupyter Notebook session and go back to Anaconda Navigator. Click on CMD.exe Prompt; you are redirected to the Windows Command Prompt; then type the command pip install plotly Then open a new Jupyter Notebook, and the issue will be resolved. Load the table into a data frame by using the read_sql function of the pandas library. This function requires two parameters: a SQL SELECT statement and a connection:

```python
sql = 'SELECT ROW_ID,AGE,SEX,JOB,HOUSING,SAVING_ACCOUNTS,CHECKING_ACCOUNT,CREDIT_AMOUNT,DURATION,PURPOSE,RISK FROM DW_CREDIT_DATA_F'
df_credit = pd.read_sql(sql, cnxn)
print(df_credit)
```

Now, you might want to know the Age Distribution within Credit Risk (Good, Bad).
The following code creates Histogram charts based on Age Category and Credit Card Risk:

```python
# Analysis of Age Distribution within Risk Category
df_good = df_credit.loc[df_credit["RISK"] == 'good']['AGE'].values.tolist()
df_bad = df_credit.loc[df_credit["RISK"] == 'bad']['AGE'].values.tolist()
df_age = df_credit['AGE'].values.tolist()

# First plot
trace0 = go.Histogram(
    x=df_good,
    histnorm='probability',
    name="Good Credit"
)
# Second plot
trace1 = go.Histogram(
    x=df_bad,
    histnorm='probability',
    name="Bad Credit"
)
# Third plot
trace2 = go.Histogram(
    x=df_age,
    histnorm='probability',
    name="Overall Age"
)

# Creating the grid
fig = tls.make_subplots(rows=2, cols=2, specs=[[{}, {}], [{'colspan': 2}, None]],
                        subplot_titles=('Good', 'Bad', 'General Distribution'))

# Setting the figs
fig.append_trace(trace0, 1, 1)
fig.append_trace(trace1, 1, 2)
fig.append_trace(trace2, 2, 1)

fig['layout'].update(showlegend=True, title='Age Distribution', bargap=0.05)
py.iplot(fig, filename='custom-sized-subplot-with-subplot-titles')
```

The following code creates Box Plot charts based on Age Category and Credit Amount:

```python
# List of Age Category
interval = (18, 25, 35, 60, 120)
cats = ['Student', 'Young', 'Adult', 'Senior']
df_credit["Age_cat"] = pd.cut(df_credit.AGE, interval, labels=cats)

# Create Box-plot visualization
fig, ax = plt.subplots(figsize=(20, 10))
box = sns.boxplot(x=df_credit["Age_cat"], y=df_credit["CREDIT_AMOUNT"], hue=df_credit["RISK"], data=df_credit)
box.set_xticklabels(box.get_xticklabels(), rotation=45)
fig.subplots_adjust(bottom=0.5)
plt.tight_layout()
plt.show()
```

11. Conclusion If you are a Data Analyst or Data Scientist familiar with the Python language,
Devart ODBC Driver for Oracle and Python are a good combination for your program, because the driver is easy to install and configure; you don't need to spend extra effort on establishing the connection between Python and Oracle Autonomous Database / Oracle Database. Just install the Devart ODBC Driver for Oracle and start writing your Python code.
Build Prediction Sales Analytics with Power BI Desktop using Devart ODBC for Oracle

By [DAC Team](https://blog.devart.com/author/dac), November 30, 2022

This tutorial builds a sales analytics dashboard by leveraging Power BI Desktop, the Devart ODBC Driver for Oracle, and Oracle Database. It is not intended as an instruction on how to work with Power BI Desktop or the Oracle Database engine; instead, we will demonstrate how to work with the Devart ODBC Driver for Oracle.

Technology Background

Power BI Desktop

Microsoft Power BI Desktop is built for the analyst. It combines state-of-the-art interactive visualizations with industry-leading data query and modeling built in. Create and publish your reports to Power BI. Power BI Desktop helps empower others with timely critical insights, anytime, anywhere.

Oracle Database

Oracle database services and products offer customers cost-optimized and high-performance versions of Oracle Database, the world's leading converged, multi-model database management system, as well as in-memory, NoSQL, and MySQL databases.

Devart ODBC Driver for Oracle

ODBC Driver for Oracle is a high-performance connectivity solution with enterprise-level features for accessing Oracle databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux.
Our ODBC driver fully supports standard ODBC API functions and data types and enables easy and secure access to live Oracle data from anywhere.

Prerequisites

Before connecting to the Oracle database with the Devart ODBC Driver for Oracle and building the prediction sales analytics, make sure to download all the necessary tools and drivers:

- Click [here](https://www.microsoft.com/en-us/download/details.aspx?id=58494) to download Power BI Desktop
- Click [here](https://www.devart.com/odbc/oracle/) to download the Devart ODBC Driver for Oracle
- Click [here](https://objectstorage.us-ashburn-1.oraclecloud.com/n/id66dobbdxlj/b/Data/o/sales_data_sample.csv) to download the sales data

Make sure to install the Oracle Database application as well.

Architecture

Power BI Desktop uses the Devart ODBC Driver for Oracle to connect to Oracle Database and query data for building analytics.

Install Devart ODBC for Oracle

1. Download and run the installer file.
2. Follow the instructions: select the Destination Location, then select Components -> Full Installation, and click Next.
3. Click Next and enter the Activation Key or choose the Trial option -> Next.
4. Click Next to start the installation.

After the installation is completed, we continue with configuring the driver.

Driver Configuration

1. Click Start > Run and type C:\WINDOWS\SysWOW64\odbcad32.exe (on a 64-bit system) to open the ODBC Data Source Administrator.
2. Click the Drivers tab and make sure Devart ODBC Driver for Oracle is in the list.
3. Select the User DSN or System DSN tab and click Add. The Create New Data Source dialog will appear.
4. Select Devart ODBC Driver for Oracle and click Finish. The driver setup dialog will open.
5. Enter the connection information in the appropriate fields. Make sure the Oracle database is up and running.
6. Click Test Connection and verify that the connection is correct.

Create Data

Now, we will load the sales data into the DW_SALES user schema.
Create a new table DW_SALES_F and import the data.

Metadata Definition

| Column Name | Description | Data Type |
| --- | --- | --- |
| ORDERNUMBER | Order Number ID | NUMBER(10,0) |
| QUANTITYORDERED | Quantity Ordered | NUMBER(10,0) |
| PRICEEACH | Price Each | NUMBER(10,2) |
| ORDERLINENUMBER | Order Line Number ID | NUMBER(10,0) |
| SALES | Sales | NUMBER(10,2) |
| ORDERDATE | Order Date | DATE |
| STATUS | Status | VARCHAR(50) |
| QTR_ID | Quarter ID of Order Date | NUMBER(2,0) |
| MONTH_ID | Month ID of Order Date | NUMBER(2,0) |
| YEAR_ID | Year of Order Date | NUMBER(4,0) |
| PRODUCTLINE | Product Line | VARCHAR(100) |
| MSRP | MSRP | VARCHAR(100) |
| PRODUCTCODE | Product Code | VARCHAR(100) |
| CUSTOMERNAME | Customer Name | VARCHAR(100) |
| PHONE | Phone | VARCHAR(100) |
| ADDRESSLINE1 | Address Line 1 | VARCHAR(100) |
| ADDRESSLINE2 | Address Line 2 | VARCHAR(100) |
| CITY | City | VARCHAR(100) |
| STATE | State | VARCHAR(100) |
| POSTALCODE | Postal Code | VARCHAR(100) |
| COUNTRY | Country | VARCHAR(100) |
| TERRITORY | Territory | VARCHAR(100) |
| CONTACTLASTNAME | Contact Last Name | VARCHAR(100) |
| CONTACTFIRSTNAME | Contact First Name | VARCHAR(100) |
| DEALSIZE | Deal Size | VARCHAR(100) |

```sql
CREATE TABLE DW_SALES_F(
    ORDERNUMBER NUMBER(10, 0),
    QUANTITYORDERED NUMBER(10, 0),
    PRICEEACH NUMBER(10, 2),
    ORDERLINENUMBER NUMBER(10, 0),
    SALES NUMBER(10, 2),
    ORDERDATE DATE,
    STATUS VARCHAR(50),
    QTR_ID NUMBER(2, 0),
    MONTH_ID NUMBER(2, 0),
    YEAR_ID NUMBER(4, 0),
    PRODUCTLINE VARCHAR(100),
    MSRP VARCHAR(100),
    PRODUCTCODE VARCHAR(100),
    CUSTOMERNAME VARCHAR(100),
    PHONE VARCHAR(100),
    ADDRESSLINE1 VARCHAR(100),
    ADDRESSLINE2 VARCHAR(100),
    CITY VARCHAR(100),
    STATE VARCHAR(100),
    POSTALCODE VARCHAR(100),
    COUNTRY VARCHAR(100),
    TERRITORY VARCHAR(100),
    CONTACTLASTNAME VARCHAR(100),
    CONTACTFIRSTNAME VARCHAR(100),
    DEALSIZE VARCHAR(100)
);
```

Create ODBC Connection in Power BI Desktop

In this step, we will use the Devart ODBC Driver for Oracle to establish the connection bridge between Power BI Desktop and the Oracle Database engine. Ensure that Power BI Desktop is completely installed.
Open Power BI Desktop -> click Get Data -> when the Get Data window appears, choose Other. In the list of Power BI-supported connectors, choose ODBC -> Connect. From the data source dropdown list, choose the DSN Devart_ODBC_Oracle, which was created in the Driver Configuration section of this article, and click OK. Then enter the username and password if they are required and click Connect. A Navigator window appears, and we add the Sales table to Power BI Desktop. Click Transform Data. In this step, we rename the columns for better readability: right-click on a column header -> Rename. Click Close & Apply. Now, we can start building the prediction sales analytics in Power BI Desktop. Right-click on the dataset in the Fields panel and rename it to Sales.

Build Sales Analytics

Before building any reports, we need to create some measures in Power BI. Click on the dataset and create new measures with the following formulas:

Total Sales = SUM(Sales[Sales])
Total Quantity Order = SUM(Sales[Quantity Ordered])
Total Order = DISTINCTCOUNT(Sales[Order Number])

Let's build the first visualization based on the sales data in the Oracle database.

1. Create a new canvas: Sales By Country.
2. Create a map chart by choosing Map in the Visualizations panel. Drag and drop Country from Sales to Location and Legend, and drag and drop Total Sales from the Sales dataset to Tooltips.
3. Create a slicer to filter Territory: in the Visualizations panel, select the Slicer chart type, then drag and drop Territory to Field. Continue by creating a new slicer to filter Order Date.
4. Right-click on Country -> Create Hierarchy to create a Country Hierarchy with the structure Country -> State -> City: right-click on State -> Add to Hierarchy -> Country Hierarchy, then right-click on City -> Add to Hierarchy -> Country Hierarchy.

Now, we create a new Pivot visualization.
Then drag and drop Country Hierarchy to Rows, and Total Order and Total Sales to Values. Continue by creating a new canvas, Sales By Product. In this canvas, we analyze sales by product by creating a line chart for forecasting total sales by year and a 100% stacked bar chart to analyze the percentage share of each product line by territory. When we create the line chart of Total Sales by Order Date, we can add a forecast line to project the sales amount for the next three years. In the next step, we create a new canvas to analyze the overview insights.

Conclusion

Devart ODBC Driver for Oracle is a high-performance connectivity solution with enterprise-level features for accessing Oracle databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux. It suits both small- and large-scale businesses and comes to the rescue even if you have literally loads of data to process. In this article, you discovered how easy it is to connect to the database, fetch all the data you require, and neatly visualize it for a general audience, with graphs, bells, and whistles. Proceed to seamless connection and high performance right now!
Build Reports With Devart Python Connector & Snowflake

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), July 5, 2024

Contents: Objective, Technology background, Prerequisites, Architecture, Create a database in Snowflake, Get Snowflake credentials, Install Devart Python Connector for Snowflake, Connect to a Snowflake database, Conclusion.

Objective

In this article, we will use the Devart Python Connector for Snowflake to demonstrate how to connect to a Snowflake database from within the Anaconda tool.

Technology background

Python Connector for Snowflake

[Python Connector for Snowflake](https://www.devart.com/python/snowflake/) is a reliable connectivity solution for accessing the Snowflake data cloud from Python applications in order to perform create, read, update, and delete operations on stored data. The solution fully implements the Python DB API 2.0 specification and is distributed as a wheel package for 32-bit and 64-bit Windows.

High Performance

The connector supports connection pooling and local data caching to increase access speed. It also lets you submit multiple update statements to the data warehouse for batch processing to improve execution time.

Platform Support

The Python connector is available for Windows and Windows Server (32-bit and 64-bit).
Unicode-Compliant Connector

The Unicode-compliant connector lets you retrieve and update multilingual data, regardless of its character encoding (Chinese, Cyrillic, Hebrew, and more), in any language environment.

Data Types Support

The connector supports all Snowflake and Python data types and offers additional options to control data type mapping between them.

About the Snowflake platform

Snowflake is a cloud-based platform that empowers data-driven organizations to mobilize data, apps, and AI across various industries and use cases. One of the key points of Snowflake is the Data Cloud Platform: Snowflake offers a single platform that eliminates data silos and simplifies architectures.

Prerequisites

Before connecting to the Snowflake database with the Devart Python Connector for Snowflake and building analytics, make sure you have the necessary tools and drivers:

- Anaconda or any other IDE for Python development
- Devart Python Connector for Snowflake (click [here](https://www.devart.com/python/snowflake/) to download it)
- A Snowflake account and a Snowflake sample database

Architecture

Create a database in Snowflake

During this step, we will create a database in Snowflake and use it to create a connection. Click + Database and enter a name for your new database. In our case, it will be called DEMO. After the database is created, proceed to it and click + Schema to create a new schema in this database. In our case, we'll call it HR. Next, let's create a table and fill it with data. In our case, the table will be called EMPLOYEE_CHURN. We'll import a list of active and non-active employees into this table and use it to analyze employee churn. To import a file with data into a table, click Create > Table > From File. Keep all settings as default, enter the table name, and click Next. Next, check whether the data has been mapped correctly and click Load. That's it, the data is in your Snowflake table.
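When the From File import fails or maps columns unexpectedly, it can help to sanity-check the CSV's header and row count locally before retrying. A sketch with Python's stdlib csv module (the column names here are hypothetical; the actual churn file's schema isn't shown in this article):

```python
import csv
import io

# Hypothetical employee-churn export; real column names may differ.
sample = io.StringIO(
    "EMPLOYEE_ID,DEPARTMENT,IS_ACTIVE\n"
    "101,Sales,Y\n"
    "102,HR,N\n"
)

required = {"EMPLOYEE_ID", "DEPARTMENT", "IS_ACTIVE"}
reader = csv.DictReader(sample)
missing = required - set(reader.fieldnames)  # columns the file is missing
rows = list(reader)
print(f"columns ok: {not missing}, rows: {len(rows)}")
```

For a real file, replace the io.StringIO stand-in with `open("employee_churn.csv", newline="")`.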
Get Snowflake credentials

This information will help you create a connection to Snowflake in the next step. After registering your Snowflake account, you will receive an email confirming that your account is active. From the content of the email, you need two pieces of information: the Account Identifier (the domain URL, or hostname, of the Snowflake instance) and the user name used to log in to the Snowflake instance. The Account Identifier is extracted from the dedicated login URL. In our case, it's iiathjj-dl47211.

Install Devart Python Connector for Snowflake

[Download the installation file from the Devart website](https://www.devart.com/python/snowflake/download.html) and extract its contents. To install the Python package, your environment needs Python and the pip package manager. Download Python 3.12 from [here](https://www.python.org/downloads/) and run the installation file. After the installation, open the Command Prompt to check the version of pip. If pip is not available, run the following commands to install it:

```
cd C:\Users\dtdinh\AppData\Local\Programs\Python\Python312
python get-pip.py
```

In the Command Prompt, browse to the folder where you extracted the installation package and run the following command:

```
pip install devart_snowflake_connector-1.0.1-cp312-cp312-win_amd64.whl
```

In our case, we have installed Anaconda in our environment, so we can install the package and leverage Anaconda to develop programs with Snowflake. Open Anaconda and launch the CMD.exe prompt. Run the following command to check whether pip is installed (if it isn't, you will need to install it):

```
py -m pip --version
```

Run the following command to create a new environment in Anaconda, where you will isolate the environment and its packages:

```
conda create -n Devart python=3.12 anaconda
```

Run the following command to activate the newly created Devart environment:
```
conda activate Devart
```

Browse to the folder of the Python Connector and install it in the Devart environment:

```
cd C:\Working\11-Tech-Published-Articles\26-Devart-Python-Connector-Snowflake\DevartPythonSnowflake\whl
pip install devart_snowflake_connector-1.0.1-cp312-cp312-win_amd64.whl
```

Connect to a Snowflake database

After you install the Devart Python Connector, you will be able to connect to your Snowflake database. You can use any Python IDE, such as Anaconda, to develop Python-based programs that need to connect to the Snowflake database. Open Anaconda and launch Jupyter Notebook in the Devart environment. Create a notebook and write Python code to connect to your Snowflake database and query data from your table. In our case, it all looks as follows.

Step 1: Import the Devart Python Connector package.

```python
# Step 1: Import the Python Connector library
import devart.snowflake
```

Step 2: Establish a connection to the Snowflake database.

```python
# Step 2: Make the connection
sf_connection = devart.snowflake.connect(
    Domain="Account Identifier",
    UserId="User to access the Snowflake database",
    Password="Password of UserId",
    Database="Snowflake Database",
    Schema="Schema in Database"
)
```

Step 3: Create a cursor object to fetch data from the Snowflake database.

```python
# Step 3: Create the cursor
sf_cursor = sf_connection.cursor()
```

Step 4: Query 10 employees from the EMPLOYEE_CHURN table.

```python
# Step 4: Query 10 employees
sf_cursor.execute("SELECT * FROM EMPLOYEE_CHURN LIMIT 10")
```

Step 5: Print the 10 employee records.

```python
# Step 5: Print the 10 employee records
for row in sf_cursor.fetchall():
    print(row)
```

Finally, we execute the program and examine the result in order to analyze the employee churn.
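Since the connector implements the Python DB API 2.0 specification, the connect/cursor/fetch pattern above is the same one any compliant driver exposes. A runnable illustration using the stdlib sqlite3 module as a stand-in for the Snowflake connection:

```python
import sqlite3

# sqlite3 stands in for devart.snowflake here; the cursor pattern is identical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE EMPLOYEE_CHURN (id INTEGER, name TEXT, active TEXT)")
cur.executemany(
    "INSERT INTO EMPLOYEE_CHURN VALUES (?, ?, ?)",
    [(1, "Alice", "Y"), (2, "Bob", "N"), (3, "Carol", "Y")],
)
cur.execute("SELECT * FROM EMPLOYEE_CHURN LIMIT 10")
rows = cur.fetchall()
for row in rows:
    print(row)
conn.close()
```

Swapping sqlite3.connect for devart.snowflake.connect (with the connection parameters shown above) leaves the rest of the code unchanged, which is the practical benefit of the DB API 2.0 contract.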
Conclusion

[Devart Python Connector for Snowflake](https://www.devart.com/python/snowflake/) is a reliable connectivity solution for accessing the Snowflake data cloud from Python applications to perform create, read, update, and delete operations on stored data. The solution fully implements the Python DB API 2.0 specification and is distributed as a wheel package for 32-bit and 64-bit Windows.
Simplify Your Cross-Database Data Import with dbForge Edge

By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters), August 9, 2023

Microsoft Access is a rather popular relational database system that has quite a few advantages. It is very easy to install and manage, whether you are a beginner or a seasoned database user. It has a set of flexible tools for data migration. It has a large community and lots of helpful resources all over the Internet, so you won't have any trouble getting in-depth insights into it. Finally, Access is available as part of the Professional and higher editions of the Microsoft 365 suite, which makes it an affordable addition to Microsoft's flagship office tools such as Word, Excel, and Outlook. But of course, there are also drawbacks and limitations. The features and capacities of Access can't even hope to get close to its more advanced counterpart, Microsoft SQL Server. And it's no wonder that many businesses that once started with Access eventually have to scale to a database system that delivers more firepower, such as SQL Server, MySQL, MariaDB, Oracle, or PostgreSQL.
If that's your case, chances are that you are already struggling with the migration of your tables from Access to one of the abovementioned databases. That is why we are going to show you how to do it most painlessly using [dbForge Edge](https://www.devart.com/dbforge/edge/), a multidatabase solution that covers all of these systems with comprehensive IDEs. There are four IDEs included in dbForge Edge: [Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), [Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), and [Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), all of which offer support for up to 14 of the most widely used data formats for import and export. And without further ado, let's show you how to bulk import table data to a SQL Server database using the corresponding Studio.

Contents: Prerequisites, Bulk data import from Microsoft Access to SQL Server, Error handling, Adapting the process for other databases.

Prerequisites

Now what do you need to get started?

- Your source Access database: one or multiple MDB files that contain all the tables you want to import
- Your target SQL Server database, fully set up as you require and ready to have your data imported
- dbForge Edge, [downloaded for a free 30-day trial](https://www.devart.com/dbforge/edge/download.html) and installed on your machine; alternatively, you can [download](https://www.devart.com/dbforge/sql/studio/download.html) and install dbForge Studio for SQL Server individually

Once it's all ready, open the Studio and establish a connection to your target SQL Server database.

Bulk data import from Microsoft Access to SQL Server

Before we start, let us note that currently, the methods of importing individual and multiple tables from Access to SQL Server are different.
In case you need a detailed guide on importing one Access table at a time (carefully going through data format settings, table mapping, import modes, error handling, and so on), refer to our blog post "[How to Convert a Microsoft Access Database to a SQL Server Database](https://blog.devart.com/convert-microsoft-access-to-sql-server.html#import)". When it comes to bulk import, we'll choose a different way that involves the command line. Take a look at the following example.

```
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:access.dit /table:sakila.dbo.country /inputfile:dbo.country.mdb /inputtable:country
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:access.dit /table:sakila.dbo.city /inputfile:dbo.city.mdb /inputtable:city
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:access.dit /table:sakila.dbo.address /inputfile:dbo.address.mdb /inputtable:address
```

This example shows the import of three tables; in the same way, you can do it for dozens and even hundreds of tables. First, you need to write down a similar set of commands for all the tables you want to import, where:

- In /inputfile, you specify the name of the required MDB file
- In /inputtable, you specify the name of the source table from the said MDB file
- In /table, you specify the full name of the target SQL Server table, which includes the corresponding database

You should also take note of /templatefile, which must be created in dbForge Studio beforehand. To create it, go to the Database menu > Import Data to open the Data Import Wizard. Select MDB as the required format. In the lower left corner of the Wizard, select Save Template from the dropdown menu next to the Save button, specify the name and path of your template, and click Save.
For everything to run correctly, make sure that the template and all the involved files are stored on your main C: drive. Optionally, in the same lower left corner of the Wizard, you can go to Save > Save Command Line to open the Command line execution file settings. Paste your set of commands into the text box and click Validate to make sure they don't contain any errors. Now let's see our example in action, shall we? We've got empty tables, ready to be filled with data. We open the Command Prompt and run the commands. Note that it is obligatory to open the Prompt as an administrator. Done! Now let's make sure that our data has been successfully imported. Now let us briefly show you another example. What if you need to run your bulk import from Access to SQL Server through an ODBC connection? Then the commands may look as follows:

```
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:odbc.dit /table:database.dbo.country /inputtable:dbo.country
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:odbc.dit /table:database.dbo.city /inputtable:dbo.city
"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com" /dataimport /templatefile:odbc.dit /table:database.dbo.address /inputtable:dbo.address
```

As you can see, the syntax is similar to that of the previous example. Note that the source database is specified when configuring your ODBC connection. By the way, if you work with ODBC, you can get your fair share of benefits from [Devart ODBC Drivers](https://www.devart.com/odbc/), high-performance connectivity solutions that cover any integration you might require, enabling fast and secure access to data. Specifically, we can suggest [ODBC Driver for SQL Server](https://www.devart.com/odbc/sqlserver/), which, among other things, delivers seamless compatibility with Access.
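Writing one such command per table by hand gets tedious once you have dozens of tables; a short script can generate the batch instead. A sketch assuming the studio path, template file, and table names from the Access example above:

```python
# Hypothetical helper: generate one dbForge import command per table.
STUDIO = r'"C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"'
TEMPLATE = "access.dit"

tables = ["country", "city", "address"]  # source/target table names from the example

commands = [
    f"{STUDIO} /dataimport /templatefile:{TEMPLATE} "
    f"/table:sakila.dbo.{t} /inputfile:dbo.{t}.mdb /inputtable:{t}"
    for t in tables
]
print("\n".join(commands))
```

The printed lines can be pasted into the Wizard's Validate box or saved as a .bat file and run from an administrator Command Prompt, as described above.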
Error handling

To configure error handling behavior, add /errormode to your command. The possible options are as follows:

- /errormode:ignore — ignores all errors
- /errormode:abort — aborts the operation upon encountering an error

This is what it looks like in the Data Import Wizard.

Adapting the process for other databases

Now what if you need to perform the same operation with another database system? Never easier: just do the same thing in another corresponding Studio with your code adjusted accordingly. For instance, this is what the import of a table from the abovementioned example will look like in dbForge Studio for Oracle:

```
"C:\Program Files\Devart\dbForge Studio for Oracle\dbforgeoracle.com" /dataimport /templatefile:access.dit /table:database.dbo.country /inputfile:country.mdb /inputtable:country
```

Download dbForge Edge for a free 30-day trial today! The capabilities of each Studio included in dbForge Edge go far beyond data migration. Let us briefly list the essentials that you will most likely need in your daily work as well:

- Visualization of database structures on ER diagrams
- Visual table design
- Enhanced SQL coding: context-aware completion, formatting, refactoring, and debugging
- Visual query building that does not require any coding
- Query optimization via profiling
- Flexible data management in a visual editor
- Identification of differences in database schemas and table data
- Quick synchronization of source and target databases
- Data analysis: data aggregation in visual pivot tables, observation of related data in Master-Detail Browser, and generation of detailed data reports
- Generation of realistic test data
- Generation of full database documentation
- Administration: database and server monitoring, backup/recovery, user and session management, and more

You can get started with dbForge Edge today—just [download it for a free month-long trial](https://www.devart.com/dbforge/edge/download.html) and give it a go to see the full power of its capabilities.
We bet they won't leave you indifferent.
"product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Automating Bulk Data Import from MS Access to SQL Server By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) September 15, 2023 [0](https://blog.devart.com/bulk-import-from-ms-access-to-sql-server.html#respond) 2264

One of the daily tasks of database administrators is data migration between different databases. When dealing with large volumes of data, bulk data import comes in handy. It involves loading data from a file into a SQL Server table using the BCP utility, a SQL BULK INSERT statement, the command line, or SQL Server export/import tools such as dbForge Data Pump. In this article, we'll examine how to transfer data in bulk from MS Access tables to SQL Server using dbForge Data Pump and the PowerShell command line.

Contents
- SQL Bulk Insert vs BCP utility
- Introduction to Data Pump
- Retrieving all tables from an MS Access database
- Create a template to import settings
- Automating bulk data import

For migration of large datasets, manual data entry can be time-consuming. Bulk data import is the better option: it quickly transfers data between sources without the need to enter each record manually and reduces the risk of data entry errors. In addition, you can automate and schedule bulk data import to maintain regular data synchronization and keep data up to date. Data import also lets you move data between tables in different formats. As you can see, there are plenty of benefits to bulk data import, so let's take a quick look at how it can be implemented, starting with the SQL BULK INSERT statement and the BCP utility.

SQL BULK INSERT vs BCP utility

The BULK INSERT statement is used to insert large volumes of data from an external file into a SQL Server table or view.
It is suitable for scenarios where data loading speed and efficiency are critical. The statement has the following syntax:

```sql
BULK INSERT table_name
FROM 'path_to_file'
WITH (options);
```

where:

- table_name is the name of the table into which the data will be inserted, specified in the database_name.schema_name.table_name format.
- path_to_file is the full path to the file with the data you want to import.
- options are the configuration options for the bulk insert operation.

Another popular way to import data in bulk is the bcp command-line utility. BCP stands for Bulk Copy Program. It allows users to quickly import/export data between files and SQL Server tables or views. BCP is designed for high-performance data transfer: it can efficiently handle large datasets and is often used for data migration, data warehousing, and ETL (Extract, Transform, Load) processes. BCP lets you specify various options, including the data format, field and row terminators, and authentication details, to tailor the import or export process to your needs. An example of a BCP command that imports data from a CSV file into a SQL Server database table could be:

```
bcp MyDatabase.dbo.MyTable in datafile.csv -c -t, -T
```

where:

- MyDatabase.dbo.MyTable is the target table into which you want to import data. In this example, you are importing data into the "MyTable" table located in the "dbo" schema of the "MyDatabase" database.
- in indicates that you are performing an import operation, i.e. copying data from an external file into the specified table.
- datafile.csv is the external data file (CSV file) from which you want to import data.
- -c indicates that you are using the character data format during the import. It is used for importing text-based data such as CSV files.
- -t, sets the field terminator to a comma ( , ), which is common for CSV files. This parameter tells BCP how to recognize the boundaries between fields in the CSV data. Note that the field terminator is the lowercase -t switch.
- -T tells BCP to use a trusted connection (Windows Authentication) instead of passing a login and password.
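To make the BULK INSERT syntax above concrete, here is a hedged example. The table and file names come from the bcp example in this section, but the C:\Data path and the CSV layout (header row, comma-separated fields, one record per line) are illustrative assumptions:

```sql
-- Illustrative sketch: adjust the path and options to your actual CSV layout.
BULK INSERT MyDatabase.dbo.MyTable
FROM 'C:\Data\datafile.csv'
WITH (
    FIRSTROW = 2,            -- skip the CSV header row
    FIELDTERMINATOR = ',',   -- comma-separated fields
    ROWTERMINATOR = '\n',    -- one record per line
    TABLOCK                  -- table-level lock for faster bulk loading
);
```

Note that FIELDTERMINATOR here plays the same role as the comma terminator in the bcp example above.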
While both BULK INSERT and BCP serve bulk data operations, they are implemented differently. Let's explore their similarities and differences.

Similarities:
- Efficient transfer of large volumes of data into SQL Server tables.
- Optimized for high-performance data loading.
- Support for data import from external sources into SQL Server tables.

Differences:
- BCP is a command-line utility for importing/exporting data to and from SQL Server, while BULK INSERT is a Transact-SQL statement executed within SQL Server that can only import data.
- BCP is typically used by database administrators and developers familiar with command-line tools, so it may require more setup and configuration. BULK INSERT is more accessible to database developers and SQL Server users who are comfortable writing SQL queries.
- BCP may require passing login credentials and passwords as command-line arguments, which can be a security concern. BULK INSERT runs under the SQL Server login executing the statement, which can be more secure, especially when integrated with SQL Server security mechanisms.

Introduction to Data Pump

When you need to export/import data between external files and SQL Server databases easily and quickly, [dbForge Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/) is your go-to solution. It is a robust and versatile data migration add-in for SQL Server Management Studio (SSMS), designed for populating databases with external source data and transferring data between various database systems, regardless of their formats or structures. The tool supports popular data formats for export and import, including Excel, CSV, XML, JSON, Text, and MS Access. In addition, you can create a template file with the settings for an export/import operation and then reuse it for repetitive scenarios. Let's import data from MS Access tables into new SQL Server tables using dbForge Data Pump.
Retrieving all tables from an MS Access database

To begin with, prepare the Microsoft Access file that contains the data to be imported into SQL Server tables. First, retrieve a list of all tables from the MS Access database by executing the following SELECT statement:

```sql
SELECT name FROM MSysObjects WHERE type = 4
```

where:

- name specifies the "name" column you want to get from the "MSysObjects" table. This column typically stores the names of various database objects, including queries.
- MSysObjects is a system table in Microsoft Access that stores information about various database objects.
- type = 4 is a condition that filters the results. In Microsoft Access, a "type" value of 4 typically refers to tables or saved queries. Thus, the query selects only the rows of "MSysObjects" whose "type" column equals 4.

Once done, save the file with the .accdb extension. Next, move on to importing data using dbForge Data Pump. Let's first ensure that the tables we will import data into are empty. To do this, open SSMS and execute SELECT statements to retrieve the data. As you can see, the tables currently have no data, and we can proceed to populate the SQL Server tables.

Create a template to import settings

To make things easy, we'll create a template with predefined import settings in the Data Import wizard available in dbForge Data Pump. Templates save time and effort by eliminating the need to configure import settings manually for each operation. Instead of recreating settings from scratch, you can reuse a template with the desired settings, which simplifies the import process and reduces potential human errors. Beforehand, download the Microsoft Access Database Engine to install the components that enable data transfer between Microsoft Access files and non-Microsoft Office applications and other data sources such as Microsoft SQL Server.
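For reference, the emptiness check mentioned above can be as simple as a row count per target table. The table name below is a hypothetical placeholder (Olympic_Games_Dev is the target database used later in the automation script):

```sql
-- Hypothetical target table; run one such check per table you plan to populate.
SELECT COUNT(*) AS RowsBefore
FROM Olympic_Games_Dev.dbo.Country;
-- A target is ready for a clean import when RowsBefore is 0.
```

Also make sure the Microsoft Access Database Engine mentioned above is actually installed before opening the wizard.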
Otherwise, the Import wizard will show the error: When all preparations are done, in SSMS Object Explorer, right-click the required database and select Data Pump > Import Data to open the Data Import wizard. In the wizard, choose the .mdb format and the source file from which data will be imported, and click Next. On the Destination page, select a source table and a target connection, a database, and a table to import data to. Then, click Next. If you start the Data Import wizard from Object Explorer, it opens with the predefined connection parameters of the selected table. Optionally, you can go through all the wizard pages to specify the necessary information and settings. At the bottom of the wizard, click Save Template, specify the name and path of your template, and click Save to create the template. For everything to work properly, it is recommended to store the template and all related files on your main hard drive (D: in this example). Done! The template is ready to use. Now we can import data in bulk with a PowerShell script.

Automating bulk data import

A PowerShell script is a collection of commands that PowerShell executes in sequence to perform different actions. Let's create a PowerShell script with comments explaining each step it performs to import data in bulk.

1. Start the Windows PowerShell Integrated Scripting Environment (ISE) console.

2.
On the toolbar, click New Script and enter the following instructions:

```powershell
# Define the path to the Microsoft Access database
$dbPath = "D:\og.accdb"

# Path to your dbForge Data Pump installation
$diffToolLocation = "C:\Program Files\Devart\dbForge SQL Tools Professional\dbForge Data Pump for SQL Server\datapump.com"

# Define the provider and the data source
$connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=`"$dbPath`";"

# Create a new OleDb connection
$connection = New-Object System.Data.OleDb.OleDbConnection($connectionString)

# Open the connection
$connection.Open()

# Get the schema information for the data source
$tables = $connection.GetSchema('Tables')

# Filter out system tables to get only user tables
$userTables = $tables | Where-Object { $_.TABLE_TYPE -eq 'TABLE' -and $_.TABLE_NAME -notlike 'MSys*' }

# Process each table
$userTables | ForEach-Object {
    $tableName = $_.TABLE_NAME
    Write-Output "Processing table: $tableName"

    # Command-line string for importing data into your SQL Server database
    $process = Start-Process -FilePath $diffToolLocation -ArgumentList "/dataimport /templatefile:`"D:\imptemplate.dit`" /connection:`"Data Source=;Integrated Security=True;User ID=`" /inputfile:`"D:\og.accdb`" /inputtable:$tableName /table:`"Olympic_Games_Dev.dbo.$tableName`" /errormode:abort" #-PassThru -Wait -windowstyle hidden

    # If you need to process the tables one by one to reduce server load - uncomment it.
    #Start-Sleep -Seconds 10
}

# Close the connection
$connection.Close()
```

3. Substitute the following arguments with your relevant data:

- $dbPath : Path to the MS Access source file from which data should be imported.
- $diffToolLocation : Path to the installation folder of dbForge Data Pump.
- $connectionString : OLE DB provider used to interact with Microsoft Access databases in the .accdb file format.
- /templatefile : Path to the template .dit file you created in the Data Import wizard.
- /connection : Connection parameters for the destination database you want to import to.
- /inputfile : Name of the source file from which data should be imported.
- /table : Full name of the target SQL Server table, including the corresponding schema and database, for example, Olympic_Games_Dev.dbo.$tableName .

Note that we declared a variable, $tableName, which means that all tables available in the database will be processed, and you won't need to enter their names manually. This simplifies the process and reduces the risk of mistakes. Once the script is created, execute it by clicking Run Script on the toolbar. Done! Now, get back to dbForge Data Pump to make sure the data has been transferred from MS Access to the SQL Server database tables. To do this, execute the SELECT statements. The output displays the tables that have been populated with data.

Conclusion

To sum it up, bulk data import is a critical task in the daily activities of database administrators. In this article, we briefly reviewed the methods of transferring data between different databases, namely [how to import data from Microsoft Access to SQL Server](https://docs.devart.com/data-pump/importing-data/import-from-access.html). We have also demonstrated the easiest way to import data in bulk using dbForge Data Pump and a PowerShell ISE script: all you need to do is create a template, replace the arguments in the script with your actual data, and run the script. [Download](https://www.devart.com/dbforge/sql/sql-tools/download.html) a free 30-day trial version of dbForge SQL Tools, including dbForge Data Pump, to explore the capabilities of every tool in the package.
Tags [bulk data import](https://blog.devart.com/tag/bulk-data-import) [import data from MS Access to SQL Server](https://blog.devart.com/tag/import-data-from-ms-access-to-sql-server) [sql bulk insert](https://blog.devart.com/tag/sql-bulk-insert) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys making complex tech useful. "}
{"url": "https://blog.devart.com/bulletproof-database-synchronization-with-dbforge-schema-compare-for-sql-server-1-50.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Bulletproof database synchronization with dbForge Schema Compare for SQL Server v1.50 By [dbForge Team](https://blog.devart.com/author/dbforge) September 17, 2009 [0](https://blog.devart.com/bulletproof-database-synchronization-with-dbforge-schema-compare-for-sql-server-1-50.html#respond) 2797 Devart, a vendor of native connectivity solutions and [database development tools](https://www.devart.com/dbforge/) for Oracle, SQL Server, MySQL, PostgreSQL, InterBase, Firebird, and SQLite databases, has announced the release of [dbForge Schema Compare for SQL Server 1.50](https://www.devart.com/dbforge/sql/schemacompare/), a sophisticated tool designed to handle a diversity of comparison tasks, help analyze schema differences at a glance, and synchronize them correctly, saving time and effort. With the new release, Devart continues its dedication to providing a line of safe as well as powerful tools for SQL Server database synchronization. The highlights of Schema Compare for SQL Server 1.50 include: * Table data verification after synchronization dbForge Schema Compare for SQL Server 1.50 moves forward in delivering safe synchronization. The present-day market, saturated with all sorts of speedy, over-featured tools for schema comparison and synchronization, leaves many developers insecure while managing their schemas. The mere capability to synchronize any type of schema does not always guarantee correct results, and speed means little if any data is lost.
Following best practices in database synchronization, dbForge Schema Compare for SQL Server 1.50 checks table data after synchronization and notifies you if any data loss has occurred. This approach eliminates stumbling blocks on the table modification path, giving you more control and safety. With the data verification capability, developers can be sure that table changes as common as table recreation or changing the data types of table columns are protected by timely detection of any data loss. * Full comparison and synchronization: all database objects are supported The latest version of dbForge Schema Compare for SQL Server can compare and synchronize all database objects in SQL Server 2000, 2005, and 2008 databases while preserving high performance and safety. The expanded object support facilitates cross-version comparison and frees developers from limitations based on a list of supported objects. * Support for schema snapshots One more major novelty in dbForge Schema Compare for SQL Server 1.50 is the capability to take schema snapshots and compare them with each other or with a live database. Snapshots make it possible to develop databases without a direct connection to SQL Server, saving time and traffic. In addition, snapshots greatly contribute to database version control, giving any member of the development team a simple way to compare and analyze the changes made to the database. Pricing and Availability [Download dbForge Schema Compare for SQL Server 1.50](https://www.devart.com/dbforge/sql/schemacompare/download.html) now and check out the benefits yourself. The license price of dbForge Schema Compare for SQL Server 1.50 is $149.95, and it comes with one year of free technical support. Go to the [Feedback Page](https://www.devart.com/dbforge/sql/schemacompare/feedback.html) and tell us what you think about the new version. We are looking forward to your comments and suggestions.
Tags [Schema Compare](https://blog.devart.com/tag/schema-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/case-study-how-the-engineers-of-smd-found-a-way-to-process-millions-of-database-records-faster-with-dbforge-studio-for-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Case Study: How the Engineers of SMD Found a Way to Process Millions of Database Records Faster With dbForge Studio for MySQL By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) September 22, 2023 [0](https://blog.devart.com/case-study-how-the-engineers-of-smd-found-a-way-to-process-millions-of-database-records-faster-with-dbforge-studio-for-mysql.html#respond) 1717 Here comes the story of SMD (which stands for Specialized Media Dashboard), an international project that comprises an open-source media monitoring system based on Google News. The specialized media in question encompass the spheres of journalism, law enforcement, and climate change. For instance, their most recent project under development is an extensive database-centered system dedicated to climate change crisis news, raising awareness of critical global warming issues whose relevance is expected to grow over the next ten years. This system will expose environmental crimes and thus take part in the world's efforts to prevent global climate change. The challenges Yet today's case dates back to early 2020, when the world witnessed the beginning of the COVID-19 pandemic. It was only natural for Specialized Media Dashboard to focus their coverage on this issue, aggregating, analyzing, and processing related news articles in vast numbers. That's a huge amount of data, and it caused a number of gradually growing challenges.
- The sheer volume of incoming information kept increasing until it became too large to handle
- This created the need to populate SMD's MariaDB databases with all that data automatically
- The SMD team also needed to sort that data by categories, sub-categories, and keywords

That's where the trouble was. SMD's then-current database tools were simply unable to handle those challenges; they lacked the means and capacity to process that much data. But a little research led SMD to [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), our multifunctional IDE that proved capable of manipulating and processing millions of records without any impact on performance. It was exactly what the software engineers of SMD were looking for. Daniel K., the Climate Change Crisis Project Coordinator, noted: "Thanks to its administration tools, SMD engineers were able to manipulate and process millions of records in a faster and more efficient way. We became 50% more productive." The results Besides the abovementioned performance, dbForge Studio delivered tools for the full automation of news processing. Articles could be easily sorted, automatically sent to Google documents, and then exported to MariaDB databases in the blink of an eye. The entire process was conveniently streamlined, much to the relief of the team. It is also worth noting that SMD became this year's finalist for a prestigious MariaDB award in the Database Transformation of the Year category. Daniel noted that this accomplishment would not have been possible without the contribution of our tools. We are thankful for this, and we are happy to be part of a noble cause. The key features of the Studio noted by the SMD team Flexible import and export of data [Data export and import](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) between different sources and MySQL/MariaDB databases have never been easier with dbForge Studio at hand.
The supported formats include HTML, TXT, XLS, XLSX, MDB, RTF, PDF, JSON, XML, CSV, ODBC, DBF, SQL, and Google Sheets. For each of these formats, dbForge Studio provides a rich set of flexible settings, options, and, of course, CLI-powered automation capabilities that helped the SMD team streamline their routine operations. Backup and recovery When working with such huge volumes of data, it is also vital to make sure that accidental data loss is not an issue. That’s why Daniel pinpointed the [reliable backup functionality](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html) of the Studio. Like with the previous feature, automation does a great job here and relieves the team of tiresome manual operations, considering how much data they have to work with. Database comparison and synchronization Lastly, Daniel’s team singled out the handy built-in tools for the comparison and synchronization of [database schemas/objects](https://www.devart.com/dbforge/mysql/studio/mysql-database-schema-compare.html) and [actual table data](https://www.devart.com/dbforge/mysql/studio/mysql-data-comparison.html) , which can be easily carried out in the Studio. Associated features include quick comparison of a source and a target, convenient highlighting of differences for analysis, and instant generation of synchronization scripts that can be either run immediately or scheduled for execution. Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [success story](https://blog.devart.com/tag/success-story) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/centralized-vs-distributed-version-control.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) What is the Difference Between Distributed and Centralized Version Control Systems By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) May 15, 2024 [0](https://blog.devart.com/centralized-vs-distributed-version-control.html#respond) 1623 Safe and efficient source code management is crucial during application development. The code should be stored securely, with all changes meticulously documented so that errors can be caught and fixed, especially when multiple developers are involved. This is typically done using version control systems (VCS). Version control enables teams to collaborate effectively, reduce risks, and maintain stability. This article will discuss version control systems, including their types, the advantages and disadvantages of each, and the significant role they play in modern software delivery processes.

Contents
- What is Version Control
- Centralized Version Control Systems
- Distributed Version Control Systems
- Centralized vs Distributed
- Advantages and disadvantages
- Reliability and fail-safes
- Learning curve and complexity
- Are centralized systems obsolete?
- Conclusion

What is Version Control

A version control system (VCS) is software that records changes made to source code, including separate files, sets of files, and any digital assets and related project metadata. This way, the system tracks all project changes and allows developers to revert files to specific versions as needed. Without version control, projects risk becoming disorganized, with numerous file versions floating around. A VCS also lets all team members work simultaneously on the latest file version of a project. This ensures faster and more efficient product development and delivery, crucial in DevOps workflows.
Moreover, a VCS allows conflicts to be detected and resolved before issues reach production. Overall, version control systems offer several benefits:

- Version management: VCS stores all versions of changes, down to the smallest developer commit.
- Version restoration: Developers can restore specific versions to address conflicts or errors.
- Security: VCS provides role-based access control, preventing unauthorized code access.
- Integrity: Integration with other DevOps tools ensures stable processes.
- Collaboration: VCS facilitates collaborative teamwork with a focus on individual work goals.
- Backup: Modern VCS can serve as a backup for project codebases.

There are two main types of version control systems: Centralized Version Control Systems (CVCS) and Distributed Version Control Systems (DVCS). Let's explore both.

Centralized Version Control Systems

A centralized version control system operates on a client-server model. In this setup, the central server hosts the master repository, which contains all versions of the code. Developers begin by pulling the latest source code version to their local machines to make modifications. Once changes are made, they commit them to the central repository, where conflicts are resolved and the updates are merged. In this model, only one developer can modify a specific piece of code at a time: the system locks the file to prevent other developers from accessing it while it is being edited. Team members may create branches to work independently, but all changes are eventually committed to the central repository. After merging, the server unlocks the files. The centralized approach is best suited for smaller teams where direct communication and coordination are feasible, as this is vital for maintaining an effective workflow. Subversion (SVN) is a widely used centralized version control system. Unlike other systems that emphasize branching, SVN manages all project files in a single line of development.
It simplifies scaling for large projects but requires comprehensive, robust security policies to control access to different areas of the project.
Distributed Version Control Systems
Distributed version control systems (DVCS) function similarly to centralized models in most aspects, but with a significant difference: instead of a single server holding the repository, each developer maintains their own repository on the local machine. That local repository contains the entire history and all branches. In practice, using a DVCS means that each user has a mirrored copy of the entire repository on their local machine. They can branch, commit, and merge changes locally; the server does not need to store physical branch files, only the committed differences. After making changes, the developer commits them to the local repository. At this stage, the local repository is still separate from the master repository, so the two contain different sets of changes. Developers don’t directly merge their code into the master repository. Instead, they request to push these changes from their local copy to the master repository. The main advantage of the distributed model is that such systems allow users to work independently, even without a direct connection to the central repository. Therefore, even a failure in the central repository won’t affect local work. Besides, with code review processes in place, only clean, high-quality code is merged into the main repository. Although a DVCS can be complex, especially for new developers, the benefits of the distributed model justify investing time and effort in mastering these systems: multiple developers can collaborate efficiently and deliver excellent software. The most widely used examples of DVCS are Git and Mercurial. Git stands out as the most popular version control system overall.
It’s an open-source DVCS suitable for projects of any size and complexity, widely used in startups and enterprises alike. Mercurial is another DVCS with straightforward branching and merging features, ideal for scalable projects, and it offers an intuitive user interface. This visual mode makes it easy even for new users to quickly grasp the functionality and work efficiently.
Centralized vs Distributed
Let us review the specific features of both systems in the comparison table:

| | Centralized | Distributed |
|---|---|---|
| Repository | Single central repository | Separate cloned repositories for every user |
| Connection | Requires a constant connection to the central repository | Does not require a constant connection; can work offline |
| Branching and merging | Limited | Extensive |
| History | The central repository contains the version history | Each local repository has a full version history |
| Access control | Permissions are set at the central server | Permissions are set at both local and remote repositories |
| Speed | Slower, as operations rely on the central server | Faster, as most tasks are done locally |
| Data loss risk | Failures on the central server can lead to data loss; comprehensive backups are required | A full copy of the repository is available to each user, reducing data loss risks |
| Collaboration | Requires constant coordination with the central server | Allows concurrent work and less centralized control |
| Adaptability | More rigid | More adaptable |

Let’s also explore the typical workflow for these two types.
Centralized VCS workflow:
1. Users connect to the central repository where all project files are stored and download the latest version to their local machines, also updating their local copies to minimize conflicts.
2. Users make changes to the files on their computers.
3. Once all local changes are finalized, users commit them to the central repository, making their changes accessible to all other users.
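The three centralized workflow steps above (pull the latest version, edit locally, commit back) can be condensed into a toy Python sketch. `CentralRepo`, `checkout`, `commit`, and `restore` are illustrative names invented for this example, not the API of SVN or any real VCS:

```python
class CentralRepo:
    """Toy central server: the only place where history is stored."""

    def __init__(self):
        self.versions = []  # full history lives solely on the server

    def checkout(self):
        # Step 1: a client downloads the latest version.
        return self.versions[-1]["content"] if self.versions else ""

    def commit(self, content, author):
        # Step 3: finalized local changes go back to the central repository.
        rev = len(self.versions) + 1
        self.versions.append({"rev": rev, "author": author, "content": content})
        return rev

    def restore(self, rev):
        # Version restoration: fetch any earlier revision by number.
        return self.versions[rev - 1]["content"]


server = CentralRepo()
server.commit("v1: initial code", author="alice")

working_copy = server.checkout()        # step 1: pull the latest version
working_copy += "\nv2: fix off-by-one"  # step 2: edit on the local machine
rev = server.commit(working_copy, author="bob")  # step 3: commit back -> rev 2
```

Note that the server is a single point of failure here: lose `server.versions` and the whole history is gone, which is exactly the risk the article raises for centralized systems. File locking and server-side conflict resolution are omitted for brevity.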
Distributed VCS workflow:
1. Every developer has a local clone of the entire repository, including all branches and the change history. Work starts by updating this local copy with the latest changes from the central repository.
2. Developers work locally: creating work branches, making commits, and merging local branches into the master/main branch. These operations are possible even offline.
3. After committing and merging local branches, developers push changes to the remote master repository, making all their changes available to others.
The choice between these version control systems depends on each team’s specific goals and environment.
Advantages and disadvantages of CVCS vs. DVCS
Centralized VCS:
Pros:
- High performance for binary files
- Complete visibility of the code for all team members
- Lower learning curve for system setup and workflow implementation
Cons:
- Single point of failure: if the central server goes down, work halts
- Slower speed due to constant communication with the remote server
- Less flexibility in customizing workflows
Distributed VCS:
Pros:
- Excellent branching and merging capabilities
- Management of offline changes
- Redundancy and backup features
- High flexibility supporting diverse workflows
- Better performance in distributed mode
Cons:
- Steep learning curve requiring advanced technical skills
- Higher disk space consumption, as each clone contains the full project history
- Complex conflict management in heavily branched projects
Some of the aspects mentioned above are worth exploring in more detail.
Reliability and fail-safes
Ensuring data safety is crucial in software development. One key aspect that version control systems must address is redundancy and backup options. Distributed VCSs ensure redundancy and fail-safes through:
- Multiple copies of the entire project code, including the full history and all branches created by developers, distributed across various machines.
- Remote repositories hosted on platforms like GitHub, GitLab, and Bitbucket, which serve as backups. Even if a local copy is damaged or lost, the code remains safe remotely.
- Cloning repositories from remote servers, which is standard practice and enables users to recover all data in case of local failures or other issues.
In contrast, centralized VCSs have a single point of failure. Since all code and version history reside on one central server, any issue or damage there can lead to severe corruption or loss of the entire project. While backing up the project is critical and standard practice, restoring backups may cause workflow delays. This comparison underscores the distributed model’s advantage over the centralized one: DVCSs offer greater safety and resilience, making them less vulnerable to data loss.
Learning curve and complexity
A commonly cited disadvantage of distributed version control systems is their complexity, which demands deeper technical knowledge and skills. Users must also manage the complete history of all versions and navigate numerous branches. However, modern DVCS clients often feature user-friendly graphical interfaces that make it easier to master their functionality. Additionally, as DVCSs dominate the market, there is an abundance of learning materials available in various formats. In all cases, effective communication and coordination remain critical for successful software development in a distributed environment.
Are centralized systems obsolete?
Using distributed version control systems instead of centralized ones is widely advocated by experts today. DVCSs offer all the functionality of centralized systems and introduce additional features. The most popular version control systems belong to the distributed type. However, this doesn’t mean centralized systems are obsolete. They still excel for small teams working closely together on shared files in a single location.
Apache Subversion (SVN), a well-known CVCS, remains a preferred choice, especially in industries like game development. SVN’s strength lies in its efficient handling of large databases and binary files, along with an immutable change history that aligns well with stringent security requirements.
Conclusion
Version control plays a crucial role in modern software development. Distributed version control systems (DVCSs) have become the standard due to their robust functionality and security. Git, Mercurial, and other similar DVCSs are the default choices for most software development teams, although centralized VCSs are still used. The question isn’t whether to choose one type over the other but rather which solution best suits your team for a specific project. Integrating version control software into DevOps processes is a significant task. Many users prefer having a complete toolkit within the same platform to avoid switching between different software for various tasks. In response to this demand, software vendors have introduced integrated development environments (IDEs) with built-in version control features. For instance, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/studio-sql.html) is a multi-featured IDE that covers database development, management, and administration. The Studio, among other options, allows you to commit changes, revert them, and resolve conflicts directly from its interface. You can explore this solution for your workflows as well. A [fully functional free trial](https://www.devart.com/dbforge/sql/studio/download.html) lasting 30 days lets you test all the capabilities of the Studio under real workload conditions.
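The distributed workflow described earlier (every developer clones the full history, commits locally even while offline, then pushes to the remote master repository) can likewise be reduced to a toy Python sketch. `Repo`, `clone`, and `push` are illustrative names for this example, not the API of Git or Mercurial:

```python
import copy


class Repo:
    """Toy distributed repository: every copy carries the full history."""

    def __init__(self, history=None):
        self.history = list(history or [])

    def clone(self):
        # A clone mirrors the entire history, not just the latest snapshot.
        return Repo(copy.deepcopy(self.history))

    def commit(self, message):
        # Purely local operation: works with no connection to any server.
        self.history.append(message)

    def push(self, remote):
        # Simplified fast-forward check: the remote's history must be a
        # prefix of ours; otherwise the developer must pull and merge first.
        if remote.history != self.history[:len(remote.history)]:
            raise ValueError("push rejected: pull and merge first")
        remote.history = list(self.history)


remote = Repo()
remote.commit("initial commit")

local = remote.clone()       # the full history lands on the developer machine
local.commit("add feature")  # offline-capable local commits
local.commit("fix tests")
local.push(remote)           # publish the accumulated commits to the remote
```

The push guard loosely mirrors what Git calls a non-fast-forward rejection; real systems merge at the commit-graph level rather than comparing linear lists, but the key property holds: a lost server can be rebuilt from any clone, since each copy holds the whole history.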
Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [version control](https://blog.devart.com/tag/version-control) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms. "} {"url":
"https://blog.devart.com/champs-recap-award-winning-dbforge-database-solutions-in-q1-2023.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Champs Recap: Award-Winning dbForge Database Solutions in Q1 2023 By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) May 12, 2023 [0](https://blog.devart.com/champs-recap-award-winning-dbforge-database-solutions-in-q1-2023.html#respond) 2363 Now that Q1 2023 is over, it’s time to deliver our traditional champs recap and showcase the awards and badges garnered by dbForge solutions on independent review platforms over this period. Although [the previous recap](https://blog.devart.com/champs-recap-award-winning-dbforge-products-in-q2-q4-2022.html) set the bar quite high, we believe our recent achievements will not pale in comparison. Without further ado, let’s get started! dbForge Studio for MySQL Ready for a big one? Undoubtedly, our today’s biggest champ in terms of the sheer quantity of accolades is [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) , an IDE that makes the management of MySQL and MariaDB databases almost effortless—and now it has earned 14 new badges on G2 , which is an incredibly pleasant gesture of recognition on behalf of our users. And if that hasn’t been enough, the Studio has won the Most Affordable badge on SoftwareSuggest. Although technically these aren’t badges, we’d love to show the impressive ratings of the Studio on Capterra (4.9/5), GetApp (5/5), and Software Advice (4.9/5). None of that would have been possible without our users. 
We love you, and we’ll keep doing our best to maintain the quality and convenience of our tools at an all-time high. And if you don’t trust all those ratings, we gladly invite you to [download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) for a free 30-day trial and get some firsthand experience with its performance and functional capabilities. dbForge Studio for SQL Server Next comes [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), our full-fledged IDE for the development and management of SQL Server databases, which now holds the rank of G2 High Performer. This badge is earned by software products with the highest customer satisfaction scores in their respective categories. We’ve also got a badge called User most likely to Recommend Winter 2023 on SoftwareSuggest, another big review platform dedicated to business software with over 800 categories to choose from. We’re happy that the users of dbForge Studio for SQL Server find it so valuable. But you don’t have to take anyone’s word for it. Just [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html), give it a go, and see everything for yourself. dbForge SQL Complete The next product we’ll talk about is [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), a top-tier SSMS add-in that delivers advanced SQL code completion, formatting, refactoring, and debugging. It has scored the High Performer Spring 2023 badge on G2 and the User most likely to Recommend Winter 2023 badge on SoftwareSuggest. Additionally, SQL Complete has been included in the [Top 100 Bestselling Products of 2023](https://www.componentsource.com/help-support/bestselling-product-awards-2023) on ComponentSource. We’re just as proud of this achievement—and thankful to our users as well.
SQL Complete is available for a free 14-day trial, so feel free to [download it](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and double your coding speed in no time. dbForge Studio for Oracle Now let’s move on to [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) , which has scored 4 new badges on G2: High Performer Winter 2023 Easiest To Do Business With Winter 2023 High Performer Spring 2023 Easiest To Do Business With Spring 2023 Another badge worth mentioning is Great User Experience , awarded by SoftwareSuggest. In fact, consistently smooth user experience is one of our priorities, and it applies to all dbForge products. So if you frequently deal with the development and management of Oracle databases, you might as well find your perfect IDE in dbForge Studio. Simply [download it for a free month-long trial](https://www.devart.com/dbforge/oracle/studio/download.html) and give it a go today! dbForge Studio for PostgreSQL Last but not least comes [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) , which has seen [a rather big update](https://blog.devart.com/meet-dbforge-studio-for-postgresql-v3-0.html) recently. Now this Studio has scored the Faster Growing Software Products badge on the already mentioned SoftwareSuggest. It is also nice to see that the Studio also holds a consistent 5-star rating on Software Advice, yet another renowned platform focused on professional advice and personalized recommendations regarding business software. Like in all previous cases, you are free to [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) . 3 more Crozdesk awards for dbForge Studios and Data Compare for PostgreSQL Now what about Crozdesk, another major product review platform that helps businesses discover optimal software for the most precise and demanding needs? 
Well, all of our Studios (accompanied by [Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/) ) are no slouches here either, with the Trusted Vendor , Quality Choice , and Happiest Users awards—whose titles speak for themselves—all firmly in place. Get our new multidatabase solution for a free trial today! That’s it for today! If you haven’t joined the ranks of dbForge users yet, we’d love you to join in, because we’re sure you will find the right tools for your daily database-related tasks. And if these tasks cover more than one database system, we suggest you try [dbForge Edge](https://www.devart.com/dbforge/edge/) , a hefty bundle of four Studios with support for multiple databases and cloud services. Tags [Crozdesk awards](https://blog.devart.com/tag/crozdesk-awards) [Crozdesk badges](https://blog.devart.com/tag/crozdesk-badges) [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Data Compare for PostgreSQL](https://blog.devart.com/tag/dbforge-data-compare-for-postgresql) [dbforge studio](https://blog.devart.com/tag/dbforge-studio) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [g2 awards](https://blog.devart.com/tag/g2-awards) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [SoftwareSuggest awards](https://blog.devart.com/tag/softwaresuggest-awards) [SQL Server](https://blog.devart.com/tag/sql-server) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/champs-recap-award-winning-dbforge-products-in-q2-q4-2022.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools)
[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Champs Recap: Award-Winning dbForge Products in Q2-Q4 2022 By [dbForge Team](https://blog.devart.com/author/dbforge) November 24, 2022 [0](https://blog.devart.com/champs-recap-award-winning-dbforge-products-in-q2-q4-2022.html#respond) 2836 Following our [previous recap](https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q1-2022.html) that covered the awards garnered by dbForge products in Q1 2022, and a compelling selection of [DBTA 2022 Readers’ Choice Awards](https://blog.devart.com/devart-products-become-the-winners-of-dbta-2022-readers-choice-awards.html) , we believe now is the perfect moment to catch up with the rest of the year as it’s coming to a close. We have gathered all of the accolades in one place—and as it turned out, there’s a whole lot of them to tell you about. Now sit back, relax, and enjoy the ride! 2022 Capterra Shortlist: dbForge Studios & dbForge SQL Complete First off, the entire [dbForge Studio product line](https://www.capterra.com/p/196325/dbForge-Studio/) , alongside [dbForge SQL Complete](https://www.capterra.com/p/233932/SQL-Complete/reviews/) , made it onto the 2022 Capterra Shortlist as a result of a special assessment of user reviews and the average monthly search volume for a standardized set of keywords related to the product. In other words, shortlisted products are popular and held in high regard by their users. It is also worth noting that all four dbForge Studios consistently rank among the best database-centered solutions on Capterra. Most notably, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) has been recently spotted in [The Top 10 Free Database Software](https://www.capterra.com/resources/free-database-software/) and [9 Best Open Source Database Software](https://www.capterra.com/resources/open-source-database-software/) . 
GetApp Leaders 2022 in the Database category: dbForge Studios [dbForge Studios](https://www.devart.com/dbforge/studio/) were also noted as [GetApp Leaders 2022](https://www.getapp.com/it-management-software/a/dbforge-studio/) in the Database category. The lists of Leaders are based on ratings provided by end users in five main areas: ease of use, value for money, functionality, customer support, and likelihood to recommend. Now it is safe to say that, according to the GetApp users, the Studios excel at all of these parameters. Software Advice FrontRunners 2022: dbForge Studios & dbForge SQL Complete More recognition scored by [dbForge Studios](https://www.softwareadvice.com/database-management-systems/dbforge-studio-profile/) and [SQL Complete](https://www.softwareadvice.com/database-management-systems/sql-complete-profile/) ? You got it! The Software Advice FrontRunners 2022 award is based on recently published user reviews that score products on two primary dimensions: usability and customer satisfaction. 2022 Crozdesk Awards: dbForge Studios for MySQL, Oracle, and PostgreSQL We already had a [post](https://blog.devart.com/dbforge-studio-for-sql-server-scored-3-crozdesk-awards.html) dedicated to the most recent awards earned by dbForge Studio for SQL Server on Crozdesk. But this time, [all four Studios](https://crozdesk.com/search?utf8=%E2%9C%93&query=dbForge+Studio) can finally boast this achievement. Let’s take a closer look at these awards. Crozdesk Happiest Users 2022 – a badge scored by products with consistently high product ratings and reviews. A software product that aspires to get this badge must have an average user rating of at least 4.5/5.0 across a high number of ratings. This can be achieved by no more than 10% of all solutions on Crozdesk. Crozdesk Trusted Vendor 2022 – a badge meant for vendors with a considerable market presence or market share. 
This metric is based on an algorithmic estimate of the number of users, which is calculated by Crozdesk’s ranking algorithm. No more than 20% of all software solutions can get this badge as an award. Crozdesk Quality Choice 2022 – a badge earned by achieving a Crozscore (the platform’s internal scoring system) of 80/100 or higher. Only about one third of all software products on Crozdesk can achieve this. G2 Awards: dbForge Studio for MySQL Last but not least, here come the G2 Awards . And the undisputed G2 champ in our camp is [dbForge Studio for MySQL](https://www.g2.com/products/dbforge-studio-for-mysql-2018-12-04/reviews) , which effectively scored the following badges: Easiest To Use Small Business Fall 2022 Leader Small Business Fall 2022 Easiest Admin Fall 2022 Easiest To Use Fall 2022 Momentum Leader Fall 2022 Leader Fall 2022 High Performer Mid-Market Fall 2022 G2 Awards: dbForge Studio for Oracle Now when it comes to [dbForge Studio for Oracle](https://www.g2.com/products/dbforge-studio-for-oracle-2018-12-04/reviews) , it earned a more modest achievement on G2—in terms of quantity, that is. However, in terms of quality, both badges— High Performer Fall 2022 and Easiest To Do Business With Fall 2022 —are a valuable addition to our story. Especially the latter. Why the latter, you might ask? Well, that’s because we treasure lasting relationships with our clients, both corporate and individual ones, and we do our best to make things simple and transparent for them. Being easy to do business with might just be the best kind of recognition for us. G2 Awards: dbForge SQL Complete The final achievement for today is the High Performer Fall 2022 badge scored by [SQL Complete](https://www.g2.com/products/dbforge-sql-complete/reviews) on G2. That’s it for today! And there is no better way to conclude this story than to say that all this would not have been possible without the consistently positive response on behalf of our users. Thank you for your recognition. 
Thank you for inspiring us. And if you haven’t joined the ranks of [dbForge](https://www.devart.com/dbforge/) users yet, we’d really love to invite you to explore our products further—and we believe you will definitely find something to meet your needs, match your requirements, and make your daily work an easy breeze. If you are still hesitant, let us remind you that we adhere to the try-before-you-buy policy, and our tools come either with a free trial or in an absolutely free Express Edition. Tags [Crozdesk awards](https://blog.devart.com/tag/crozdesk-awards) [Crozdesk badges](https://blog.devart.com/tag/crozdesk-badges) [dbforge](https://blog.devart.com/tag/dbforge) [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [g2 awards](https://blog.devart.com/tag/g2-awards) [SoftwareSuggest awards](https://blog.devart.com/tag/softwaresuggest-awards) [sql complete](https://blog.devart.com/tag/sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q1-2022.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) Champs Recap: Devart’s Award-Winning Products in Q1 2022 By [dbForge Team](https://blog.devart.com/author/dbforge) April 19, 2022 [0](https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q1-2022.html#respond) 3099 It’s time for a brief recap that covers all the badges that dbForge products for MySQL, Oracle, and PostgreSQL were awarded in Q1 2022—all owing to the steadily positive response from our users on a variety of review platforms.
dbForge Studio for MySQL: 9 new badges on G2 [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a top-tier IDE for effective development, management, and administration of MySQL and MariaDB databases. It delivers multiple tools that help you build, execute, and optimize queries, streamline database object management, compare and synchronize database schemas, compare and analyze table data, perform smart refactoring, generate test data, and much more than that. Now it has an updated set of awards based on G2 user reviews. G2 users appreciate the sheer versatility of dbForge Studio for MySQL, as well as numerous individual features. You can take a look at their reviews on the [Reviews & Product Details](https://www.g2.com/products/dbforge-studio-for-mysql-2018-12-04/reviews) page. Additionally, you can [check a similar set of awards](https://blog.devart.com/absolute-trophy-champion-dbforge-studio-for-mysql-got-nine-awards.html) garnered by the Studio in January 2022. Please note that you can [download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) for a FREE 30-day trial and check all of its capabilities in action! dbForge Studio for Oracle: 2 new badges on G2 Two more awards go to [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) , another feature-rich IDE, this time designed for Oracle databases. The Studio also [earned similar badges](https://blog.devart.com/one-more-achievement-in-the-new-year-dbforge-studio-for-oracle-got-two-g2-badges.html) for its performance and popularity in winter 2022, and it shows no signs of stopping. Feel free to get acquainted with G2 user reviews on the [Reviews & Product Details](https://www.g2.com/products/dbforge-studio-for-oracle-2018-12-04/reviews) page. dbForge Studio for Oracle can also be [downloaded](https://www.devart.com/dbforge/oracle/studio/download.html) for a FREE 30-day trial , so don’t miss out on that opportunity! 
dbForge Studios: 6 new badges on SoftwareSuggest Now let’s move on to the next review platform—SoftwareSuggest—whose users earned our Studios for MySQL, Oracle, and PostgreSQL 2 new badges each. These are the badges earned by dbForge Studio for MySQL: These badges were earned by dbForge Studio for Oracle: Finally, these are the new badges of dbForge Studio for PostgreSQL: We haven’t talked about the latter yet, so here it is: [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is yet another integrated development and administration solution that doubles the performance of your SQL coding and streamlines your data management and reporting. So if you are actively dealing with PostgreSQL databases, you might as well consider this handy toolkit for daily use. Just [download it](https://www.devart.com/dbforge/postgresql/studio/download.html) and give it a go under the same FREE 30-day trial! 3 new Crozdesk awards We’re not through yet! The three abovementioned Studios, along with [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/), were awarded a few rather important badges on Crozdesk: As for Data Compare, it is the best way to locate and synchronize differences in table data across your PostgreSQL databases. The entire process can be easily customized, right up to the automated generation of synchronization scripts that ensure safe deployment of changes to the target database. Data Compare is also [available for a free 30-day trial](https://www.devart.com/dbforge/postgresql/datacompare/). dbForge Studio for MySQL: Best Database DevOps Software on TrustRadius Finally, we’d love to tell you about dbForge Studio for MySQL winning the title of [Best Database DevOps Software on TrustRadius](https://www.trustradius.com/database-devops). 
To quote TrustRadius, such solutions “allow for the DevOps cycle to be applied to databases specifically, and have feature sets that are more fine-tuned.” Additionally, they “can help with automating CI/CD for databases, database version control, and detecting database drift.” This perfectly applies to all of our Studios and other specialized tools from the dbForge product line. These awards are a real honor for us. And it’s all even better considering that it’s our users who are the real contributors here. So, dear users, all this would not have been possible without you, and we promise to do our best to keep up with your expectations—or exceed them as much as we can. Tags [Crozdesk awards](https://blog.devart.com/tag/crozdesk-awards) [Crozdesk badges](https://blog.devart.com/tag/crozdesk-badges) [dbforge](https://blog.devart.com/tag/dbforge) [g2 awards](https://blog.devart.com/tag/g2-awards) [SoftwareSuggest awards](https://blog.devart.com/tag/softwaresuggest-awards) [dbForge Team](https://blog.devart.com/author/dbforge) 
"} {"url": "https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q1-2024.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Champs Recap: Devart’s Award-Winning Products in Q1 2024 By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) March 19, 2024 [Last season was quite fruitful](https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q4-2023.html) in terms of awards scored by our products on independent review platforms. And today, we’ve got a lot more of those to share. Like in the previous case, the majority of awards comes from G2, one of the world’s biggest sources of business software reviews, where achievements are solely based on what real users think. And now, without further ado, let’s embark on our traditional overview! 
dbForge Studio for MySQL We’ll traditionally begin with the product that has garnered the biggest number of awards, and this time it’s [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) , our high-end IDE that helps you take care of MySQL and MariaDB databases without effort. First and foremost, the Studio has been ranked #1 on the list of [Best database design software of 2024](https://www.techradar.com/best/best-database-design-software) by TechRadar . Next, we’ve got quite a few new G2 badges, the most notable of which is Leader Winter 2024 , which has been secured by less than 3% of all products and services listed on G2, according to [the insights from their winter 2024 reports](https://company.g2.com/news/g2-winter-2024-reports) . We’ve also got a High Performer Winter 2024 badge from SoftwareSuggest . However, no badges can replace actual firsthand experience, so we gladly invite you to [download dbForge Studio for MySQL for a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) and explore all of its features, including the recently introduced [Source Control](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html) . We bet your daily work with MySQL and MariaDB databases won’t be the same afterwards. dbForge for SQL Server The next few winners all belong to the [dbForge product line for SQL Server](https://www.devart.com/dbforge/sql/) . And before we scrutinize them one by one, we’d like to mention that the entire line has received the Customers Love Us award on SourceForge , a resource where potential buyers can explore, compare, and review business software and IT services. dbForge Studio for SQL Server Our runner-up this time is [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , our equally powerful IDE that delivers everything you might ever need to develop and manage SQL Server databases. It’s got a strong start this year with 11 new badges scored on G2 . 
If you are not using this Studio yet, now is the perfect moment to give it a try. Simply [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and see what it offers. We believe that its capabilities and performance won’t leave you disappointed. dbForge SQL Complete Next comes [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , as good an add-in for SSMS and Visual Studio as one can get. It was designed to drastically expand your SQL coding capabilities with context-aware code completion and relevant object suggestions, code formatting with predefined and custom profiles, a library of SQL snippets, smart refactoring, and an integrated T-SQL debugger to boot. All in all, it has 3 new badges on G2 : Easiest to Use (indeed!), High Performer , and Users Love Us (and we love them just as well). So, if you are a dedicated user of SSMS, or if you are using Visual Studio to manage SQL Server databases, and you’re not planning to switch to another IDE anytime soon, feel free to [download SQL Complete for a free 14-day trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and see how it can speed up your daily SQL coding and help you produce more output with less effort. dbForge SQL Tools The previously mentioned SQL Complete is part of a bundle called [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) , which includes 15 standalone apps and SSMS/Visual Studio add-ins covering a diversity of tasks related to SQL Server databases. The name of its newly scored G2 award, Users Love Us , speaks for itself. Want to see dbForge SQL Tools in action? Not a problem at all – [download them for a free trial](https://www.devart.com/dbforge/sql/sql-tools/download.html) and see how you can fine-tune your daily work with their timely and well-focused help. dbForge Studio for Oracle What if you work with Oracle Database? 
Well, we’ve got a suitable [Studio](https://www.devart.com/dbforge/oracle/studio/) for you, and it has earned this winter’s High Performer and Easiest To Do Business With badges on G2, repeating the success of the previous season. If Oracle Database is your main focus, and if you are in search of an advanced toolset to match your requirements, you might as well take a look at this Studio. As usual, you can [get started right now with a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html). dbForge Edge Finally, we’ll talk a bit about [dbForge Edge](https://www.devart.com/dbforge/edge/), our multidatabase solution that covers a rich selection of the most popular database systems and cloud services with a suite of consistently designed and feature-rich IDEs. This is where we go back to SourceForge and its Customers Love Us award. All of the abovementioned awards are primarily based on real user feedback, which makes them all the more precious. We’d like to thank everyone who contributes to the high reputation of dbForge products, we’d like to thank every single user, and we’d love to invite you personally to join our ranks, if you haven’t done it yet. Take a look at the tools we offer; you will definitely find something to fit your workflow to a tee. 
Tags [Awards](https://blog.devart.com/tag/awards) [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [dbForge SQL Tools](https://blog.devart.com/tag/dbforge-sql-tools) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [g2 awards](https://blog.devart.com/tag/g2-awards) [MySQL](https://blog.devart.com/tag/mysql) [SQL Server](https://blog.devart.com/tag/sql-server) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q2-q3-2023.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Champs Recap: Devart’s Award-Winning Products in Q2-Q3 2023 By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) September 27, 2023 As September is slowly but steadily coming to a close, it’s time to deliver a summary of our achievements on independent review platforms over Q2-Q3 2023. [Our previous recap](https://blog.devart.com/champs-recap-award-winning-dbforge-database-solutions-in-q1-2023.html) was rather rich with awards, and this time we hope you won’t be disappointed either. After all, it’s you who can directly affect the distribution of these awards, and it’s all based on your opinion and satisfaction as the user. Without further ado, let’s get started! 
dbForge Studio for MySQL This isn’t the first time that our flagship IDE for MySQL/MariaDB database management, [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), gets the biggest number of well-deserved accolades, and we’ll start with 9 newly garnered badges that further cement the reputation of this Studio as one of the best-reviewed database products on G2, one of the world’s top platforms dedicated to business software and services. Additionally, dbForge Studio for MySQL has been [shortlisted for the first position](https://www.techradar.com/best/best-database-design-software) among the Best database design software of 2023 on TechRadar, an influential blog dedicated to technology. Still, there’s nothing quite like trying it all yourself, so we gladly invite you to [download dbForge Studio for MySQL for a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) and get some firsthand experience with its vast functional capabilities and high performance. dbForge Studio for SQL Server Now let’s move on to [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), another top-notch IDE for the development, management, and administration of SQL Server databases, which proudly carries on with the rank of G2 High Performer. This badge is earned by software products with the highest customer satisfaction scores in their respective categories. The actual [feature set](https://www.devart.com/dbforge/sql/studio/features.html) of this Studio is exceedingly rich, and it’s just as easy to try your hand at it. All you need to do is [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and give it a go today! 
dbForge SQL Complete Next comes [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , a powerful add-in for SSMS that doubles the user’s SQL coding speed and accuracy with context-aware code completion, as well as flexible formatting, refactoring, and debugging capabilities. And the first thing that we should mention here is that SQL Complete made it to [Capterra’s 2023 Database Software Shortlist](https://www.capterra.com/database-management-software/shortlist/) . Secondly, we were pleased by the inclusion of SQL Complete in FrontRunners 2023 on Software Advice and by the newly acquired rank of Leader in Database on GetApp . Finally, getting back to G2 badges , we have three more of those earned by SQL Complete. All in all, if you are an avid user of SSMS, feel free to [download SQL Complete for a free 14-day trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) . We believe you won’t imagine your daily work with SQL code without it afterwards. dbForge Studio for Oracle When it comes to our third IDE, [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) , we can mention two more G2 badges that reflect the consistently high user satisfaction with our products: High Performer (Summer 2023) and Easiest To Do Business With (Summer 2023) . That said, if you frequently deal with the development and management of Oracle databases, you might as well find your perfect IDE in dbForge Studio. Simply [download it for a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html) and give it a go today! dbForge Edge Finally, a few words must be said about [dbForge Edge](https://www.devart.com/dbforge/edge/) , our hefty bundle of four Studios (which comprises the three abovementioned Studios alongside [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) ) that provides support for a multitude of database systems and cloud services. 
As a single solution, dbForge Edge earned its place among SoftwareWorld’s Top Rated Backup Software of 2023 . This also applies individually to dbForge Studios for SQL Server and MySQL. It is also worth mentioning that each of the four individual Studios keeps up the good work with the Trusted Vendor , Quality Choice , and Happiest Users awards on Crozdesk , yet another big review platform for business software. Last but definitely not least, we would like to mention that all of our Studios [became the winners of DBTA Readers’ Choice Awards 2023](https://blog.devart.com/devart-products-become-the-winners-of-dbta-2023-readers-choice-awards.html) in the Best Database Performance Solution category. That’s it for today! If you haven’t joined the ranks of dbForge users yet, we’d love you to give it a shot, because we’re sure you will find the optimal tools for any database-related tasks you might run across. Tags [Awards](https://blog.devart.com/tag/awards) [Crozdesk awards](https://blog.devart.com/tag/crozdesk-awards) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [g2 awards](https://blog.devart.com/tag/g2-awards) [Quality Choice Award](https://blog.devart.com/tag/quality-choice-award) [SoftwareSuggest awards](https://blog.devart.com/tag/softwaresuggest-awards) [sql complete](https://blog.devart.com/tag/sql-complete) [Trusted Vendor Award](https://blog.devart.com/tag/trusted-vendor-award) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/champs-recap-devarts-award-winning-products-in-q4-2023.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Champs Recap: Devart’s Award-Winning Products in Q4 2023 By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) December 8, 2023 The end of the year isn’t all that far away, and it’s time to tell you about the latest awards garnered by dbForge database tools. This time we’ll be mostly talking about the badges that we’ve recently scored on G2, one of the world’s top platforms where you can both find top-rated business software to match your specific needs—and review it, thus contributing to its independent recognition and rankings on the platform. That said, let’s get started! dbForge product line However, it’s not the badges that we’ll start with. [Devart has been named Leading Experts in Database Development Issue Resolutions 2023](https://www.thenewworldreport.com/winners/devart/) by New World Report, an insightful and informative business news platform that provides readers throughout both North and South America with news, advice, and success stories to inspire and support business growth and continuity. Without a doubt, this title is especially relevant to [dbForge](https://www.devart.com/dbforge/), Devart’s flagship product line dedicated to database development, management, and administration. dbForge Studio for SQL Server Now as for individual products, it’s [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) that takes the cake with a whopping 12 G2 badges with such telling names as Easiest Setup, Fastest Implementation, Easiest To Use, High Performer, and our personal favorites – Users Love Us and Easiest To Do Business With. Not using dbForge Studio yet? Then there’s never been a better moment to get acquainted with it than today! 
You only need to [download it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) , give it a go, and explore the sheer diversity of its capabilities! dbForge Studio for MySQL If you are interested in MySQL and MariaDB databases more, take a look at today’s runner-up, [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) , which scored 9 new G2 badges, including but not limited to Leader , High Performer , Easiest Setup , and Easiest Admin . By the way, the latest update of dbForge Studio for MySQL became one of our biggest updates ever and introduced a new killer feature— [Source Control](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html) —which enables effective collaboration on databases via seamless integration with top version control systems. And if you would like to try it all yourself, we gladly invite you to [download dbForge Studio for MySQL for a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) and level up your daily development and management of databases. dbForge Studio for Oracle Next comes [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) , this fall’s High Performer and Easiest To Do Business With . If Oracle Database is your main focus, and if you are in search of an advanced toolset to match your needs and requirements, you might as well check out dbForge Studio. [Get started right now with a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html) ! dbForge SQL Complete Now a few words about [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , arguably our most powerful add-in for SSMS that hugely expands your SQL coding capabilities with context-aware code completion and relevant object suggestions, code formatting with predefined and custom profiles, a library of SQL snippets, as well as smart refactoring and T-SQL debugging. 
It’s got 4 new badges in total: two High Performers , Momentum Leader , and, of course, Users Love Us . That said, if you are a dedicated user of SSMS, and you’re not planning to switch to another IDE anytime soon, feel free to [download SQL Complete for a free 14-day trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) . We believe you won’t imagine your daily SQL development without it afterwards. It simply won’t be the same. dbForge Edge Now if we take four dbForge Studios—the three abovementioned Studios plus [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) —and pack them into one cost-effective bundle, we’ll get [dbForge Edge](https://www.devart.com/dbforge/edge/) , which is best considered in case you need support for a multitude of database systems and cloud services. And since it’s all about the strengths of each Studio, we believe that all of these individual awards belong to the entire bundle just as well. That’s it for today! If you haven’t joined the ranks of dbForge users yet, we’d love you to give it a shot, because we’re sure you will find the optimal tools for any database-related tasks you might run across. Tags [Awards](https://blog.devart.com/tag/awards) [dbforge studio](https://blog.devart.com/tag/dbforge-studio) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [g2 awards](https://blog.devart.com/tag/g2-awards) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/change-tabs-colors-in-ssms.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Streamline Your SSMS Workflow: Tab Color Customization With Native Features and SQL Complete By [Nataly Smith](https://blog.devart.com/author/nataly-smith) January 5, 2025 
In the world of efficient database management and enhanced productivity, the ability to customize tab colors is a game-changer. Whether through native SSMS features or with the power of [SQL Complete for SQL Server](https://www.devart.com/dbforge/sql/sqlcomplete/), you have the opportunity to take your tab organization to new heights. This article delves into the benefits of tab color customization, exploring how the combination of native SSMS and SQL Complete features empowers you with seamless navigation and a visually appealing workspace.

Contents
- Changing Tab Colors in SSMS
- Tab Color Customization with SQL Complete
- How SQL Complete simplifies the process of changing tab colors in SSMS
- Comparing SSMS and SQL Complete for Tab Color Customization
- Features and benefits of using SQL Complete
- Conclusion

By assigning different colors to tabs, you can visually categorize your work based on different projects, database types, or priorities (Development, Testing, and Production, for example). This makes it easier to identify and switch between tabs quickly, especially when working with multiple open scripts or database connections simultaneously. To begin with, let us take a closer look at how SSMS handles tab coloring on its own and what SQL Complete brings to the table.

Changing Tab Colors in SSMS
In the default interface of SQL Server Management Studio, you can set a tab color while creating a new connection.
1. Open SSMS.
2. Click Connect Object Explorer.
3. In the window that opens, click Options to expand more settings.
4. Go to the Connection Properties tab.
5. In this example, we are going to set a color for the demo-mssql\\SQLEXPRESS server, BicycleStore database. Type the name in the Connect to database field, or select it from the drop-down menu by clicking the Browse server option.
6. You will see a pop-up window indicating that this action requires a connection to the server. Click Yes to continue.
7. In the Browse Server for Database window, locate the User Databases folder within the list of available SQL Server instances and their databases, open it, select the BicycleStore database, and click OK.
8. Select the Use custom color checkbox.
9. Choose a color from the grid of basic colors or create a custom one. Then, click OK.
10. Once done, click the Connect button.

Now, let us check the fruits of our labor by opening a new SQL window. To do so, navigate to Object Explorer and expand the demo-mssql\\SQLEXPRESS server and the Databases folder. Then, right-click the BicycleStore database and select New Query. The status bar within the editor window for any new query directed to this database and server will automatically take on the specified color.

Tab Color Customization with SQL Complete
Now that we have covered the basics of tab coloring in SSMS, let us explore the broader functionality provided by SQL Complete.
1. Just like earlier, open SSMS.
2. Navigate to the SQL Complete menu and select Options.
3. In the window that opens, switch to Tabs Color.
4. Under Environment Categories, you can:
   - Assign an environment category to an entire server or just to a single database.
   - Set a new environment category by clicking the New color match icon.
   - Remove an environment category by clicking the Delete color match icon.
5. Under Tabs Color, you can:
   - Create a new environment or modify an existing one.
   - Change or set up a custom color for an environment.
   - Add or remove an environment.

Servers coloring
SQL Complete empowers you to easily assign colors to servers, allowing for effective separation of different environments, such as testing and production servers. 
By associating specific colors with each server, you can visually distinguish between environments and ensure clarity in your development process. Whether it is Development, Production, Sandbox, Testing, or even None, you have the flexibility to allocate a color that corresponds to the desired environment. This feature enables seamless organization and helps you maintain a structured approach while working with multiple server environments.

Databases coloring

Separate database coloring highlights specific databases, providing a clear visual distinction and making it easier to navigate database-specific objects. There are two ways to set the tab color:

- In Object Explorer, right-click the database you want to assign a color to and select SQL Complete > Tabs Color > the environment with the corresponding color. You can choose between Development, Production, Sandbox, Testing, or None.
- Right-click the SQL document tab and select Tabs Color > the environment with the corresponding color.

All tabs, the SQL document status bar, and the database in Object Explorer will take on the corresponding color.

How SQL Complete simplifies the process of changing tab colors in SSMS

SSMS provides basic tab coloring settings, allowing some customization but with rather limited options. SQL Complete, on the other hand, offers a significantly more versatile and flexible range of possibilities. It provides an intuitive interface that simplifies configuring and managing tab colors based on criteria such as server, database, or environment.

Comparing SSMS and SQL Complete for Tab Color Customization

Having looked closely at the practical side of tab coloring in both SSMS and SQL Complete, we can now compare the features:

| Feature | SQL Server Management Studio | SQL Complete |
| --- | --- | --- |
| Intuitive interface | The interface is quite user-friendly; however, it may lack more advanced features and customization options. | The add-in offers an intuitive interface that simplifies SQL development tasks and provides a seamless user experience. |
| Tab coloring | Basic tab coloring settings allow for limited customization options. | A broader range of options for customizing tab colors provides greater flexibility. |
| Status bar coloring | There are no built-in status bar coloring options. | Status bar coloring features help highlight different status or notification indicators. |
| Servers and databases coloring | There are no specific features for coloring servers or databases. | SQL Complete allows assigning colors to servers and databases for better visual differentiation. |
| Settings export | The tool allows for exporting only overall settings that include tab colors. | The tool allows for exporting the list of saved document categories to share settings with your team. |

Refer to the [Tab coloring](https://docs.devart.com/sqlcomplete/tab-and-document-management/tab-coloring.html) topic of our documentation for clear guidance on handling its specific aspects.

Features and benefits of using SQL Complete

SQL Server Management Studio is one of the most popular tools among database administrators, developers, and data professionals for managing Microsoft SQL Server databases. Many of you have been using this tool since its inception and know its advantages and limitations firsthand. Fortunately, with SQL Complete, you no longer have to endure those drawbacks while still enjoying all the benefits the solution has to offer.

[Intelligent Code Completion](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html)

SQL Complete provides intelligent code completion capabilities, suggesting SQL keywords, object names, and even column names as you type.
It saves time and reduces syntax errors by offering relevant suggestions based on the context.

[SQL Snippets](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html#sql_snippets)

The tool includes a collection of code snippets for common SQL operations, such as creating tables, stored procedures, and queries. These snippets can be quickly inserted into your code, boosting productivity and reducing repetitive typing.

[Code Formatting](https://www.devart.com/dbforge/sql/sqlcomplete/sql-code-formatter.html)

The solution offers powerful SQL formatting options that automatically format your SQL code according to default or customized formatting rules. It ensures consistency and readability, making your code more manageable and professional.

[Code Refactoring](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html)

SQL Complete enables you to easily refactor your SQL code. It lets you rename objects, extract SQL code into a separate stored procedure, and perform other refactoring tasks, helping you maintain a clean and organized codebase.

[Productivity Extension](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html)

SQL Complete's productivity features include code snippets, intelligent code formatting, quick object search, SQL code highlighting, code navigation, code refactoring, and advanced code suggestion, all of which facilitate faster and more efficient SQL development.

[Code Highlighting and Analysis](https://www.devart.com/dbforge/sql/sqlcomplete/code-completion.html#highlight_identifier_occurrences)

The add-in provides advanced code highlighting and analysis capabilities. It identifies syntax errors, unresolved references, and potential performance issues in your SQL code, helping you catch and fix problems early in the development process.

These are just a few of the features and benefits of using SQL Complete as an add-in for SSMS.
It enhances the development experience, improves productivity, and helps maintain high-quality SQL code. With that in mind, let us sum up what SQL Complete brings to the tab coloring process.

Conclusion

To draw a final point: while SQL Server Management Studio (SSMS) is undeniably a valuable solution for SQL development, [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) takes the tab coloring process to a new level of convenience and precision. With SQL Complete, you can effortlessly configure tab colors and customize the status bar, server colors, database colors, and much more. These features empower users to optimize their coding environment and maximize productivity. If you are ready to give it a try, a free, [fully functional 14-day trial version](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) is waiting for you to download.

FAQ

How to change tab colors in SSMS?

In SSMS, you can set tab colors as you create a new connection. First, click the Options button and go to Connection Properties, then check Use custom color and select a color. Once done, the status bar in the query editor will show the color you chose for queries connected to the specific database.

How do you color tabs in SQL Complete?

In SQL Complete, you color tabs by assigning environment categories (like Development, Testing, or Production) to servers or databases. Simply go to SQL Complete > Options > Tabs Color, where you can configure or modify color assignments for better visual organization.

How do you color tabs in dbForge Studio?

In dbForge Studio, you can color tabs by assigning environment categories to servers or databases via Tools > Options > Environment > Tabs Color. This helps visually distinguish connections such as Development, Testing, or Production, making it easier to navigate multiple environments.

How to change color in SQL Management Studio?
In SQL Management Studio, tab colors are set during connection setup. Navigate to Object Explorer > Connect (on the Object Explorer toolbar) > Database Engine > Options > Connection Properties; then enable Use custom color and select a color that will visually differentiate the connections of your database.

By [Nataly Smith](https://blog.devart.com/author/nataly-smith), dbForge Team

Key Differences Between VARCHAR And CHAR in MySQL

By [Dereck Mushingairi](https://blog.devart.com/author/dereckm), March 27, 2025

Most database performance issues don't start with complex queries or heavy traffic; they begin at the foundation: data type selection. One seemingly small decision, CHAR vs. VARCHAR, can dictate whether your database remains fast and efficient or becomes bloated and sluggish as it scales. It's a mistake even experienced developers make, and it's easy to see why: at a glance, CHAR and VARCHAR look nearly identical, but their underlying mechanics differ significantly. To build a lean, high-performing MySQL database, you need to understand their key differences and how they impact performance, and you must prioritize schema design early on to ensure long-term efficiency. Schema analysis and query optimization tools, like [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), help maintain optimal performance and eliminate inefficiencies before slowdowns creep in. This guide focuses on the difference between CHAR and VARCHAR in MySQL. We'll highlight their ideal use cases and show how to choose the right type for lasting efficiency. Read on!

Table of contents

- Overview of CHAR and VARCHAR
- What is CHAR in MySQL?
- What is VARCHAR in MySQL?
- Key distinctions between CHAR and VARCHAR
- When to use CHAR and VARCHAR
- Advantages and disadvantages of CHAR vs VARCHAR
- How dbForge Studio for MySQL simplifies CHAR and VARCHAR management
- Conclusion
- FAQ

Overview of CHAR and VARCHAR

While both CHAR and VARCHAR store character strings, they use different storage mechanisms that affect how MySQL stores, retrieves, and manages text data over time. CHAR ensures speed and consistency by allocating the same space for every value, making indexing and retrieval predictable. VARCHAR, on the other hand, prioritizes storage efficiency, adjusting to the text length while introducing overhead that may impact performance over time.

Pro tip: These are not the only MySQL data types. Beyond CHAR and VARCHAR, MySQL supports TEXT, BLOB, and ENUM types. Understanding these types also helps optimize database performance.

Now, let's look closer at CHAR vs VARCHAR in MySQL.

What is CHAR in MySQL?

CHAR is a fixed-length data type, meaning MySQL allocates the same space for every value, regardless of length. If a value is shorter than the defined length, MySQL pads it with spaces to maintain uniformity. This structure makes CHAR highly efficient for indexing and retrieval, since MySQL knows precisely where each row begins and ends. However, this efficiency comes at a cost: for variable-length data, the padding wastes storage.

How CHAR optimizes read-heavy workloads

CHAR excels in high-read workloads where speed and consistency matter most. Since every row is identical in size, MySQL doesn't need to calculate row lengths on the fly, making range scans, sorting, and indexed lookups noticeably faster. This makes CHAR ideal for lookup tables, fixed-length identifiers (such as country codes or hashes), and frequently queried small datasets.

Examples: how CHAR affects query results

Let's explore three key scenarios where CHAR can impact query behavior.

1.
Direct equality checks work as expected

Imagine a table storing transaction codes, each expected to be exactly 10 characters long.

```sql
CREATE TABLE transactions (
    txn_code CHAR(10)
);

INSERT INTO transactions (txn_code) VALUES ('TXN123');
```

Since TXN123 is only six characters long, MySQL pads it with four spaces, storing it internally as 'TXN123    ' to match the defined length of 10 characters.

```sql
SELECT * FROM transactions WHERE txn_code = 'TXN123';
```

The query returns a match even though TXN123 is stored with extra spaces. When using the equality (=) operator, MySQL ignores trailing spaces during comparison (under the default PAD SPACE collations), so direct lookups work as expected despite the padding.

2. Pattern matching with LIKE may not work as expected

MySQL pads CHAR(n) values with trailing spaces to the fixed length, but it strips that padding when the value is retrieved, so a value stored as 'TXN123    ' is normally compared as 'TXN123'.

```sql
SELECT * FROM transactions WHERE txn_code LIKE 'TXN123%';
```

Expected result: the query returns a match, because the padding is not part of the retrieved CHAR value. Keep in mind, however, that unlike the = operator, LIKE compares strings character by character without pad-space semantics, so results can vary with the collation and SQL mode in use.

Fix: use CONCAT()

A safer approach is to build the pattern with CONCAT():

```sql
SELECT * FROM transactions WHERE txn_code LIKE CONCAT('TXN123', '%');
```

This approach ensures the pattern works regardless of padding.

3. Concatenation may include trailing spaces

Another quirk of CHAR is that trailing spaces can be preserved in string operations (for example, when the PAD_CHAR_TO_FULL_LENGTH SQL mode is enabled). This can cause unintended formatting issues.

```sql
SELECT CONCAT(txn_code, '-CHECK') FROM transactions;
```

Output:

```
TXN123    -CHECK
```

Problem: the padding from CHAR(10) remains, pushing the '-CHECK' suffix farther right than expected. Unlike VARCHAR, which never stores padding in the first place, CHAR is allocated its full length, and in this mode string functions see the padded value.

Solution: trim spaces before concatenation

To avoid formatting issues, explicitly trim the value.
```sql
SELECT CONCAT(TRIM(txn_code), '-CHECK') FROM transactions;
```

Corrected output:

```
TXN123-CHECK
```

These traits make CHAR a solid choice for fixed-length fields but less ideal for text processing. VARCHAR is usually preferred for text operations, string manipulation, and external data formatting. Let's take a closer look.

What is VARCHAR in MySQL?

VARCHAR is a flexible, variable-length data type designed for storing text of varying lengths. Unlike CHAR, which reserves a fixed space, VARCHAR dynamically adjusts its storage, using only the required space plus 1 or 2 bytes of metadata to track the length. This optimizes storage efficiency in tables where values vary widely in size.

How MySQL stores VARCHAR

VARCHAR's flexibility comes from its dynamic storage mechanism:

- Values up to 255 bytes: MySQL stores 1 extra byte of length metadata.
- Values over 255 bytes: MySQL stores 2 extra bytes of length metadata.

VARCHAR dynamically adjusts storage, but in InnoDB, large values may be stored off-page, adding retrieval overhead.

Examples: how VARCHAR behaves in updates

Let's explore how MySQL handles updates to VARCHAR fields and how to mitigate potential performance issues.

1. Initial storage

Let's create a table with a VARCHAR column and insert a short text value:

```sql
CREATE TABLE example_varchar (
    description VARCHAR(50)
);

INSERT INTO example_varchar (description) VALUES ('Sample Text');
```

Here's how MySQL stores this value: 'Sample Text' is 11 characters, so it occupies 12 bytes (11 data bytes plus 1 length byte). The value fits neatly into the row, and MySQL stores it efficiently within the table's data structure. Retrieving the value works as expected.

```sql
SELECT description FROM example_varchar;
```

Output:

```
Sample Text
```

At this stage, MySQL handles the storage efficiently, and queries remain fast and predictable.

2. Updating to a longer value

Now, let's update the row to store a longer string.
```sql
UPDATE example_varchar
SET description = 'This is a much longer text than before'
WHERE description = 'Sample Text';
```

Since the new value exceeds the original storage allocation, MySQL cannot store it in the same place. Instead, it must:

- Move the row to a new location that can accommodate the larger value.
- Leave behind a pointer in the original location to reference the new storage position.
- Perform extra read operations, since it must follow this pointer to retrieve the data.

This process is known as row relocation and contributes to table fragmentation, which slows down queries over time.

3. Checking for fragmentation issues

To check for fragmentation caused by frequent updates, run:

```sql
SHOW TABLE STATUS LIKE 'example_varchar';
```

This displays the Data_free column, indicating how much unused space (fragmentation) exists in the table. If this number grows significantly, your queries will slow down because MySQL must scan across multiple locations to retrieve rows.

4. Optimizing a fragmented table

If you notice significant fragmentation, you can reorganize the table with:

```sql
OPTIMIZE TABLE example_varchar;
```

This forces MySQL to repack and defragment the table, improving performance by eliminating row relocation overhead. Still, VARCHAR is not always better than CHAR: in update-heavy tables, VARCHAR values that grow in size may cause row relocation, leading to fragmentation and increased disk I/O. In InnoDB, large VARCHAR values may be stored off-page, which can slow down retrieval in read-heavy workloads.

Key distinctions between CHAR and VARCHAR

The table below highlights the difference between VARCHAR and CHAR in MySQL and when to use each.

| Aspect | CHAR | VARCHAR |
| --- | --- | --- |
| Storage | Fixed-length: always reserves the defined space, padding shorter values with spaces. This makes row sizes consistent but wastes storage for shorter values. | Variable-length: stores only the actual characters plus 1–2 bytes of metadata. Saves space but can cause fragmentation when rows expand. |
| Performance | Faster for indexing, sorting, and lookups because MySQL doesn't need to calculate row lengths dynamically. | Slightly slower because MySQL must process variable lengths before retrieving values. Frequent updates can degrade performance over time due to row relocation. |
| Indexing behavior | Works best in B-tree indexes, where uniform row sizes improve efficiency. Range queries and sorting are faster. | Indexing is slightly less efficient due to variable row sizes. MySQL needs extra calculations before processing indexed queries. |
| Fragmentation risk | None: fixed storage means rows are always aligned, preventing fragmentation. | High: in MyISAM, expanding VARCHAR values can fragment storage, while in InnoDB, large VARCHAR values may be stored off-page, affecting retrieval speed. |
| Best use cases | Fixed-length data like cryptographic hashes (SHA-256), country codes (ISO 3166), and logging tables, where uniform storage speeds up retrieval. | Variable-length text like names, descriptions, email addresses, and dynamic content, where storage efficiency is more important than retrieval speed. |

Now, let's explore the difference between CHAR and VARCHAR in MySQL with examples.

Examples of CHAR vs. VARCHAR in performance

Here are the instances where each one works best.

Scenario 1: Storing fixed-length data (CHAR)

Let's say we store cryptographic hash values, which are always 32 characters long.

```sql
CREATE TABLE user_hashes (
    hash CHAR(32) PRIMARY KEY
);
```

CHAR is the best choice because its fixed size makes indexing and lookups faster; MySQL doesn't have to guess where each row starts. VARCHAR, on the other hand, would just add extra overhead without saving space, so there is no real benefit in using it here.

Scenario 2: Storing usernames (VARCHAR)

Now, let's store usernames, which vary in length.
```sql
CREATE TABLE users (
    username VARCHAR(50) UNIQUE
);
```

Here, VARCHAR is the best choice: since usernames range from short ('Joe') to long ('JonathanDoe123'), VARCHAR prevents unnecessary storage waste. However, if usernames are frequently updated and get longer, MySQL may have to move rows, causing fragmentation.

Need to convert an existing CHAR column to VARCHAR or adjust its size? Follow this step-by-step guide on how to [rename a column type in MySQL](https://blog.devart.com/rename-a-column-type-in-mysql.html).

Scenario 3: Sorting speed – CHAR vs. VARCHAR

Test 1: Sorting on a CHAR column

```sql
CREATE TABLE test_char (
    product_code CHAR(10) PRIMARY KEY
);

INSERT INTO test_char VALUES
('A123456789'), ('B234567890'), ('C345678901');

SELECT * FROM test_char ORDER BY product_code;
```

Sorting is fast because CHAR ensures uniform row sizes.

Test 2: Sorting on a VARCHAR column

```sql
CREATE TABLE test_varchar (
    product_code VARCHAR(10) PRIMARY KEY
);

INSERT INTO test_varchar VALUES
('A123'), ('B234567890'), ('C345');

SELECT * FROM test_varchar ORDER BY product_code;
```

Sorting takes longer because MySQL must calculate each row's actual length before ordering the results.

When to use CHAR and VARCHAR

Knowing when to use CHAR or VARCHAR comes down to data consistency, speed, and storage efficiency.

Use CHAR when:

- Data is always the same length – if every value fits a fixed size, like country codes ('US', 'CA') or cryptographic hashes (SHA-256), CHAR keeps things efficient and structured.
- Speed matters more than storage – CHAR's fixed size means MySQL doesn't waste time checking lengths, making lookups and sorting lightning-fast.
- Predictable performance is needed – CHAR keeps row sizes consistent, making it ideal for high-read tables, reference data, and logs. No fragmentation, no surprises.
Use VARCHAR when:

- Data varies in length – VARCHAR efficiently stores names, email addresses, descriptions, and user-generated text without unnecessary padding.
- Storage needs optimization – VARCHAR stores only what is needed, avoiding the padding that CHAR forces on shorter values.
- But watch out for frequent updates – if a VARCHAR value outgrows its allocated space, MySQL relocates the row, causing fragmentation and slowing down queries over time.

Pro tip: While CHAR and VARCHAR serve different purposes, SQL Server also offers VARCHAR(MAX), which is helpful for storing large text values. For a deeper look at how these data types compare, check out this guide on [CHAR vs VARCHAR vs VARCHAR(MAX)](https://blog.devart.com/char-vs-varchar-vs-varchar-max.html).

Advantages and disadvantages of CHAR vs VARCHAR

Both CHAR and VARCHAR have strengths and trade-offs. Choosing the right one depends on whether speed, storage, or flexibility matters most. Here's a quick breakdown.

CHAR:

- Pro: Faster lookups – fixed-size storage lets MySQL pinpoint rows instantly.
- Pro: Best for uniform data – ideal for IDs, hash values, and reference tables.
- Con: Massive storage waste – storing a 10-character string in CHAR(255) wastes 96% of the space.

VARCHAR:

- Pro: Space-efficient – stores only the actual text, cutting storage by up to 60%.
- Pro: Ideal for dynamic data – best for user input, descriptions, and variable-length text.
- Con: Performance trade-offs – slower indexing due to metadata lookups and row relocation.

How dbForge Studio for MySQL simplifies CHAR and VARCHAR management

Choosing between CHAR and VARCHAR is just one piece of keeping a MySQL database efficient. Over time, performance can suffer from fragmentation, slow queries, and inefficient indexing. dbForge Studio for MySQL helps developers avoid these issues by streamlining schema design, query execution, and data management, ensuring CHAR and VARCHAR fields are used effectively without slowdowns.

Writing queries that run faster

Clean, efficient queries are key to database performance.
The [SQL Editor](https://www.devart.com/dbforge/mysql/studio/mysql-code-editor.html) in dbForge Studio highlights syntax, autocompletes commands, and suggests fixes in real time, reducing common errors like mismatched column names or missing commas. [Execution plans](https://www.devart.com/dbforge/mysql/studio/explain-plan.html) provide insight into how MySQL processes CHAR and VARCHAR queries, making it easier to optimize sorting, filtering, and indexing.

Enhancing query efficiency with advanced code completion

dbForge Studio's [SQL Coding Assistance](https://www.devart.com/dbforge/mysql/studio/sql-coding.html) goes beyond column name suggestions by also indicating data types directly in the autocomplete list. This speeds up query writing by eliminating the need to repeatedly check the schema for data type verification, and it improves coding accuracy and efficiency along the way.

Enhancing text data handling with intuitive viewing and easy export options

dbForge Studio's [Data Viewer](https://docs.devart.com/studio-for-mysql/working-with-data-in-data-editor/viewing-data-in-grid.html) improves the way you handle text-heavy columns in large databases by displaying data in a reader-friendly format. It also lets you save this information to a file or copy it directly to the clipboard, simplifying everyday data management tasks for database professionals.

Keeping MySQL databases efficient

Schema design and data type selection shape [MySQL performance](https://www.devart.com/dbforge/mysql/studio/mysql-performance-tips.html), but the right tools make optimization easier. dbForge Studio helps developers write better queries, manage data efficiently, and avoid performance bottlenecks.
Whether working with CHAR for structured data or VARCHAR for flexible storage, it simplifies MySQL management for long-term efficiency.

The takeaway

Building a high-performing database starts with the right choices at every level. Selecting between CHAR and VARCHAR in MySQL is essential, but real optimization comes from well-structured schemas, efficient queries, and a scalable design. Every decision affects how smoothly your system handles growth and workload demands. For deeper insights into MySQL performance tuning, check out [this MySQL tutorial](https://blog.devart.com/mysql-tutorial.html).

FAQ

What is the difference between CHAR and VARCHAR data types in MySQL?

CHAR is a fixed-length data type, meaning it always reserves the same amount of space, regardless of the string length. This ensures predictable performance but can waste storage. VARCHAR, on the other hand, is variable-length, storing only the actual characters plus a small metadata overhead. CHAR is optimized for speed and consistency, while VARCHAR prioritizes storage efficiency and flexibility.

Is it better to use CHAR or VARCHAR in MySQL?

The choice depends on performance needs and storage considerations:

- Use CHAR when storing fixed-size values like status codes, hash values, or short identifiers, where fast lookups and indexing matter more than storage savings.
- Use VARCHAR for names, addresses, descriptions, or any text with unpredictable length, as it minimizes wasted space and scales better in dynamic applications.

Why is VARCHAR preferred over CHAR?

VARCHAR is often the default choice because it stores variable-length data efficiently, without the space padding CHAR adds. In large-scale applications, this reduces storage consumption and improves overall database efficiency. However, CHAR still holds an advantage in indexing speed and query optimization for high-read workloads that require consistent row sizes.
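The storage rules this article describes for MySQL can be summarized as a quick back-of-envelope calculator. The following Python sketch is illustrative only: the helper names are mine (not a MySQL API), it assumes a single-byte character set such as latin1, and it ignores per-row engine overhead, so treat the numbers as approximations of the rules above rather than exact on-disk sizes.

```python
# Illustrative storage estimate for the CHAR vs VARCHAR rules above.
# Assumes a single-byte character set; real InnoDB rows add further
# per-row overhead, so these figures are approximate by design.

def char_bytes(value: str, n: int) -> int:
    """CHAR(n): always n bytes; shorter values are space-padded."""
    if len(value) > n:
        raise ValueError("value longer than column")
    return n

def varchar_bytes(value: str, n: int) -> int:
    """VARCHAR(n): actual length plus a 1- or 2-byte length prefix."""
    if len(value) > n:
        raise ValueError("value longer than column")
    prefix = 1 if n <= 255 else 2  # 1 byte up to 255 bytes, else 2
    return len(value) + prefix

for v in ("TXN123", "Sample Text"):
    print(v, char_bytes(v, 50), varchar_bytes(v, 50))
# TXN123 50 7
# Sample Text 50 12
```

Running this for the article's examples shows the trade-off at a glance: CHAR(50) always charges 50 bytes, while VARCHAR(50) charges only the text length plus its small prefix.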
When to Use CHAR, VARCHAR, and VARCHAR(MAX) in SQL

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), November 27, 2024

A data type is a particular kind of data item, defined by the values it can take and the operations that can be performed on it. SQL supports various data types, including numeric, date and time, character and string, binary, and more. The choice of data type affects data integrity, storage, and performance, and the optimal choice is not always obvious. This article focuses on three commonly used data types, CHAR, VARCHAR, and VARCHAR(MAX), comparing and discussing their characteristics and uses.

- Character data types in SQL
- CHAR – fixed-length data type
- VARCHAR – variable-length data type
- VARCHAR(MAX) – maximum text storage
- TEXT – deprecated large text data type
- Differences between CHAR, VARCHAR, and VARCHAR(MAX)
- Common errors in working with character data types
- How to choose the right data type
- Conclusion

Character data types in SQL

Character data types store alphanumeric data: letters, numbers, symbols, and whitespace. These types can be either fixed-size (CHAR) or variable-size (VARCHAR). Since SQL Server 2019, character data types support the full range of Unicode characters using the UTF-8 encoding. However, if a non-UTF-8 collation is applied, CHAR and VARCHAR will only store the subset of characters supported by the code page of that collation. The choice of the optimal character data type matters for several reasons:

- Query performance: The choice of data type can impact performance.
For instance, CHAR can be faster because it avoids length calculations, whereas VARCHAR is preferable for columns with varying data lengths. Data integrity : Character data types ensure that text data meets specific rules, like adhering to a maximum length. However, this can sometimes cause compatibility issues with other databases or systems. Storage efficiency : The data type directly affects storage requirements. For example, CHAR(50) always allocates 50 bytes, regardless of the actual data length, while VARCHAR(50) uses only the space needed for the actual data plus 2 bytes. Let’s dive deeper into the specifics of character data types. CHAR – fixed-length data type The CHAR(n) data type is designed for fixed-length, non-Unicode character data, where n specifies the string size in bytes (ranging from 1 to 8,000). It stores any character—letters, numbers, symbols, and even the null character. If the stored data is shorter than the defined length, the database pads it with spaces to meet the fixed length. CHAR data types work well for data that maintains a consistent length, such as phone numbers or postal codes. 
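The padding behavior described above is easy to observe with the DATALENGTH() function, which returns the number of bytes a value occupies. A minimal sketch (variable names are illustrative):

```sql
-- Minimal sketch: CHAR pads to the declared length, VARCHAR does not
DECLARE @fixed    CHAR(10)    = 'abc';
DECLARE @variable VARCHAR(10) = 'abc';

SELECT DATALENGTH(@fixed)    AS char_bytes,    -- 10 (3 data bytes + 7 trailing spaces)
       DATALENGTH(@variable) AS varchar_bytes; -- 3  (only the data itself)
```

Note that LEN() would report 3 for both variables, because it ignores trailing spaces; DATALENGTH() is the one that exposes the padding.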
Advantages of CHAR Uniform data length : CHAR is good for storing standardized identifiers of a consistent size Predictable storage : Fixed length makes it easier to determine the required storage and optimize performance Faster access : There is no need for variable-length calculations, which offers slight speed gains Simpler indexing : Fixed-length data allows for uniform index entries, which can improve indexing speed Reduced row fragmentation : In high-update environments, CHAR columns are less prone to fragmentation Data integrity : CHAR can store spaces instead of NULLs, enabling non-null constraints without complex handling for empty values Disadvantages of CHAR Padding with spaces : Data shorter than the defined length is automatically padded with spaces, potentially complicating retrieval Higher disk usage : CHAR columns can consume more storage, especially when many entries are shorter than the defined length Reduced flexibility : Operations like pattern matching may require extra trimming due to trailing spaces Fixed design : Changing the length of a CHAR column requires a schema alteration, which can be cumbersome CHAR is commonly used in small, frequently queried lookup tables, such as those for status tracking or code validation. Fixed length improves performance, making CHAR ideal when data integrity and predictable schemas are more critical than storage efficiency. However, CHAR is less suitable for data of varying lengths, where it is recommended to use VARCHAR as a more flexible and storage-efficient option. VARCHAR – variable-length data type VARCHAR is a variable-length string data type with a maximum length of 8,000 characters. It can store any characters, including numbers, letters, special characters, non-printing characters, and the ASCII null character. Each character in a VARCHAR string uses 1 byte. Unlike fixed-length columns, VARCHAR only occupies the space needed for the actual data stored, without padding. 
Although it may perform slightly slower than CHAR due to length calculations, VARCHAR provides significant storage savings. Advantages of VARCHAR Efficient storage : VARCHAR uses space only for the data stored, unlike CHAR, which always allocates a fixed size, even when data is shorter Flexibility : VARCHAR is great for variable-length data like names, addresses, and descriptions Improved performance for smaller data : With no need to process empty spaces, calculations avoid unnecessary I/O and memory use Adjustable length : You can set a maximum length (like VARCHAR(255)), allowing control over storage based on expected data size Disadvantages of VARCHAR Potential performance impact : Handling variable lengths can increase memory management complexity and slow down data retrieval from large tables Indexing challenges : Indexes on VARCHAR columns may be slower and less efficient than those on fixed-length columns, especially in large datasets Sorting and comparison overhead : Sorting and comparing VARCHAR data can require extra processing due to variable lengths Inefficiency for consistently short data : If data in a VARCHAR field is usually shorter than the maximum length, CHAR may be more efficient Risk of data truncation : If the data length exceeds the specified maximum, truncation can occur, leading to data loss While VARCHAR offers numerous benefits, it may not be the best choice for large-scale applications with heavy indexing or for data with minimal length variation. In such cases, CHAR or another data type might be more efficient. VARCHAR(MAX) – maximum text storage The VARCHAR(MAX) data type supports variable-length character strings up to 2 GB, making it ideal for storing large texts. 
Key points to note: VARCHAR(MAX) columns do not allow a fixed length limit VARCHAR(MAX) columns cannot be used as key columns in an index Advantages of VARCHAR(MAX) Flexible data size : With a capacity of up to 2 GB, VARCHAR(MAX) is suitable for applications needing to store highly variable text, like comments or notes Optimized space : Small VARCHAR(MAX) values are stored in-row with other data and are moved off-row when data exceeds 8 KB, storing a pointer in place Reduced schema changes : Using VARCHAR(MAX) minimizes the need for schema updates as data grows Storing JSON or XML data : VARCHAR(MAX) is well-suited for storing large, variable-length JSON or XML data Disadvantages of VARCHAR(MAX) Potential performance impact : VARCHAR(MAX) can slow down query performance, especially when large data storage is unnecessary Limited indexing : VARCHAR(MAX) columns cannot be directly indexed, which restricts their effectiveness in search, filter, and sort operations High memory usage : Loading multiple large strings can strain server memory while moving data off-row requires extra storage for pointers Compatibility concerns : Operations like GROUP BY and DISTINCT may not be fully optimized for VARCHAR(MAX) columns with highly variable lengths Locking conflicts : Storing and updating large data in VARCHAR(MAX) columns can cause table or page locks, degrading performance Risk of data overload : Without a length limit, it’s easy to insert more data than necessary, risking data consistency The VARCHAR(MAX) data type helps handle large text data but should be used carefully. It’s most suitable when data lengths vary widely and can exceed 8,000 bytes. For smaller, predictable text sizes, fixed-length CHAR fields are often more efficient. TEXT – deprecated large text data type The TEXT data type is used to store large amounts of text data, including both single-byte and multibyte characters supported by the locale. A table can include up to 195 columns of the TEXT data type. 
In SQL Server, the TEXT data type has been deprecated since SQL Server 2005. Microsoft recommends using VARCHAR(MAX) or NVARCHAR(MAX) instead, as support for the TEXT data type will be removed in a future version of SQL Server. Differences between CHAR, VARCHAR, and VARCHAR(MAX) The table below summarizes the differences between the CHAR, VARCHAR, and VARCHAR(MAX) data types:

| Feature | CHAR | VARCHAR | VARCHAR(MAX) |
|---|---|---|---|
| Data type | Fixed-length string data | Variable-length string data | Variable-length string data with large capacity |
| Storage allocation | Allocates the defined space regardless of data | Allocates space based on content (actual data size + 2 bytes) | Allocates space based on content (actual data size + 2 bytes) |
| Maximum length | Up to 8,000 bytes | Up to 8,000 bytes | Up to 2 GB |
| Padding | Padded with spaces up to the defined length | No padding | No padding |
| Performance | Faster for fixed-length data due to less overhead | Slightly slower due to variable-length handling | Slower due to potentially large size and LOB (Large Object) storage |
| Use case | Short, fixed-length data (codes, flags, etc.) | Variable-length, constrained data (names, addresses, etc.) | Large text data beyond 8,000 characters (comments, posts, etc.) |

Common errors in working with character data types Working with CHAR, VARCHAR, and VARCHAR(MAX) can lead to specific errors that database specialists should keep in mind. Below, we’ll cover the most common issues associated with these data types and how to address them. Truncation errors : Truncation occurs when a user attempts to insert a string longer than the defined length of a CHAR or VARCHAR column, resulting in data being cut off. To prevent this, ensure that the length of the data matches the allowed length for the column. Misuse of VARCHAR(MAX) : VARCHAR(MAX) is often misused, such as for columns containing small, fixed-size data, where CHAR is appropriate. It’s crucial to match the data type with the intended data.
Trailing spaces in CHAR : The CHAR data type pads strings with spaces to reach the specified length, which can lead to unexpected behavior in comparisons and JOINs. To avoid these issues, use VARCHAR for variable-length strings where padding is unnecessary. Data migration and compatibility issues : Migrating data from other systems may introduce inconsistencies in CHAR and VARCHAR values, leading to truncation or padding problems. Standardize data lengths and formats before beginning the migration process to minimize these issues. How to choose the right data type We have discussed various aspects of CHAR vs VARCHAR vs VARCHAR(MAX) usage. Choosing the right type depends on data characteristics, length requirements, storage, and performance considerations. The following table summarizes this information.

| | CHAR | VARCHAR | VARCHAR(MAX) |
|---|---|---|---|
| When to use | When your data always has a consistent length | When data lengths vary but have a defined maximum length under 8,000 characters | When data length is highly variable and can exceed 8,000 characters |
| Typical use cases | ISO country codes, abbreviations, product codes, reference numbers, serial numbers, etc. | Names, addresses, email addresses, descriptions, comments, product names, etc. | Detailed descriptions, documents, logs, articles, lengthy comments, JSON and XML data |
| Specifics | Best for large datasets with data of a small fixed length | Allocates only the space required for the actual data | Suitable if data is expected to grow in length over time |
| Advantages | Predictable storage requirements and faster performance for larger datasets | More efficient storage for variable-length data, reducing wasted space | Allows storing up to 2 GB of data without setting any hard limits |
| Considerations | Padding strings with spaces to match the defined length may waste space | The maximum length is 8,000 characters; if the data exceeds it, use VARCHAR(MAX) | Possible performance issues due to data stored as Large Objects (LOBs) |

Conclusion Considering CHAR vs VARCHAR vs VARCHAR(MAX) when managing database tables is essential. This article covered these data types along with their unique characteristics to help you choose the right option for each particular case. Moreover, all database-related tasks become easier with the right tools, such as [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) . It is a powerful all-in-one solution for SQL Server professionals that offers an intuitive visual Data Editor, which helps manage data of all types in the most convenient way, as well as perform all other database development, management, and administration tasks. The [fully functional trial of the Studio](https://www.devart.com/dbforge/sql/studio/download.html) is available for 30 days, so feel free to download it and put its robust capabilities to work on your daily tasks. Tags [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics.
She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms."} {"url": "https://blog.devart.com/charindex-function-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Use SQL Server CHARINDEX() Function By [Victoria Shyrokova](https://blog.devart.com/author/victorias) February 27, 2025 [0](https://blog.devart.com/charindex-function-in-sql-server.html#respond) 344 Developers often find it slow and difficult to search for
specific parts of a string in SQL. Searching through large text can be time-consuming and make work more complicated. The SQL Server CHARINDEX() function helps by quickly finding the position of a substring. This makes it easier and faster to work with text in SQL. In this guide, we’ll explain how CHARINDEX() works, highlight its uses, and walk you through a few simple examples to help you grasp it easily. If you want to improve your string-handling skills, this guide will help you use CHARINDEX() with confidence. Table of contents What is the SQL Server CHARINDEX function? Parameters of CHARINDEX Examples of using CHARINDEX() in SQL Server Try it yourself with dbForge Studio Case sensitivity in CHARINDEX() Using the optional start position parameter with CHARINDEX Further learning FAQ What is the SQL Server CHARINDEX function? In SQL Server, functions like PATINDEX and CHARINDEX help developers find the position of a substring in a string. Their main difference lies in how they match: PATINDEX allows pattern matching with wildcards, while CHARINDEX looks for an exact match, i.e., the first occurrence of a substring in a string. The CHARINDEX() function returns the position of a substring within a string. If the [SQL Server substring](https://blog.devart.com/substring-function-in-sql-server.html) is present, it returns the starting position as a number. If it’s not found, the function returns 0. Syntax

```sql
CHARINDEX (substring, string, start)
```

substring – The text you’re searching for. string – The main text you are searching in. start (optional) – The position in the string where the search starts (default is 1). How it works CHARINDEX() works like a scanner. It checks the string from left to right. By default, it treats uppercase and lowercase letters as the same.
Example

```sql
SELECT CHARINDEX('SQL', 'Learning SQL is fun!', 1);
```

This returns 10 because “SQL” starts at position 10 in the sentence. Parameters of CHARINDEX The CHARINDEX() function has three parameters to help you find a substring in a string. substring – The text you want to find. string – The main text where you’re searching. start (optional) – The position in the string where the search begins. If you don’t specify another starting point, it starts from the beginning (position 1). Let’s take a look at how the parameters of CHARINDEX work. Here’s an example:

```sql
SELECT CHARINDEX('is', 'This is a crisis.');
```

The string contains four occurrences of “is”. If we run this query without the start parameter, it finds “is” in “This” (the first occurrence) and returns 3. However, if we add a start position to the end of the query:

```sql
SELECT CHARINDEX('is', 'This is a crisis.', 4);
```

It skips the match at position 3 and returns 6 — the position of the first occurrence at or after the start position. Examples of using CHARINDEX() in SQL Server As we have found out, the CHARINDEX() function finds the position of a substring inside a string. It returns a number showing where the substring starts. If the substring isn’t found, it returns 0. Finding a word

```sql
SELECT CHARINDEX('SQL', 'Learning SQL is fun!');
```

Result: 10 (because “SQL” starts at position 10). Using the start position

```sql
SELECT CHARINDEX('o', 'Hello World', 6);
```

Result: 8 (starts searching from position 6 and finds “o” in “World”). Column search In real database queries, CHARINDEX() is often used to search within table columns. Here’s an example:

```sql
DECLARE @Employees TABLE (Name NVARCHAR(100));
INSERT INTO @Employees (Name) VALUES
('Johnathan Smith'),
('Alice Johnson'),
('Olivia Benson');
```

Using this query, we have inserted three values into the table.
Now, let’s use a query to find whether “John” appears in the name or surname of any of the employees.

```sql
SELECT Name, CHARINDEX('John', Name) AS Position
FROM @Employees;
```

This query looks for “John” in each employee’s name. If “John” isn’t found (e.g., in “Olivia Benson”), the query returns 0. Here’s the output:

| Name | Position |
|---|---|
| Johnathan Smith | 1 |
| Alice Johnson | 7 |
| Olivia Benson | 0 |

Replacing the first occurrence of a substring The REPLACE() function in SQL Server replaces all occurrences of a substring, but if you need to replace only the first occurrence, you can use STUFF() and CHARINDEX(). How it works CHARINDEX() finds where the substring appears. STUFF() removes the substring and inserts a new one in its place. STUFF() and CHARINDEX() help with precise string modifications. They work well for targeted replacements but can make queries more complex, especially for large datasets. [Learn more about this use case here.](https://www.devart.com/dbforge/sql/studio/sql-server-replace-function.html#:~:text=The%20CHARINDEX()%20function) From these examples, you can see how SQL Server CHARINDEX() helps you easily search for text in strings, filter data, and manipulate values. Try it yourself with dbForge Studio Want to test CHARINDEX() and other SQL functions in a more user-friendly environment? [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is a great alternative to SSMS, offering extra features to make your work easier. Why use dbForge Studio instead of SSMS? While SSMS is the default tool for SQL Server, dbForge Studio provides: Smarter SQL coding – Autocomplete, formatting, and debugging help speed up your work. Visual query builder – Create queries without writing complicated SQL by hand. Easy data comparison – Quickly find and sync differences between databases. Performance tracking – Detect slow queries and optimize them. Check out this [comparison of dbForge Studio vs.
SSMS](https://www.devart.com/dbforge/sql/studio/alternative-to-ssms.html) to see the full benefits. See it in action Watch this YouTube video to learn how dbForge Studio can improve your workflow. Get started dbForge Studio for SQL Server is a handy platform with quick and easy onboarding. Just follow a few basic steps to set up your environment and start working with SQL. [Download the latest version of dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html) . Install it with the step-by-step guide. Try features like the query builder, debugging tools, and performance monitoring. Write queries using the SQL editor with autocomplete and formatting. Case sensitivity in CHARINDEX() In SQL Server, the CHARINDEX() function ignores the case of the substring by default. As we’ve mentioned before, it treats uppercase and lowercase letters the same. Case-insensitive search (default)

```sql
SELECT CHARINDEX('sql', 'Learning SQL is fun!');
```

Result: 10 (finds “SQL” even though the case doesn’t match). Making searches case-sensitive If you need a case-sensitive search, you can apply a case-sensitive collation, such as SQL_Latin1_General_CP1_CS_AS, with the COLLATE clause.

```sql
SELECT CHARINDEX('sql', 'Learning SQL is fun!' COLLATE SQL_Latin1_General_CP1_CS_AS);
```

Result: 0 (because “sql” in lowercase doesn’t match “SQL” in uppercase). Case-sensitive example with a table Let’s assume that our table now holds five employees:

```sql
DECLARE @Employees TABLE (Name NVARCHAR(100));
INSERT INTO @Employees (Name) VALUES
('Johnathan Smith'),
('Alice Johnson'),
('john Doe'),
('Olivia Benson'),
('JOHN DOE');
```

Now let’s check for “john” in lowercase, using the COLLATE clause for case sensitivity.

```sql
SELECT
    Name,
    CHARINDEX('john', Name COLLATE SQL_Latin1_General_CP1_CS_AS) AS Position
FROM
    @Employees;
```

Since the collation is case-sensitive, only names that exactly match ‘john’ in lowercase will return a position greater than 0.
Here’s our test output:

| Name | Position |
|---|---|
| Johnathan Smith | 0 |
| Alice Johnson | 0 |
| john Doe | 1 |
| Olivia Benson | 0 |
| JOHN DOE | 0 |

Using the optional start position parameter with CHARINDEX The CHARINDEX() function has an optional start position parameter that lets you choose where to begin searching in the string. How the start position works The start position tells CHARINDEX() where to begin its search: you give it a number, and the scan starts from that point in the string. Example 1: Starting the search from a specific position

```sql
SELECT CHARINDEX('o', 'Hello World', 6);
```

Result: 8 The first ‘o’ appears at position 5, but the search starts at position 6. Looking from position 6, the next occurrence of ‘o’ is at position 8. Example 2: Searching for recurring substrings If you want to find the second occurrence of a character or substring, you can set the start position to skip the first match.

```sql
SELECT CHARINDEX('o', 'Hello World, Hello Again', 12);
```

Result: 18 Since the search starts at position 12, the next occurrence of ‘o’ after position 12 is at position 18 (the ‘o’ in the second ‘Hello’). You can also combine CHARINDEX() with SUBSTRING() to extract part of a string from a specific position, which helps when working with repeated characters. Further learning Want to learn more about SQL Server and dbForge Studio? We have some helpful resources for you to dive into: [SQL Server Tutorials](https://blog.devart.com/sql-server-tutorial) ; Detailed help and instructions in [dbForge Studio Documentation](https://docs.devart.com/studio-for-sql-server/) ; Practical [dbForge Studio Video Tutorials](https://www.youtube.com/playlist?list=PLpO6-HKL9JxXSZgO3LOMxO_Tt3QxpFbJNt) ; Courses and resources to further develop your SQL skills in [Devart Academy](https://www.devart.com/academy/sql-server-studio/) . These materials will help you become more comfortable and skilled with SQL Server and dbForge Studio.
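The STUFF() + CHARINDEX() combination described in the replacement example earlier can be sketched as follows; the sample string and replacement values here are made up for illustration:

```sql
-- Sketch: replace only the FIRST occurrence of a substring
-- CHARINDEX() locates the match; STUFF() removes it and inserts the replacement
DECLARE @s VARCHAR(100) = 'John met John at noon';

SELECT STUFF(@s, CHARINDEX('John', @s), LEN('John'), 'Jon') AS replaced;
-- 'Jon met John at noon'
```

In production code you would guard against CHARINDEX() returning 0 (no match), since STUFF() with a start position of 0 returns NULL.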
FAQ What is the difference between PATINDEX vs CHARINDEX in SQL Server? In SQL Server, both PATINDEX and CHARINDEX help find the position of a substring in a string. But there are key differences in how they work. CHARINDEX in SQL Server CHARINDEX() finds the position of a specific substring. It is case-insensitive by default. If the substring is found, it returns the starting position. If not, it returns 0. Example:

```sql
SELECT CHARINDEX('apple', 'I have an apple tree');
```

This gives the position of “apple” in the string. PATINDEX in SQL Server [PATINDEX](https://blog.devart.com/patindex-function-in-sql-server.html) finds the position of a pattern in a string. Unlike CHARINDEX, PATINDEX lets you use wildcards like % and _ to match any characters. How to use PATINDEX in SQL Server: You give PATINDEX a pattern and a string. It returns the position of the first match. Example:

```sql
SELECT PATINDEX('%apple%', 'I have an apple tree');
```

This finds “apple” using a pattern search. PATINDEX example in SQL Server If you want to search for a pattern that includes wildcard characters, PATINDEX is the right choice. For example, searching for any sequence that starts with ‘a’ and ends with ‘e’:

```sql
SELECT PATINDEX('%a%e%', 'I have an apple tree');
```

This finds the position of the first sequence in the string that starts with ‘a’ and ends with ‘e’. Here, “ave” in “have” matches, starting at position 4. The word “apple” also contains an “a…e” sequence, but the first match takes precedence, so the query returns 4.
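To see the contrast between the two functions directly, both can be run against the same string; for a plain literal substring they agree on the position:

```sql
-- CHARINDEX() needs a literal substring; PATINDEX() accepts wildcard patterns
SELECT CHARINDEX('apple', 'I have an apple tree')  AS literal_match,  -- 11
       PATINDEX('%apple%', 'I have an apple tree') AS pattern_match;  -- 11
```

The difference only shows once wildcards enter the pattern, as in the '%a%e%' example above, which CHARINDEX() cannot express at all.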
Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow."} {"url": "https://blog.devart.com/chatgpt-vs-bard-for-postgresql-developers.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How
To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) ChatGPT-4 vs Bard: What Are the Differences for PostgreSQL Developers? By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) May 17, 2023 [0](https://blog.devart.com/chatgpt-vs-bard-for-postgresql-developers.html#respond) 2668 In the fast-paced world of artificial intelligence (AI) advancements, developers are looking for the most efficient and groundbreaking solutions to expedite and improve their work quality. For PostgreSQL developers, it is essential to select the ideal AI-powered tool that addresses their queries with utmost professionalism. The popularity of AI tools has soared in recent years, with developers increasingly recognizing their potential in streamlining various aspects of their work. Some of the most renowned AI tools include OpenAI’s ChatGPT, Google’s Bard, IBM’s Watson, and Microsoft’s Azure Cognitive Services, among others. These tools have revolutionized the way developers approach problem-solving and have made previously time-consuming tasks more manageable. In this article, we will focus on comparing the responses of ChatGPT-4 and Google Bard to a range of common SQL development-related questions. By doing so, we aim to get a clear understanding of each tool’s capabilities and help you determine which one is better suited to your specific needs in the realm of SQL development. Contents: What is ChatGPT? What is Google Bard? ChatGPT vs Google Bard How to use ChatGPT and Bard for PostgreSQL development Example #1: Function Example #2: Date function Example #3: Window function Example #4: JOIN clause Example #5: Pivot table Which AI tool is better? Conclusion What is ChatGPT? ChatGPT, developed by OpenAI, is a state-of-the-art AI language model based on the GPT (Generative Pre-trained Transformer) architecture. 
As a large-scale language model, ChatGPT is designed to generate human-like text and engage in conversation with users, understanding the context and providing relevant responses. It is capable of performing various tasks such as answering questions, offering recommendations, creating content, and more. Developers and businesses can harness the power of ChatGPT by integrating it into their applications, services, or products, enhancing user experience through natural language understanding and generation. ChatGPT is successfully applied in areas like customer support, content creation, virtual assistance, and many other domains where natural language processing is essential. What is Google Bard? Google Bard is a large language model, also known as a conversational AI or chatbot trained to be informative and comprehensive. Bard is trained on a massive amount of text data and is able to communicate and generate human-like text in response to a wide range of prompts and questions. Although still in development, this tool can already assist SQL developers in various ways, including answering questions about SQL syntax and usage, aiding in SQL query debugging, generating SQL code tailored to particular tasks, and offering tutorials and documentation on SQL, among other features. ChatGPT vs Google Bard ChatGPT and Google Bard are both large language models, but they have some key differences. Data: ChatGPT is trained on a dataset of text and code that was collected up to 2021, while Google Bard is trained on a dataset that is constantly being updated. This means that Google Bard has access to more recent information and can provide more precise answers. Accuracy: Google Bard is generally more accurate than ChatGPT, especially when it comes to factual information. This is because Google Bard is trained on a larger and more up-to-date dataset. 
Creativity: ChatGPT is more creative than Google Bard when it comes to generating text formats, such as poems, code, scripts, musical pieces, emails, letters, etc. This is because ChatGPT is trained on a dataset that includes a wider variety of creative text formats. Availability: ChatGPT is available to anyone who wants to use it, while Google Bard is currently only available to a limited number of users.

| | ChatGPT | Google Bard |
|---|---|---|
| Developer | OpenAI | Google |
| Language model | Customized version of OpenAI’s Generative Pre-trained Transformer 3 (GPT-3) or Generative Pre-trained Transformer 4 (GPT-4), depending on the version | Google’s Language Model for Dialogue Applications (LaMDA) |
| Data source | ChatGPT was trained using an extensive collection of textual data, encompassing resources like Common Crawl, Wikipedia, books, articles, and various documents obtained from the open internet. However, its training data only extends up to 2021, which restricts its knowledge of the most recent world events and research developments. | Bard was trained using Infiniset, a dataset comprising Common Crawl, Wikipedia, documents, as well as conversations and dialogues sourced from the internet. Allegedly, Bard can perform real-time web searches to provide the most up-to-date answers to queries and the latest research findings. |
| Pricing | ChatGPT is available to users at no cost, while ChatGPT Plus comes with a subscription fee of $20 per month. Subscribers to ChatGPT Plus benefit from access during high-demand periods, expedited response times, priority access to new features, and the utilization of GPT-4. | Bard is available at no cost for users who have access. |

How to use ChatGPT and Bard for PostgreSQL development AI can play a significant role in PostgreSQL development by providing guidance on syntax and usage, generating SQL code, assisting with query debugging, and more.
In this article, we will pose identical questions related to SQL development to both Bard and ChatGPT, and subsequently compare and validate their responses. Our objective is to assess the reliability and usefulness of either AI in the context of PostgreSQL development. To validate the answers provided by AI, we will use one of the best PostgreSQL GUI tools on the market – dbForge Studio for PostgreSQL.

Prerequisites

- [Pagila sample database](https://www.postgresql.org/ftp/projects/pgFoundry/dbsamples/pagila/pagila/)
- [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/download.html)
- Access to ChatGPT-4
- Access to Google Bard

Example #1: Function

Suppose we need to create a function that returns the most rented films during the given period, along with their rental counts. Let us ask ChatGPT-4 to accomplish the task. ChatGPT-4 generated the following SQL code:

```sql
CREATE OR REPLACE FUNCTION most_rented_films(start_date DATE, end_date DATE)
RETURNS TABLE (film_id INTEGER, title TEXT, rental_count INTEGER) AS $$
BEGIN
    RETURN QUERY
    SELECT f.film_id, f.title, COUNT(r.rental_id) AS rental_count
    FROM film AS f
    JOIN inventory AS i ON f.film_id = i.film_id
    JOIN rental AS r ON i.inventory_id = r.inventory_id
    WHERE r.rental_date BETWEEN start_date AND end_date
    GROUP BY f.film_id, f.title
    ORDER BY rental_count DESC;
END;
$$ LANGUAGE plpgsql;
```

Let us now open dbForge Studio for PostgreSQL and validate the code created by the AI. As you can see, the Studio throws an error because the schema for creating the function is not specified. Let us modify the code a little bit and try again. The query was successful this time, and the function is now visible in the Database Explorer. Now, it is time to run the SELECT statement to make sure that everything works as required. Oops, there is an error again. In the function syntax, the rental_count column datatype has been specified as INTEGER, which is obviously wrong.
We need to modify the syntax and try again. Thus, the working code is as follows:

```sql
CREATE OR REPLACE FUNCTION public.most_rented_films(start_date DATE, end_date DATE)
RETURNS TABLE (film_id INTEGER, title TEXT, rental_count BIGINT) AS $$
BEGIN
    RETURN QUERY
    SELECT f.film_id, f.title, COUNT(r.rental_id) AS rental_count
    FROM film AS f
    JOIN inventory AS i ON f.film_id = i.film_id
    JOIN rental AS r ON i.inventory_id = r.inventory_id
    WHERE r.rental_date BETWEEN start_date AND end_date
    GROUP BY f.film_id, f.title
    ORDER BY rental_count DESC;
END;
$$ LANGUAGE plpgsql;
```

So, we drop the function we created earlier, create a new one, and then run SELECT. This time it is successful.

Okay, now let us assess Google Bard. We got the following code:

```sql
CREATE OR REPLACE FUNCTION most_rented_films (start_date DATE,
end_date DATE) RETURNS TABLE (
    film_id INT,
    film_title VARCHAR(255),
    rental_count INT
)
AS
$func$
BEGIN

    RETURN QUERY
    SELECT
        f.film_id,
        f.film_title,
        COUNT(*) AS rental_count
    FROM
        film_rentals fr
    JOIN
        films f ON f.film_id = fr.film_id
    WHERE
        fr.rental_date >= start_date
        AND fr.rental_date < end_date
    GROUP BY
        f.film_id,
        f.film_title
    ORDER BY
        rental_count DESC;

END
$func$;
```

Let us run it in the Studio, shall we? The pressure is on! And we get the same error: the schema has not been specified. So, we modify the code and run it again. And get another error. We need to modify the query further. After specifying the language and executing the syntax, we run SELECT. And… error again. Google Bard refers to a nonexistent table in its script; besides that, the datatype of the rental_count column is wrongly specified as INT. After we fix all these issues, we finally get working code. Not fun, right? With Google Bard, we had to adjust the generated code for quite a while, and that required a certain level of expertise, which means that beginners in SQL might not be able to take advantage of its prompts.
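A side note on the INTEGER error: in PostgreSQL, COUNT() returns a value of type bigint, which is why the rental_count output column has to be declared as BIGINT. A quick way to sanity-check the corrected function is to call it directly. The date range below is only an illustrative guess based on Pagila's 2005 rental data, so adjust it to your copy of the database:

```sql
-- Hypothetical date range; adjust to the rentals present in your Pagila copy.
-- COUNT() yields BIGINT in PostgreSQL, hence rental_count BIGINT in the
-- function's RETURNS TABLE definition.
SELECT film_id, title, rental_count
FROM public.most_rented_films('2005-05-01', '2005-08-31')
LIMIT 10;
```

An alternative to changing the declared return type would have been to cast the aggregate in the function body, e.g. COUNT(r.rental_id)::INTEGER AS rental_count, which would have let the original INTEGER declaration stand.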
However, that was just the first example; let us move on.

Example #2: Date function

Suppose we want to create a date function that returns the day of the week with the highest film rentals within the specified range. Let us first ask ChatGPT-4 to assist us with this task. ChatGPT-4 has generated the following query for us:

```sql
CREATE OR REPLACE FUNCTION highest_rental_day(start_date DATE, end_date DATE)
RETURNS TEXT AS $$
DECLARE
    day_of_week TEXT;
BEGIN
    SELECT TO_CHAR(r.rental_date, 'Day') AS day_name
    INTO day_of_week
    FROM rental r
    WHERE r.rental_date BETWEEN start_date AND end_date
    GROUP BY EXTRACT(DOW FROM r.rental_date)::INTEGER, day_name
    ORDER BY COUNT(*) DESC
    LIMIT 1;

    RETURN day_of_week;
END;
$$ LANGUAGE plpgsql;
```

As you might remember from Example #1, we need to specify the schema before we run this code in dbForge Studio for PostgreSQL; otherwise, we will get an error. The function has been successfully created. Let us check how it works. Unbelievable! It works without any adjustments. Well, almost without.

Let us look at how Google Bard will cope with the task. Here is the code that we got:

```sql
CREATE FUNCTION get_highest_rental_day(
    start_date DATE,
    end_date DATE
) RETURNS VARCHAR(10) AS
$func$
BEGIN
    DECLARE
        -- Get the total number of film rentals for each day of the week.
        rental_counts TABLE (
            day_of_week VARCHAR(10),
            total_rentals INT
        );
    BEGIN
        -- Populate the rental_counts table.
        INSERT INTO rental_counts (day_of_week, total_rentals)
        SELECT
            EXTRACT(DOW FROM rental_date),
            COUNT(*)
        FROM
            film_rentals
        WHERE
            rental_date >= start_date
            AND rental_date <= end_date
        GROUP BY
            EXTRACT(DOW FROM rental_date);

        -- Get the day of the week with the highest number of rentals.
        SELECT
            day_of_week
        FROM
            rental_counts
        ORDER BY
            total_rentals DESC
        LIMIT
            1;
    END;
END
$func$ LANGUAGE plpgsql;
```

We specify the schema and run it in the Studio for PostgreSQL.
When trying to execute SELECT from this function, we get an error: there is no TABLE datatype in PostgreSQL. We asked Google Bard to rewrite the syntax, yet we still did not get functional code. As a result, it receives a demerit for this particular task.

Example #3: Window function

A window function in PostgreSQL is a type of function that performs a calculation across a set of rows related to the current row. It allows you to perform advanced calculations that involve comparing the current row with other rows within a specified window or partition. Window functions are useful for tasks such as ranking, cumulative sums, moving averages, and more.

Suppose we want to calculate the cumulative sum of payments for each customer, ordered by payment date. Let us first ask ChatGPT-4 for help. Here is the syntax we got, in case you would like to check it yourself:

```sql
SELECT
    customer_id,
    payment_date,
    amount,
    SUM(amount) OVER (PARTITION BY customer_id ORDER BY payment_date) AS cumulative_amount
FROM
    payment
ORDER BY
    payment_date;
```

Now we open dbForge Studio for PostgreSQL and run the query that ChatGPT-4 has generated for us. Excellent, right?

Let us ask the same question to Google Bard. This is the code we got:

```sql
SELECT
    customer_id,
    SUM(payment_amount) AS cumulative_sum_of_payments,
    payment_date
FROM payments
GROUP BY customer_id
ORDER BY payment_date;
```

However, there are several problems with it:

- There is no payments table in the pagila database; it is called payment.
- The payment_amount column does not exist either. It is called amount.
- There is no grouping by date.
- Sorting is done by the wrong column.

Novices may struggle to use that prompt, as numerous modifications are needed for the query to work properly.

Example #4: JOIN clause

[JOINs](https://youtu.be/MIOv8qJQaqA) are used to combine data from two or more tables in a relational database based on a related column between them.
They allow you to retrieve information from multiple tables in a single query, making them an essential tool for working with relational databases. Suppose we want to get a list of all films with the categories they belong to. Let us first ask ChatGPT-4. Here is the syntax we got:

```sql
SELECT
    f.title,
    c.name AS category
FROM
    film f
JOIN
    film_category fc ON f.film_id = fc.film_id
JOIN
    category c ON fc.category_id = c.category_id
ORDER BY
    f.title;
```

The query looks fine at first glance. Let us validate it. And it works as expected. Well done, ChatGPT-4. Let us approach its rival now. Below is the syntax we got:

```sql
SELECT
    film.film_id,
    film.title,
    category.name
FROM film
JOIN film_category ON film.film_id = film_category.film_id
JOIN category ON film_category.category_id = category.category_id;
```

Let us run the query generated by Google Bard in the Studio. As you can see, Google Bard has coped with the task too. The queries are pretty much similar; the only difference is that ChatGPT-4 has added the ORDER BY clause for better results analysis and visibility.

Example #5: Pivot table

In PostgreSQL, a [pivot table](https://blog.devart.com/pivot-tables-in-postgresql.html) is a data summarization tool that allows you to aggregate and transform data from a database into a more readable format. Suppose we want to get a pivot table that shows the total rental amount for each customer by film category. Let us go and ask ChatGPT to write a corresponding query for us.
ChatGPT has provided us with the following query:

```sql
WITH rental_amounts AS (
    SELECT
        c.customer_id,
        cat.name AS category,
        SUM(p.amount) AS total_amount
    FROM
        rental r
        JOIN payment p ON r.rental_id = p.rental_id
        JOIN inventory i ON r.inventory_id = i.inventory_id
        JOIN film f ON i.film_id = f.film_id
        JOIN film_category fc ON f.film_id = fc.film_id
        JOIN category cat ON fc.category_id = cat.category_id
        JOIN customer c ON r.customer_id = c.customer_id
    GROUP BY
        c.customer_id,
        cat.name
)
SELECT
    customer_id,
    SUM(CASE WHEN category = 'Action' THEN total_amount ELSE 0 END) AS "Action",
    SUM(CASE WHEN category = 'Animation' THEN total_amount ELSE 0 END) AS "Animation",
    SUM(CASE WHEN category = 'Children' THEN total_amount ELSE 0 END) AS "Children",
    -- Add more categories as needed
    SUM(total_amount) AS "Total"
FROM
    rental_amounts
GROUP BY
    customer_id
ORDER BY
    customer_id;
```

Now it’s time to validate it in dbForge Studio for PostgreSQL. Ready? And it is working! Quite impressive, isn’t it?

Now we go to Google Bard with the same request. And we get the following code:

```sql
WITH rental_amount AS (
    SELECT
        customer_id,
        film_id,
        SUM(rental_amount) AS total_rental_amount
    FROM rental
    GROUP BY customer_id, film_id
)
SELECT
    customer_id,
    category.name AS category,
    SUM(rental_amount) AS total_rental_amount
FROM rental_amount
JOIN film_category ON rental_amount.film_id = film_category.film_id
JOIN category ON film_category.category_id = category.category_id
GROUP BY customer_id, category
ORDER BY customer_id;
```

However, when we run the query that Google Bard has generated for us, we face several mistakes:

- There is no film_id column in the rental table, so grouping by it is also impossible.
- The rental_amount column does not exist.
- The rental_amount.film_id = film_category.film_id JOIN is not valid since, as mentioned above, there is no film_id in rental.

Which AI tool is better?
Both Google Bard and ChatGPT-4 are continuously evolving, and these tools possess immense potential in the field of AI language models. However, based on the analysis conducted in this article, ChatGPT-4 demonstrates superior performance in handling PostgreSQL prompts. The code generated by ChatGPT-4 typically requires fewer modifications, making it more efficient. Additionally, ChatGPT-4 boasts greater accessibility, as obtaining access to this AI model is a more straightforward process compared to its counterpart. Furthermore, the code generated by ChatGPT is a little bit more user-friendly; for example, the AI thoughtfully includes clauses such as ORDER BY and GROUP BY, making the results more comprehensible and easier to analyze. This facilitates the identification of trends and patterns, ultimately improving the overall user experience when working with the output.

Want to learn more? Refer to our blog posts on utilizing AI for SQL and database development:

- [How to Use ChatGPT to Write SQL JOIN Queries](https://blog.devart.com/how-to-use-chatgpt-to-write-sql-join-queries.html)
- [Power Up Your MySQL Queries: How ChatGPT Can Help You Retrieve MySQL Data](https://blog.devart.com/power-up-your-mysql-queries-how-chatgpt-can-help-you-retrieve-mysql-data.html)
- [How ChatGPT Can Help Database Developers Write Unit Tests for SQL Server](https://blog.devart.com/how-chatgpt-can-help-database-developers-write-unit-tests-for-sql-server.html)
- [Exploring ChatGPT’s Capabilities in Creating and Optimizing SQL Queries for Oracle](https://blog.devart.com/create-and-optimize-oracle-queries-with-chat-gpt-4.html)

Conclusion

In this article, we have explored and compared the capabilities of two prominent AI language models, Google Bard and ChatGPT-4, in the context of assisting PostgreSQL developers with query writing. Our analysis reveals that ChatGPT-4 outperforms Google Bard in generating more user-oriented code, requiring fewer adjustments, and boasting greater accessibility.
As AI continues to advance and shape the future, it becomes increasingly crucial for developers to learn how to effectively interact with such powerful assistants. To validate the AI-generated responses, we utilized dbForge Studio for PostgreSQL, an all-in-one solution for database management, development, and administration. The Studio offers an extensive range of features and tools designed to simplify and streamline the work of database professionals. To experience its benefits first-hand, we encourage you to [download the free trial of dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/download.html) and explore its capabilities for your own projects.

[Elena Zemliakova](https://blog.devart.com/author/helena-alexander) is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company’s blog and website.
How to Check if a Database Exists in SQL Server

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), February 21, 2024

Checking if a particular database exists on the server is one
of those routine tasks that are typically a tiny yet important part of a more complex job. In this article, we examine a common use case of comparing and synchronizing databases that involves checking whether a specific database exists in SQL Server.

Contents

- Prerequisites
- How to check if a database exists in SQL Server
- Run a check using T-SQL
- Compare database schemas in dbForge Studio for SQL Server
- Configure database synchronization
- Automate synchronization from the command line
- Conclusion

Prerequisites

To proceed with our demonstration, we need the following:

- SQL Server of any edition, locally installed. We use the [SQL Server Express edition](https://www.microsoft.com/en-us/sql-server/sql-server-downloads), which is free of charge, and its functionality is enough for our task.
- [Microsoft ODBC Driver](https://learn.microsoft.com/en-us/sql/connect/odbc/microsoft-odbc-driver-for-sql-server?view=sql-server-ver16#download) for Windows. If you work with a different operating system, note that Microsoft also provides this ODBC driver for macOS and Linux. Also note that the ODBC Driver is only compatible with a locally installed SQL Server, and you need it on the same computer where your SQL Server is installed.
- The [sqlcmd utility](https://learn.microsoft.com/en-us/sql/tools/sqlcmd/sqlcmd-utility?view=sql-server-ver16&tabs=go%2Cwindows&pivots=cs1-bash#download-and-install-sqlcmd).
- [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). It is a multi-featured IDE for SQL Server that is more powerful than the standard SQL Server Management Studio (SSMS). We are going to utilize the integrated [Schema Compare](https://www.devart.com/dbforge/sql/studio/sql-server-schema-compare.html) feature.

Now, let us see how to check if a specified database exists and run a schema comparison process afterward.

How to check if a database exists in SQL Server

Comparing and synchronizing database schemas is a fundamental task of database management.
As a rule, it is necessary when deploying changes from the development stage to testing or production. dbForge Studio for SQL Server allows performing this comparison quickly and easily, as well as automating it with the help of the command line. However, when configuring the process, especially in large organizations with many projects, it can be vital to ensure that a certain required database exists in SQL Server (and has not been moved to the cloud, for instance) before launching the comparison. Otherwise, the entire process may fail because the target database will be unavailable. Thus, the first step in configuring the database comparison is to check if the database exists in our SQL Server.

Run a check using T-SQL

To determine whether the database in question exists in SQL Server, we can use standard T-SQL means: querying the DB_ID() function or the sys.databases catalog view.

Querying the DB_ID() function is the simplest approach. Use the command below to check if the database AdventureWorks2022 exists on the server:

```sql
IF DB_ID('AdventureWorks2022') IS NOT NULL
BEGIN
    PRINT 'Database Exists'
END
```

Querying the sys.databases catalog view is another popular method. The following command will tell you whether the database exists on your server:

```sql
IF EXISTS (SELECT * FROM master.sys.databases
           WHERE name = 'AdventureWorks2022')
BEGIN
    PRINT 'Database Exists'
END
```

Which one to choose depends on your preferences.

Compare database schemas in dbForge Studio for SQL Server

Let us explore the database schema comparison in dbForge Studio for SQL Server. In our demonstration, we use two test databases, OlympicGames_Dev and OlympicGames_Prod.

1. Launch the schema comparison wizard by selecting New Schema Comparison from the Comparison menu.
2. Now follow the wizard to configure the database comparison task. On the Source and Target page, specify the databases to be compared.
3. Specify the comparison options on the Options page.
4. Check and adjust the mapping where necessary.
First, on the Schema Mapping page, and then on the Table Mapping page.

5. Once ready, click Compare to start the schema comparison process. The Studio will present the results visually, so you can view all the discrepancies between the two databases.

As you can see in the screenshot, there is a difference between the OlympicGames_Dev and the OlympicGames_Prod databases. Now we can synchronize these databases. Also, we can automate the sync process with the help of the command line. This process will also include checking if the database exists on the server.

Configure database synchronization

After examining the database comparison results, we can configure the database synchronization process.

1. Click the green arrow button to launch the Schema Synchronization Wizard.
2. On the Output page of the Schema Synchronization Wizard, specify the preferences for the database syncing process. dbForge Studio for SQL Server generates the deployment script; it can then present the script to the user at once, save it to a file, or execute it directly against the target database.
3. View the synchronization options provided on the Options page. You can leave them as the Devart Defaults or adjust them according to your preferences and requirements, and save the configuration for the future. Then click Next.
4. On the Additional Scripts page, you can set the Studio to execute certain scripts before and/or after the synchronization procedure. Note that you may enter the scripts directly into the Wizard or point to SQL scripts stored locally.
5. Finally, view the information on the Summary page and choose the synchronization mode. To start the database synchronization process immediately, click Synchronize. If you want to automate the sync process, click Save Command Line in the bottom left corner of the Schema Synchronization Wizard.

Automate synchronization from the command line

After clicking the Save Command Line button, we can save the generated command line as a BAT file.
In our demonstration, we save it under the SyncOlympicGames.bat name in the root directory of the C drive. Now, do the following to automate the schema comparison task and include the check for the database's existence on the server in this process.

1. Save the script below as a separate DBExists.sql file in the C root directory too.

```sql
IF NOT EXISTS
    (
        SELECT name FROM master.dbo.sysdatabases
        WHERE name = $(dbname)
    )
BEGIN
    SELECT 'Database does not exist!' AS Message;
    RAISERROR (50002, 10, 127);
END
ELSE
BEGIN
    SELECT 'Synchronization started!' AS Message;
END
```

2. We need to include the verification of database existence as the first step of the database schema synchronization process. Proceed to the SyncOlympicGames.bat file, right-click it, and click Edit. Enter the following lines at the top of the SyncOlympicGames.bat contents:

```bat
@echo off
echo Running DBExists.sql
sqlcmd -S localhost\SQLEXPRESS -v dbname='OlympicGames_Prod' -i DBExists.sql -b -o out.log
if not errorlevel 1 goto next1
echo == An error occurred
:next1
```

Notice the following:

- -S requires the local SQL Server name.
- -v defines the target database name.
- -i defines the path to the DBExists.sql file; it is a script that will check if the OlympicGames_Prod database exists on the server.
- -b and -o are necessary to write the information about possible errors into the log file; by default, the log file will be saved in the same directory where sqlcmd is installed.

The modified SyncOlympicGames.bat file looks as follows in our case. It might look different in your environment, so make sure that all paths are specified correctly.

3. Save the changes in the SyncOlympicGames.bat file and execute it. As you can see, the synchronization of the databases has been completed successfully, with updates from OlympicGames_Dev now deployed onto OlympicGames_Prod. However, if any failure takes place during the task, you will be notified immediately.
The warning will be present in the .bat file output and in dbForge Studio for SQL Server itself, along with detailed information about the cause of the error. Resolve the issue and re-run the task.

Conclusion

Verifying the existence of a database and taking action based on the outcome is a fundamental operation, critical for ensuring the safety and efficiency of your work. With dbForge Studio for SQL Server, configuring this process with all necessary steps becomes a simple task that can be automated for seamless performance. To explore this functionality and other features of the Studio, you can take advantage of the [30-day fully functional trial](https://www.devart.com/dbforge/sql/studio/download.html). This trial provides complete access to the extensive capabilities of the most robust edition of dbForge Studio for SQL Server, allowing you to evaluate its performance under a full workload.

[Julia Lutsenko](https://blog.devart.com/author/jane-williams) is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
Choosing the Best GUI Client for SQL Databases

By [Victoria Shyrokova](https://blog.devart.com/author/victorias), May
28, 2024

Even the most skilled software and database developers, DBAs, architects, managers, and analysts benefit from using GUI clients for SQL databases. Compared to manual input of text-based commands in a console, such tools give you far more efficiency. They offer a visual interface to build queries and robust functionality to establish connections, help you edit the database structure using graphical clues and elements, let you analyze and increase database performance, ensure security, and debug your code. SQL GUI clients come in all flavors, differing in price, capabilities, and supported DBMSs. In this article, we’ll overview the top ten options that you might consider using and compare their feature lists to ensure you don’t miss out on anything important before picking the option that works best for you.

Contents

- Key features to look for in a SQL GUI tool
- General features of SQL database GUI tools
- UI-specific features comparison
- dbForge Edge
- MySQL Workbench
- Beekeeper Studio
- DBeaver
- DataGrip
- HeidiSQL
- Navicat Premium
- DbVisualizer
- RazorSQL
- OmniDB
- Wrapping up

Key features to look for in a SQL GUI tool

First, let’s check what you can expect from a GUI client for SQL database development and management. The basic features range from SQL editing and execution to building visual queries. However, some features relate specifically to the GUI, such as visual interface flexibility and prowess. For your convenience, we have separated them into two blocks, describing them in detail to provide a clear impression of what stands behind each feature.

General features of SQL database GUI tools

Most SQL GUI tools share common features that can enhance one’s experience working with a database. Below, we have listed some of the most important ones, as well as made a short comparison table where we analyze the most popular SQL GUI clients and highlight what they have under the hood.
| Features list | dbForge Edge | MySQL Workbench | Beekeeper Studio | DBeaver | DataGrip | HeidiSQL | Navicat Premium | DbVisualizer | RazorSQL | OmniDB |
|---|---|---|---|---|---|---|---|---|---|---|
| SQL editing and execution | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Code completion | ✔ | ✔ | ✔ | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ |
| Visual query builder | ✔ | ✘ | ✘ | ✔ | ✘ | ✘ | ✔ | ✔ | ✔ | ✔ |
| Database design | ✔ | ✔ | ✘ | ✔ | ✔ | ✘ | ✔ | ✔ | ✘ | ✔ |
| Table designer | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Object editor | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Debugger | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✔ | ✔ | ✘ | ✔ |
| Database explorer | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Data editor | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Schema & Data comparison | ✔ | ✘ | ✘ | ✔ | ✘ | ✘ | ✔ | ✔ | ✔ | ✔ |
| Data export | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Data import | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| Data analysis | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ |
| Source control | ✔ | ✘ | ✘ | ✘ | ✔ | ✘ | ✘ | ✔ | ✘ | ✘ |
| Performance tuning | ✔ | ✔ | ✘ | ✔ | ✔ | ✘ | ✔ | ✔ | ✘ | ✘ |
| Test data generation | ✔ | ✘ | ✘ | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✘ |
| Database documenter | ✔ | ✔ | ✘ | ✘ | ✘ | ✘ | ✘ | ✘ | ✘ | ✘ |
| Database administration | ✔ | ✔ | ✘ | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✘ |
| User & Session management | ✔ | ✔ | ✘ | ✔ | ✘ | ✔ | ✔ | ✘ | ✘ | ✔ |

As you can see, all of the SQL database GUI tools we have compared can help you with SQL editing and execution to some extent, combining automatic syntax check functionality, code snippets support, code outlining, and SQL history. However, some of these tools are limited to pure basics, while others provide extended features, such as navigation through large scripts, bookmarking, or fast access to the object editor.

Yet some of the features are pretty rare. For example, dbForge Edge is among the few tools on the list that provide database documentation functionality, which can be extremely helpful for those looking to ensure smooth team collaboration and consistency throughout all processes. Among other rare features is a visual query builder, which provides coding-less query construction, visual editing of sub-queries, drag-and-drop table editing, and a preview of diagrams, which one can print if necessary. Source control is another invaluable yet rare asset, especially if you’re looking to establish integration with Git, GitHub, or SVN, want to switch to a local environment without conflicts, or wish for version control of static data.
Also, if you are into database development, you might need your SQL GUI tool to be capable of generating test data, which only some of them can do. We strongly recommend that you study the comparison table and look for the features most valuable in your work to find the tool that truly matches your needs.

UI-specific features comparison

Since GUI stands for “graphical user interface,” we cannot overlook the UI-specific features that enhance one’s experience while working with a database. To ensure that you have considered them in your choice, we provide the following comparison table, where you can explore the UI perks of SQL GUI clients.

| User Interface | dbForge Edge | MySQL Workbench | Beekeeper Studio | DBeaver | DataGrip | HeidiSQL | Navicat Premium | DbVisualizer | RazorSQL | OmniDB |
|---|---|---|---|---|---|---|---|---|---|---|
| UI skins | ✔ | ✘ | ✔ | ✘ | ✔ | ✔ | ✔ | ✔ | ✔ | ✘ |
| Customizable window layout | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✔ | ✔ | ✔ | ✘ |
| Multiple shortcut schemes with full shortcut customization | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ |
| Syntax highlighting customization | ✔ | ✘ | ✘ | ✔ | ✔ | ✔ | ✔ | ✘ | ✔ | ✔ |
| Tabbed groups for documents | ✔ | ✘ | ✘ | ✘ | ✔ | ✘ | ✘ | ✘ | ✘ | ✘ |
| Toolbar customization | ✔ | ✘ | ✘ | ✔ | ✔ | ✘ | ✘ | ✘ | ✔ | ✘ |
| Wizard for sharing common code standards and templates | ✔ | ✘ | ✘ | ✘ | ✔ | ✘ | ✘ | ✔ | ✘ | ✘ |

As you can see, some of the tools providing a GUI for SQL databases do not offer extended user interface customization, which makes them less flexible. You might need to get accustomed to their interface and may feel less comfortable working with syntax or multiple windows and tabs when using them. When choosing the best SQL GUI tool, you definitely should consider the visual aspect.

Now that we have examined the most important features of SQL database GUI clients, you might feel more confident about what to expect from such tools. Still, many more aspects might affect your choice, such as the tool’s compatibility with your operating system, the client’s licensing, DBMS support, and pricing. That’s why we encourage you to read the overviews below to explore the tools we’ve already briefly encountered in an in-depth comparison.
Also, for your convenience, we're adding a score that specifies how many features each tool provides.

dbForge Edge

[Overview >](https://www.devart.com/dbforge/edge/) [Try for free >](https://www.devart.com/dbforge/edge/download.html)

OS compatibility: Windows, macOS, Linux.
General features score: 19 out of 19.
UI features score: 7 out of 7.

dbForge Edge is a powerful bundle of four dbForge Studios (for SQL Server, MySQL and MariaDB, Oracle, and PostgreSQL) that combines a fully customizable user interface with a wide range of functionality fit for most of the tasks you face when working with popular databases and cloud servers. dbForge Edge is built with the daily challenges of software and database developers, DBAs, architects, managers, and analysts in mind: writing queries, optimizing performance, and refactoring objects, all while considering multiple table dependencies and specifics. The visual interface simplifies most of these tasks and is helpful both for experienced users and for beginners looking for an easy way to manage a database.

Recommended for: database administrators, data analysts, developers, IT managers, database consultants, data engineers, data scientists, BI developers, QA engineers, and enterprise architects.

Licensing:
- Yearly recurring subscription or perpetual license
- Special offers for enterprise clients
- Free licensing for MVPs
- Free Express edition (activated after the trial ends)

Pricing: from $699.95/yr. per license.

MySQL Workbench

[Check the tool >](https://www.mysql.com/products/workbench/) [Try for free >](https://dev.mysql.com/downloads/workbench/)

OS compatibility: Windows, macOS, Linux.
General features score: 12 out of 19.
UI features score: 0 out of 7.

MySQL Workbench is a SQL GUI tool built specifically for the MySQL DBMS, offering basic features for smooth SQL editing and execution and robust code completion functionality.
It can power table design and helps with essential database administration tasks, such as server status monitoring, access to server logs, and backup and restore features. Still, it misses out greatly on interface customization, cannot generate test data, doesn't provide source control, and isn't of much help when you need to boost your database performance.

Recommended for: database administrators and developers.
Licensing: open source (GPL license), commercial.
Pricing: depends on the license.

Beekeeper Studio

[Check the tool >](https://www.beekeeperstudio.io/) [Try for free >](https://www.beekeeperstudio.io/get)

OS compatibility: Windows, macOS, Linux.
General features score: 4 out of 19.
UI features score: 1 out of 7.

Beekeeper Studio takes a steady approach to resolving basic tasks, such as SQL editing and formatting. It also offers context-aware code completion and lets you edit tables, export data, and explore databases. However, it lacks more advanced functionality, so you won't be able to use it for database administration, performance tuning, or data analysis. Still, it's an excellent option for some database development tasks.

Recommended for: database developers.
Licensing: commercial (per user).
Pricing: depends on the license and starts at $7/mo.

DBeaver

[Check the tool >](https://dbeaver.com/) [Try for free >](https://dbeaver.io/download/)

OS compatibility: Windows, macOS, Linux.
General features score: 16 out of 19.
UI features score: 4 out of 7.

DBeaver Ultimate is a SQL GUI tool provided under a commercial license. It can potentially improve one's workflow with such functionality as context-aware code completion, advanced diagrams, and drag-and-drop tables. DBeaver comes with a visual database and table designer, and its users often find its customization options, such as shortcut customization and custom syntax highlighting, rather helpful.

Recommended for: database administrators, IT managers, developers, and application architects.
Licensing: commercial.
Pricing: from $110 per year.

DataGrip

[Check the tool >](https://www.jetbrains.com/datagrip/) [Try for free >](https://www.jetbrains.com/datagrip/download/)

OS compatibility: Windows, macOS, Linux.
General features score: 13 out of 19.
UI features score: 7 out of 7.

DataGrip is listed among the most easy-to-customize SQL GUI tools for a reason: it comes with a window layout you can tweak to match your needs, window float and auto-hide options, customizable syntax highlighting, and UI skins you can choose from. Moreover, if you want a tool to edit objects and data, work with database design, and perform SQL editing, it will get the job done. Still, some features, like database documentation or test data generation, are missing.

Recommended for: database administrators, IT managers, data engineers, developers, data scientists, BI developers, and QA engineers.
Licensing: commercial.
Pricing: $99 for individual use and $299/yr. per user for organizations.

HeidiSQL

[Check the tool >](https://www.heidisql.com/) [Use for free >](https://www.heidisql.com/download.php)

OS compatibility: Windows.
General features score: 8 out of 19.
UI features score: 2 out of 7.

HeidiSQL is another free SQL GUI tool you can use for database development and administration if you are looking only for purely basic functionality, like current statement execution, easy code snippet management, a flat table editor, partitioning, and a preview of schema changes. It also provides an option to connect to multiple databases. Still, there is no code autocompletion, debugging, refactoring, or database design functionality, and there is no way to compare data or schemas, work with database projects, or use it for data analysis. Nevertheless, HeidiSQL can be helpful to those looking for a free, open-source tool.

Recommended for: database developers and enterprise architects.
Licensing: open source.
Pricing: free.
Navicat Premium

[Check the tool >](https://navicat.com/) [Try for free >](https://www.navicat.com/en/download/navicat-premium)

OS compatibility: Windows, macOS, Linux.
General features score: 17 out of 19.
UI features score: 3 out of 7.

Navicat Premium is one of the best SQL GUI options when it comes to database design. Coming with a visual database designer for tables and views, it also supports a smart ER diagram layout that can search for a specific object on a diagram. With Navicat Premium, one also gets a debugger, database explorer, data editor, and multiple data export and import options, along with basic SQL editing and execution functionality. However, Navicat Premium offers only some of the code completion features, and its visual query builder is very limited compared to other tools on this list.

Recommended for: database administrators, data analysts, data scientists, developers, BI developers, QA engineers, and enterprise architects.

Licensing:
- Yearly license
- Perpetual license

Pricing: $699.99/yr. or $1,599 for a perpetual license.

DbVisualizer

[Check the tool >](https://www.dbvis.com/) [Try for free >](https://www.dbvis.com/download/)

OS compatibility: Windows, macOS, Linux.
General features score: 17 out of 19.
UI features score: 4 out of 7.

DbVisualizer is a database GUI tool that offers a free version with limited functionality and a PRO subscription with a query builder, table export and import, charts, client-side commands, and table management. Within this tool's PRO version, you'll be able to perform most of the basic database development and design tasks.

Recommended for: database developers and enterprise architects.
Licensing: commercial (per user).
Pricing: depends on the license, starting from $197/yr. per user.

RazorSQL

[Check the tool >](https://razorsql.com/) [Try for free >](https://razorsql.com/download.html)

OS compatibility: Windows, macOS, Linux.
General features score: 10 out of 19.
UI features score: 5 out of 7.
Tested on over 40 databases, RazorSQL is one of the best SQL GUI tools available for SQL script editing, building queries, browsing database objects, and data comparison. Even though it lacks code completion functionality, cannot be used for database design, and doesn't provide a debugger, it can still be helpful if you are looking for a SQL GUI client at a reasonable price.

Recommended for: developers, database architects, and database administrators.
Licensing: commercial (per user).
Pricing: depends on the license, starting from $129/yr. per user.

OmniDB

[Check the tool >](https://github.com/OmniDB/OmniDB) [Try for free >](https://github.com/OmniDB/OmniDB)

OS compatibility: Windows, macOS, Linux.
General features score: 14 out of 19.
UI features score: 1 out of 7.

OmniDB is a fully open-source GUI for database development, design, and administration that is open for contributions. It provides SQL editing and execution, code completion, database design, and object editing features. Even though it is limited to the most essential features, it's a free SQL GUI tool that one can use on a limited budget.

Recommended for: database administrators and developers.
Licensing: open source.
Pricing: free.

Favoring a visual interface over writing complex commands? Check the list of the best [MySQL GUI tools for Windows](https://blog.devart.com/top-10-mysql-gui-tools-for-database-management-on-windows.html) to strengthen your toolset!

Wrapping up

Database design, development, management, and administration have become more accessible with the rise of tools offering GUIs for SQL database editing. Visual representation of tables and objects, as well as self-explanatory structures showcased in diagrams and relations, combined with a neat UI, assist in optimization, refactoring, and collaboration. Within our overview, we have ensured that you'll be able to find the best SQL GUI for Windows, macOS, and Linux.
We have also listed the most common features that can improve the routine of everyone who works with databases, from developers to analysts and application architects. Ready to try one of the top-scoring SQL GUI tools? Get [30 days of dbForge Edge](https://www.devart.com/dbforge/edge/download.html) for free! Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [sql](https://blog.devart.com/tag/sql) [SQL Server](https://blog.devart.com/tag/sql-server) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow. 
"} {"url": "https://blog.devart.com/clone-colums-data-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Clone Data from One Column to Another in the Same SQL Server Table By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) June 7, 2023 [0](https://blog.devart.com/clone-colums-data-in-sql-server.html#respond) 2902 In database development, cloning data from one column to another is a common practice.
Being a fundamental operation, it plays a pivotal role in ensuring data integrity, enabling efficient analysis, and facilitating various data management tasks. Whether it's used for performing calculations, preserving backups, optimizing performance, or accommodating schema changes, understanding the process of cloning data in SQL tables is essential for developers seeking to harness the full potential of their databases. In this article, we'll learn how to clone data from one column to another within the same SQL Server table using T-SQL and the dbForge Data Compare tool.

Contents

- Reasons to clone data from one column to another in SQL tables
- Clone data between columns within the same SQL table using T-SQL
- Clone data between columns using dbForge Data Compare
  - Step 1: Data Comparison
  - Step 2: Data Synchronization

Reasons to clone data from one column to another in SQL tables

There are several reasons to clone data from one column to another in a SQL table. Here are a few common scenarios:

- **Calculation and transformation.** Developers may need to perform calculations or transformations on the data in one column and store the results in another column. This ensures easier manipulation and analysis of the transformed data without altering the original values.
- **Data backup and archiving.** Creating a copy of data from one column to another can serve as a backup or archival mechanism. It provides an additional layer of protection in case the original data gets modified, deleted, or corrupted.
- **Data redundancy and performance optimization.** In some cases, duplicating data between columns can improve query performance. By pre-calculating or denormalizing data and storing it in a separate column, developers can avoid expensive computations during runtime and enhance overall system performance.
- **Data migration and schema changes.** When modifying the structure or schema of a database table, developers may need to clone data from one column to another to ensure data compatibility or accommodate new requirements.
- **Historical tracking and audit trail.** By cloning data from one column to another, developers can create a historical record or audit trail of changes. This allows for tracking and analysis of data evolution over time.
- **Data integration and data warehousing.** When integrating data from multiple sources or consolidating data into a data warehouse, developers may clone data from one column to another to align data formats or combine related information.

As you can see, cloning data from one column to another provides developers with flexibility, performance optimization, data integrity, and historical tracking capabilities in various scenarios within a SQL table.

Before you begin

Before we start, here is a list of tips to help you make the cloning process smooth, minimize any risk of data loss or integrity issues, and ensure data consistency:

- [Create a backup](https://www.devart.com/dbforge/sql/studio/sql-server-backup-and-restore.html) of the table or database before making any modifications. This ensures that you have a copy of the original data in case of any unexpected issues during the cloning process.
- Check the compatibility of the data types between the source and destination columns. The data types must match or be converted to avoid any data truncation, loss, or type conversion errors.
- Consider any constraints defined on the columns, such as primary key, unique, or non-null constraints.
- Depending on the size and complexity of the data, handle the cloning process within a transaction so you can roll it back in case of any failures or errors.
- Connect to SQL Server with a user account that has been granted the ALTER permission required to modify a table.
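To illustrate the transaction tip above, here is a minimal, runnable sketch of a transactional column clone. It uses Python's built-in sqlite3 module purely for demonstration (the article targets SQL Server, where you would use T-SQL with BEGIN TRANSACTION instead); the Orders table and TotalAmount column names mirror the article's example, and the sample values are invented.

```python
import sqlite3

# Set up a throwaway in-memory database with the article's example table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (OrderID INTEGER PRIMARY KEY, TotalAmountOld REAL)")
conn.executemany("INSERT INTO Orders VALUES (?, ?)",
                 [(1, 19.99), (2, 250.00), (3, 7.50)])

try:
    with conn:  # wraps the statements in a transaction; rolls back on error
        conn.execute("ALTER TABLE Orders ADD COLUMN TotalAmountNew REAL")
        conn.execute("UPDATE Orders SET TotalAmountNew = TotalAmountOld")
except sqlite3.Error as e:
    print("Cloning failed, transaction rolled back:", e)

# Verify that both columns now hold the same values.
rows = conn.execute(
    "SELECT TotalAmountOld, TotalAmountNew FROM Orders ORDER BY OrderID"
).fetchall()
print(rows)
conn.close()
```

If any statement inside the `with conn:` block fails, sqlite3 rolls the whole transaction back, so the table is never left with a half-populated clone column.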
Clone data between columns within the same SQL table using T-SQL

You can copy data from one column to another with the UPDATE statement. The syntax is:

```sql
UPDATE table
SET new_column = old_column
WHERE conditions;
```

- table is the name of the table to be updated.
- new_column is the name of the column to which the data will be copied.
- old_column is the name of the column from which the data will be copied.
- conditions are the conditions that must be met for the UPDATE statement to execute. This parameter is optional; if no conditions are provided, all records in the table are updated.

As a prerequisite, we have created the Orders table in the AdventureWorks2019 database to use in our example. Let's execute the SELECT statement to view the data in the table. Next, we'll add a new empty column, TotalAmountNew, which will serve as a clone of the data from the TotalAmountOld column. Note that the data type of the TotalAmountNew column should match the data type of the TotalAmountOld column to maintain consistency. To modify the table, execute the ALTER TABLE statement:

```sql
ALTER TABLE dbo.Orders
ADD TotalAmountNew MONEY;
```

Now, we can copy the data from one column to another by executing the UPDATE statement:

```sql
UPDATE dbo.Orders
SET TotalAmountNew = TotalAmountOld;
```

After that, execute the SELECT statement to see whether the data has been cloned. As you can see, the TotalAmountNew and TotalAmountOld columns display the same data. And now, let's explore how to copy the data from one column to another within the same SQL table using the dbForge Data Compare tool. This can be achieved with the [Column Mapping](https://docs.devart.com/data-compare-for-sql-server/comparing-data/selecting-tables-and-views.html) feature, which allows mapping specific columns, even with different names. Want to learn how to streamline copying of SQL databases?
Check our detailed guide on the process based on [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/copy-database-wizard.html).

Clone data between columns using dbForge Data Compare

The process of cloning data between two columns with the help of the dbForge tool includes two steps:

- Step 1: Data Comparison
- Step 2: Data Synchronization

As a prerequisite for this example, we have created the Sales table in the AdventureWorks2019 database. As you can see, the DiscountNew column has been intentionally created empty because it will display the cloned data later.

Step 1: Data Comparison

To get started, open dbForge Data Compare and click New Data Comparison. In the New Data Comparison wizard that opens, choose the same server connection and database as source and target. Then, go to the Mapping page. By default, all objects are selected for data comparison. Since we want to compare the columns from the Sales table, we exclude the other objects by clearing the checkboxes next to them. The Columns in Comparison grid column displays the number of columns to be compared for the selected database table. Click the More Options menu to open the Column Mapping dialog and specify the columns for comparison. The Column Mapping dialog shows the source columns on the left and the destination columns on the right. To proceed, clear the checkboxes next to all the columns except for the column with the primary key (SalesID) and the column from which the data will be copied (Discount). In the Select columns for comparison grid, do the following on the right:

- For the empty DiscountNew column, select None.
- For the source Discount column, select the DiscountNew destination column that will contain the cloned data.

Once done, click OK to save the changes. In the New Data Comparison wizard, click Compare to execute the comparison. After the data sources have been successfully compared, the comparison results appear in the grid.
The source and destination values that differ are displayed on the Different tab.

Step 2: Data Synchronization

Now we can run synchronization to finish the cloning process. To do this, click Synchronize data to the target database at the top of the document. In the Data Synchronization Wizard that opens, select Execute the script directly against the target database and click Synchronize. In the pop-up window informing you that you are about to modify the database on the specified server, click OK to confirm the action. Let's check whether the data from one column has been cloned to another column in the SQL table. To do this, run the SELECT statement. In the output, we see that the values from the Discount column have been copied to the DiscountNew column.

Access data and [schema comparison](https://www.devart.com/dbforge/sql/studio/sql-server-schema-compare.html) features along with SQL coding assistance, source control, database and table designers, and a full toolset for database management and administration within dbForge Studio for SQL Server.

Conclusion

To sum up, there are different ways to clone data between columns within the same SQL table, and which one to use depends on your specific needs. Still, we would like to note that the cloning process using dbForge Data Compare is simple and seamless and does not require extensive technical knowledge. This tool simplifies and accelerates routine operations, enhancing developers' productivity significantly. [Download](https://www.devart.com/dbforge/sql/datacompare/download.html) a fully functional 30-day trial version of dbForge Data Compare for SQL Server to evaluate its advanced capabilities.
Tags [clone data between columns](https://blog.devart.com/tag/clone-data-between-columns) [data compare for sql](https://blog.devart.com/tag/data-compare-for-sql) [data synchronization](https://blog.devart.com/tag/data-synchronization) [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys making complex tech in useful ways."} {"url": "https://blog.devart.com/code-completion-and-sql-code-navigation-improved-in-sql-complete-v6-2.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Code Completion and SQL Code Navigation improved in SQL Complete v6.2 By [dbForge Team](https://blog.devart.com/author/dbforge) October 2, 2019 [0](https://blog.devart.com/code-completion-and-sql-code-navigation-improved-in-sql-complete-v6-2.html#respond) 4177 The dbForge team is excited to announce the release of a greatly enhanced SQL Complete v6.2 . Being the flagman solution of the Devart company, [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is aimed at building consistency and quality into your code. This add-in for SSMS and VS will facilitate your database development, minimize downtime and help avoid performance issues. The release addresses a number of major features and improvements . SQL Server Management Studio v18.3 support The [dbForge database tools](https://www.devart.com/dbforge/) receive improvements on an ongoing basis. We are committed to meeting our customers’ expectations and for that reason, we urged to roll out our SQL Complete product with support for a recently released SQL Server Management Studio v18.3. Code Completion and Snippets Manager Functionality The primary objective of the SQL Complete tool is to offer intelligent code suggestions directly in your IDE and thus speed up the process of coding by eliminating typos and logical mistakes. We strive to improve this functionality, so meet the new code completion features. Support for the MIN_ACTIVE_ROWVERSION function With SQL Complete, you can now benefit from MIN_ACTIVE_ROWVERSION function. The Extend Insert Highlight Occurrences feature to show a pop-up with a column name in the VALUES clauses In the updated SQL Complete, you will be prompted to enter the column name after opening the bracket in the VALUES clause. 
A hint will appear displaying the column names according to the cursor position in the VALUES clause as you enter values. This will help you quickly navigate when entering column values.

Displaying MS_Description for Azure objects

The MS_Description extended property stores a basic description of an object. It is widely used for documentation and content purposes. Support for MS_Description for Azure objects was one of our clients' most eagerly awaited features.

Suggesting properties for the built-in metadata functions (SERVERPROPERTY, FILEPROPERTY, DATABASEPROPERTYEX, etc.)

In SQL Complete v6.2, properties for built-in metadata functions (SERVERPROPERTY, FILEPROPERTY, DATABASEPROPERTYEX, etc.) are now prompted, allowing you to make full use of the system catalog to find out more about a database.

Prompting time zones in AT TIME ZONE

AT TIME ZONE converts an inputdate value to the corresponding datetimeoffset value in the target time zone. The SQL Complete functionality has been extended to suggest time zones when writing SQL queries, helping you format your SQL instances and handle all time zone calculations.

Prompting hint names for the USE HINT option

The USE HINT query hint argument provides a method to add behavior to a single query and lets you drive the query optimizer without elevated credentials or membership in the sysadmin server role.

Prompting objects in the context of DBCC SHOW_STATISTICS

DBCC SHOW_STATISTICS displays current query optimization statistics for a table or indexed view. This statement is one of the most common database scripts, and the updated SQL Complete now suggests the objects in the query, substantially accelerating database development.

Prompting indexes in WITH clauses

Index selection in WITH clauses can greatly speed up your coding and thus improve your overall performance.
Suggesting GENERATED ALWAYS AS ROW START options in the Completion List

The GENERATED ALWAYS AS ROW START hint in the Completion List notably facilitates working with tables.

Displaying property names for the SERVERPROPERTY function

In SQL Complete v6.2, the property names used as arguments for the SERVERPROPERTY function are suggested, making it quicker and easier to get information about the current SQL Server instance version, build, and other useful details.

Support for the DBCC CHECKIDENT command

The updated code completion functionality of the SQL Complete tool now supports the DBCC CHECKIDENT command and prompts table_name arguments for it.

Exclusion of IDENTITYCOL and $IDENTITY when unfolding an asterisk

At our customers' request, in SQL Complete v6.2 we have excluded IDENTITYCOL and $IDENTITY from the suggestion popup in SELECT statements when unfolding an asterisk.

SQL Code Navigation functionality

The navigation functionality of SQL Complete has also been extended to improve ease of navigation and productivity.

Jumping between BEGIN TRY/END TRY and BEGIN CATCH/END CATCH

When working with large scripts, it is important to be able to quickly navigate between paired keywords in a SQL statement. With SQL Complete v6.2, you can now jump between BEGIN TRY/END TRY and BEGIN CATCH/END CATCH in a blink. To jump between the matching keywords, use the Shift+F12 shortcut.

Jumping between CASE and END

The [CASE expressions](https://www.devart.com/dbforge/sql/sqlcomplete/sql-case-expression.html) used in statements can be quite long, and navigating between their beginnings and ends can be a daunting task. To solve this problem, we have introduced jumping between CASE and END in the new edition of SQL Complete.

Tell Us What You Think

We invite you to [try the new features](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) of SQL Complete v6.2 and share your thoughts about them with us.
Your feedback is highly appreciated and will help us advance our products, as we are fully committed to staying at the forefront of innovation in order to make your SQL coding as easy and effective as possible. Tags [Code Completion](https://blog.devart.com/tag/code-completion) [Code Navigation](https://blog.devart.com/tag/code-navigation) [Releases](https://blog.devart.com/tag/releases) [sql complete](https://blog.devart.com/tag/sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/code-first-migrations-and-ef-core-support-improvements-in-dotconnect-for-sqlite-5-18.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Code-First Migrations and EF Core Support Improvements in dotConnect for SQLite 5.18 By [dotConnect Team](https://blog.devart.com/author/dotconnect) September 29, 2021 [0](https://blog.devart.com/code-first-migrations-and-ef-core-support-improvements-in-dotconnect-for-sqlite-5-18.html#respond) 2728 The new version of [Devart dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/) contains significant improvements to Entity Framework Core support, considerably enhancing Entity Framework Core Code-First Migrations and adding support for previously unsupported operations. Besides, we added support for mapping more .NET data types and extended the capabilities of LINQ query translation to SQL.

LINQ to Entities Improvements

dotConnect for SQLite now supports translation of the following LINQ features to SQL for both EF Core 3 and EF Core 5:

- The static IsNullOrWhiteSpace() method of the String class
- The static Today property and the instance DayOfWeek and Ticks properties of the DateTime class
- The following static methods of the Math class: Abs(), Round(), Truncate(), Floor(), Ceiling(), Max(), Min(), Pow(), Sqrt(), Log(), Log10(), Sin(), Cos(), Tan(), Asin(), Acos(), Atan()

Uri Data Type Mapping

For Entity Framework Core 3 and 5, dotConnect for SQLite now supports mapping the internet/intranet System.Uri type to the SQLite 'text' data type.
```csharp
public class Blog {
    public int Id { get; set; }
    public Uri Url { get; set; }
    public List<Post> Posts { get; set; }
}
```

```sql
CREATE TABLE Blog ( 
    Id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    Url text NULL
)
```

IPAddress and PhysicalAddress Type Mapping

For Entity Framework Core 5, dotConnect for SQLite supports mapping the network types System.Net.IPAddress and System.Net.NetworkInformation.PhysicalAddress to the SQLite 'text' data type.

```csharp
public class AccessLog {
    public int Id { get; set; }
    public Uri Url { get; set; }
    public IPAddress IP { get; set; }
    public DateTime Timestamp { get; set; }
}
```

```sql
CREATE TABLE AccessLog ( 
    Id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    Url text NULL,
    IP text NULL,
    Timestamp datetime NULL
)
```

Code-First Migrations Improvements

The SQLite database engine has significant [architectural limitations for the ALTER TABLE operation](https://www.sqlite.org/lang_altertable.html). This is why dotConnect for SQLite did not support a number of EF Core Code-First Migrations operations. The new dotConnect for SQLite version provides support for more operations via a workaround that creates a new table and copies the data from the old table into it:

- AlterColumn
- RenameColumn (for SQLite 3.24 or lower)
- DropColumn (for SQLite 3.34 or lower)
- AddForeignKey
- DropForeignKey
- AddPrimaryKey
- DropPrimaryKey

Two of the new operations, RenameColumn and DropColumn, are supported in two different modes:

- If SQLite is of version 3.25/3.35 or higher, the native SQLite ALTER TABLE RENAME COLUMN and ALTER TABLE DROP COLUMN commands, which are supported since these versions, are used.
- If the SQLite version is lower, the workaround with re-creating the table is used.
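For reference, the native commands used in the newer-version mode look like this (the column names here are illustrative, not from the article):

```sql
ALTER TABLE Emp RENAME COLUMN Ename TO EmpName;  -- requires SQLite 3.25+
ALTER TABLE Emp DROP COLUMN EmpName;             -- requires SQLite 3.35+
```

On older SQLite versions, these statements fail with a syntax error, which is why the provider falls back to the table-rebuild workaround shown below.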
Let’s consider the following example with Dept and Emp classes:

```csharp
public class Dept {
    public int Deptno { get; set; }
    public string Dname { get; set; }
    public virtual ICollection<Emp> Emps { get; set; }
}

public class Emp {
    public int Empno { get; set; }
    public string Ename { get; set; }
    public int Deptno { get; set; }
    public virtual Dept Dept { get; set; }
}
```

The following mapping is used:

```csharp
protected override void OnModelCreating(ModelBuilder modelBuilder) {
    modelBuilder.Entity<Dept>().ToTable("Dept");
    modelBuilder.Entity<Dept>().HasKey(p => p.Deptno);
    modelBuilder.Entity<Dept>().Property(p => p.Deptno).ValueGeneratedNever();
    modelBuilder.Entity<Dept>().Property(p => p.Dname).HasMaxLength(50);
    modelBuilder.Entity<Emp>().ToTable("Emp");
    modelBuilder.Entity<Emp>().HasKey(p => p.Empno);
    modelBuilder.Entity<Emp>().Property(p => p.Empno).ValueGeneratedNever();
    modelBuilder.Entity<Emp>().Property(p => p.Ename).HasMaxLength(100).IsRequired(true);
    modelBuilder.Entity<Emp>().HasIndex(p => p.Ename);
    modelBuilder.Entity<Emp>().HasOne(e => e.Dept).WithMany(d => d.Emps).HasForeignKey(e => e.Deptno);
}
```

After we add the migration and apply it, the following DDL is generated:

```sql
CREATE TABLE IF NOT EXISTS Dept ( 
    Deptno integer NOT NULL,
    Dname varchar(50) NULL,
    PRIMARY KEY (Deptno)
);

CREATE TABLE IF NOT EXISTS Emp ( 
    Empno integer NOT NULL,
    Ename varchar(100) NOT NULL,
    Deptno integer NOT NULL,
    PRIMARY KEY (Empno),
    FOREIGN KEY (Deptno) REFERENCES Dept (Deptno) ON DELETE CASCADE
);

CREATE INDEX IX_Emp_Deptno ON Emp (Deptno);

CREATE INDEX IX_Emp_Ename ON Emp (Ename);
```

Let’s increase the string field length from 100 to 200:

```csharp
modelBuilder.Entity<Emp>().Property(p => p.Ename).HasMaxLength(200).IsRequired(true);
```

Let’s add a migration with a single AlterColumn operation.
```csharp
migrationBuilder.AlterColumn<string>(
    name: "Ename",
    table: "Emp",
    type: "varchar(200)",
    maxLength: 200,
    nullable: false,
    oldClrType: typeof(string),
    oldType: "varchar(100)",
    oldMaxLength: 100);
```

After we apply the migration, the following DDL is generated:

```sql
PRAGMA foreign_keys = 0;

CREATE TABLE __ef_temporary_Emp__ ( 
    Empno integer NOT NULL,
    Deptno integer NOT NULL,
    Ename varchar(200) NOT NULL,
    PRIMARY KEY (Empno),
    FOREIGN KEY (Deptno) REFERENCES Dept (Deptno) ON DELETE CASCADE
);

INSERT INTO __ef_temporary_Emp__ (Empno, Deptno, Ename)
SELECT Empno, Deptno, Ename
  FROM Emp;

DROP TABLE Emp;

ALTER TABLE __ef_temporary_Emp__
RENAME TO Emp;

PRAGMA foreign_keys = 1;

CREATE INDEX IX_Emp_Deptno ON Emp (Deptno);

CREATE INDEX IX_Emp_Ename ON Emp (Ename);
```

Conclusion

We are glad to present the updated dotConnect for SQLite with new features, and we are going to further improve Entity Framework Core support in the provider. We look forward to your [feedback](https://www.devart.com/dotconnect/sqlite/feedback.html) and suggestions for further improvements and new features. Meanwhile, our top-priority feature is Entity Framework Core 6 support with all its new features, including support for new .NET 6 types such as DateOnly and TimeOnly.

Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [sqlite](https://blog.devart.com/tag/sqlite) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/)

The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
"} {"url": "https://blog.devart.com/code-review-board-re-designed.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [What’s New](https://blog.devart.com/category/whats-new) Code Review Board Re-designed By [ALM Team](https://blog.devart.com/author/alm) August 23, 2013

Summary: In Review Assistant v2.0, we have completely redesigned the Code Review Board window. We have worked to make the workflow more straightforward and the application UI more natural for Visual Studio 2012 and 2013. This is the first article in the What's New in Review Assistant 2.0 series.

While working on the new version of [Review Assistant](https://www.devart.com/review-assistant/), we studied the UX problems detected earlier. Solving many of these problems required changes to the application's interface. We had started development of Review Assistant v1.0 long before Visual Studio 2012 was released, which is why the UI was better suited to Visual Studio 2010. To keep up with the times, we decided to redesign the main window, the Code Review Board. Here is how the UI looks in the new version.

Code Review Board

Dealing with ambiguity

One of the most common UX problems was ambiguity in the review status management process. A review status could be changed by clicking an appropriate button, even when a review was collapsed. It was also possible to change it with a combobox when the review was expanded; in that case, the Apply button had to be pressed.
In the new version of Review Assistant, we have unified the way review statuses are changed. Commands that change a review status are now placed on a toolbar located above the review. These commands are available to users according to their roles and the current review status.

Review Commands

Multiple repositories in one review

One of the most wanted features was the ability to add files from multiple repositories to a single review. It was nearly impossible to implement such a feature, at least at that point in time, within the constraints of the existing interface. Here is what we have after the UI changes.

Two Repositories in a Single Review

Personalized review status

Another UX problem was going through the review list and deciding what to do about each review. Review statuses were shown as icons, and experience has shown that this indication was not sufficient. Users could hardly memorize intermediate review statuses. Having examined the review list (especially with the All reviews filter enabled), users were unable to tell whether they were involved in a review at all. To solve this problem, we decided to display personal statuses in the review list.

Personal Review Statuses

As you can see, the user can now tell at a glance what they need to do about a specific review.

Start reviewing code

[Download](https://www.devart.com/review-assistant/download.html) our peer code review tool and start reviewing code with Review Assistant for free today.
Tags [code review](https://blog.devart.com/tag/code-review) [review assistant](https://blog.devart.com/tag/review-assistant) [what's new review assistant](https://blog.devart.com/tag/whats-new-review-assistant) [ALM Team](https://blog.devart.com/author/alm)"} {"url": "https://blog.devart.com/column-level-sql-server-encryption-example-using-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) Column-Level SQL Server Encryption Example Using SQL Complete By [dbForge Team](https://blog.devart.com/author/dbforge) March 26, 2021

This article provides an overview of column-level SQL Server encryption: it defines the Always Encrypted feature, the keys it uses and their types, and describes how to enable Always Encrypted and set up a COLUMN MASTER KEY and a COLUMN ENCRYPTION KEY in dbForge SQL Complete.

Introduction

To enhance data security and protect data from unauthorized access, every company does its best to take care of customers' data, such as credit card numbers, credentials, social security numbers, and financial history. As a rule, this information is stored in databases. To keep important data safe and avoid a data breach, you can use encryption for connections, data, databases, tables, and columns.

In SQL Server, there are different encryption approaches applied at the column and database level, including certificates, asymmetric and symmetric keys, encryption algorithms, Transparent Data Encryption, Always Encrypted, and dynamic data masking. For example, with database encryption, the entire database and all associated log files, tables, and views can be encrypted and decrypted. With column-level encryption, only the data in a table column is encrypted or decrypted.

What is Always Encrypted

Always Encrypted is a feature that manages encryption and decryption keys on the client side in order to keep sensitive data secure in SQL Server databases.
It allows you to control access to your sensitive data and limit possible data loss. You can set up Always Encrypted for database columns by specifying encryption algorithms and keys.

Always Encrypted uses two encryption keys to protect data:

- A Column Encryption Key (CEK), which encrypts the data in an encrypted column.
- A Column Master Key (CMK), which encrypts the column encryption keys. At least one master key must exist before columns can be encrypted.

Always Encrypted supports two encryption types for the keys:

- Deterministic encryption generates the same encrypted value for any given plaintext value. It allows point lookups, equality joins, grouping, filtering, and indexing on encrypted table columns.
- Randomized encryption generates different encrypted values for the same plaintext value. It prevents joining, grouping, searching, and indexing on encrypted table columns.

Next, we will explore how to enable Always Encrypted for a database and create the column master and column encryption keys using dbForge SQL Complete.

Configuring Always Encrypted with SQL Complete

To set up column-level encryption with the help of SQL Complete, we will perform the following steps:

1. Create a new database and a table.
2. Insert columns with values into the table.
3. Retrieve data from the table.
4. Create a column master key.
5. Create a column encryption key.
6. Encrypt columns of the created table.

Step 1: Creating a new database and a table. In this step, we create the Shop database and the Person table using the CREATE DATABASE and CREATE TABLE statements respectively. For this, execute the following script:

```sql
CREATE DATABASE Shop;

USE Shop

CREATE TABLE Person (
    PersonId INT IDENTITY (1, 1) PRIMARY KEY
    ,Name VARCHAR(100)
    ,Password VARCHAR(6) COLLATE Latin1_General_BIN2 not null
    ,SSN VARCHAR(11) COLLATE Latin1_General_BIN2 not null
);
```

The script creates the Person table containing four columns: PersonID, Name, Password, and SSN, and defines a primary key on the PersonID column.
Also, we applied collations to two of the table columns, Password and SSN. The COLLATE clause with a binary collation (here, Latin1_General_BIN2) must be specified on columns that will be protected with Always Encrypted.

Step 2: Inserting columns with values into the table. Now, add data to the Person table by executing the following script:

```sql
INSERT INTO Person (Name, Password, SSN) 
VALUES ('James', 'dxv4cL', '417-86-5080'), 
       ('Emily', 'trv5cN', '247-13-2079');
```

The script inserts two rows with values for the Name, Password, and SSN columns.

Step 3: Retrieving data from the table. In this step, retrieve data from the Person table by executing the SELECT statement:

```sql
SELECT * FROM Person;
```

As you can see, the table displays the actual data without any encryption applied.

Step 4: Creating a column master key. In this step, we use the CREATE COLUMN MASTER KEY statement to create a column master key:

```sql
CREATE COLUMN MASTER KEY MyCMK
WITH
(
    KEY_STORE_PROVIDER_NAME = N'MSSQL_CERTIFICATE_STORE',
    KEY_PATH = N'CurrentUser/my/215BE046B1650F033DD1742BA9FCE358E820C342'
)
GO
```

where MyCMK is the name of the master key. To view the column master key you have just created, expand the Shop database in Object Explorer and locate the key in the Security > Always Encrypted Keys > Column Master Keys folder.

Step 5: Creating a column encryption key.
The next step is to create a column encryption key using the CREATE COLUMN ENCRYPTION KEY statement:

```sql
CREATE COLUMN ENCRYPTION KEY MyCEK 
WITH VALUES 
( 
    COLUMN_MASTER_KEY = MyCMK, 
    ALGORITHM = 'RSA_OAEP', 
    ENCRYPTED_VALUE = 0x016E000001630075007200720065006E00740075007300650072002F006D0079002F0032003100350062006500300034003600620031003600350030006600300033003300640064003100370034003200620061003900660063006500330035003800650038003200300063003300340032007978F4B4753C7BEE506D330B87A447DBB5BA6D94A2A2B2B4764D2041D3482FCF4A08EEF875DA058315887D67AA55E6C73D5F53D803D9E4E3D6111896E3ED2751402D60F278F30D01C40E4752ACE6C6730C67BB57B6935EBDE694FB9D2AAE4FA4E4C36BA2B1E4391B2DF78D29A8124286A2831DF6AE88F8044FBD1204F3DF731552614A8905A601D8D4696C2DC059870FF1AAC47697E9A87EFCA81C993F5CCE9AA4E574146C89A9853882FC26A13639D6BFE1275EA153D770FC22C2A6D0BA4505068EED6C8AC0929B29121C0451C57AEB8E075CD48AA86CE2B9AE1DCFB550FA7E5C4A035ACD5CFB11E9E731FF8D6F80CAA80032C8E6D3B36F5F4BD1FCD048036E64EDE366D7B6F726A94F163E68D3D50BD7D0FBDF071959100B61DB7C2F49FDCD15EBDE4703A5327CEFF7C81F328EC78B95B1F2BF7D0AA8C19BA8A44A53C463012286128B2053157FFEBD78B3F0DA2E884D300D1947E23A050DE11021BE0837E7CEFAC4959DF95E21334CD3A3DCB77C391AAEFDE01D14FB08A63AC45B459ABF754CFAE467CDBF1969AA93A16DFEDF3AFB25996EB532592B7462FD42FE18F64C176CE207AC9F7D72414C4AD4CE5AF7750D4E07318D864F611A7A30A875353DC3797301C933C7F737DB5393610F3C8411E06493751FE8637BE07038AF98C45A429B270E08435CBC20166C31F64532A4A7CF476BEBDBFED2D104F54DDE0D35702115
);
```

where MyCEK is the name of the column encryption key. To verify the column encryption key you have just created, expand the Shop database in Object Explorer and locate the key in the Security > Always Encrypted Keys > Column Encryption Keys folder.

In addition, SQL Complete now suggests the following parameters in the CREATE COLUMN ENCRYPTION KEY statement:

- the column master key
- the RSA_OAEP encryption algorithm

Step 6: Encrypting columns of the created table.
After we have prepared the environment and created the column master and column encryption keys, we can start encrypting the Password and SSN columns of the Person table by enabling Always Encrypted in SSMS. To encrypt values in the columns, do the following:

1. In Object Explorer, right-click the Person table and select Encrypt Columns.
2. In the Always Encrypted window that opens, switch to the Column Selection tab, select the Password and SSN checkboxes and Randomized as the encryption type for both columns, and then click Next.
3. On the Master Key Configuration tab, we can skip this step and click Next, because we have already configured the column master and column encryption keys.
4. On the Run Settings tab, select Proceed to finish now and click Next.
5. On the Summary tab, verify the parameters you defined in the previous steps and click Finish.
6. On the Results tab, monitor the encryption progress.

Now, retrieve the data from the Person table by executing the SELECT statement:

```sql
SELECT * FROM Person;
```

As you can see, the Password and SSN columns are now encrypted.

Have you faced changes in database data or structure and want to check the query history? See [how you can do it](https://blog.devart.com/sql-server-query-history.html) in SQL Server.

Conclusion

In this article, we have explored the Always Encrypted feature, the column master and column encryption keys and their types, and shown how to configure column-level encryption in SQL Server using the SQL Complete tool. [Download](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) a free 30-day trial version of dbForge SQL Complete to evaluate the features the tool provides.
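One practical caveat worth noting here (standard Always Encrypted behavior, not specific to SQL Complete): client tools only display decrypted values when the connection explicitly opts in to Always Encrypted. In SSMS, this is done by adding a parameter on the Additional Connection Parameters tab of the connection dialog:

```
Column Encryption Setting=Enabled
```

Without this setting, queries against the encrypted columns return the raw ciphertext, as in the final SELECT above.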
Tags [column encryption key](https://blog.devart.com/tag/column-encryption-key) [column level SQL Server encryption](https://blog.devart.com/tag/column-level-sql-server-encryption) [column master key](https://blog.devart.com/tag/column-master-key) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/comic-video-about-developing-dbforge-studio-for-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Comic video about developing dbForge Studio for MySQL By [dbForge Team](https://blog.devart.com/author/dbforge) June 17, 2011

Our company has been working on its own [MySQL GUI client](https://www.devart.com/dbforge/mysql/studio/) for over six years. Many features were implemented in the product in response to our users' requests, which served as our task originators. We are glad that the development of our MySQL front end goes on and on, and we have now released version 5.0 of the product.

For the release of version 5.0, we decided to make a special video about the product's history. You can see the result below.

Funny video about the history of our MySQL GUI tool

We did not set out to have a professional video made by clip-makers. The video was shot frame by frame with an ordinary digital camera, and the JPG files were then rendered into the MP4 video format. The shooting took almost seven hours. All the models were made by hand from plasticine by one of our employees, whose hobby is design and model-making. All of us want to thank him for that!
Tags [MySQL](https://blog.devart.com/tag/mysql) [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/coming-soon-dbforge-schema-compare-for-sql-server-v1-10.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Coming Soon: dbForge Schema Compare for SQL Server v1.10 By [dbForge Team](https://blog.devart.com/author/dbforge) May 27, 2009

After analyzing our users' feedback on the first release of dbForge Schema Compare for SQL Server, we decided to change the product roadmap and release version 1.1 before adding new major features. So what new features will this version include?

- Improved performance. We have cut comparison time by more than half, which is important for large databases.
- Synchronization of object permissions. Permissions granted on a specific object can now be compared and synchronized along with the object itself.
- RULE and DEFAULT support. SQL Server RULE and DEFAULT objects can now be compared and synchronized.
- Improved table synchronization. We have added special processing to table synchronization that avoids data loss in several cases.

The new version is scheduled for release in the first week of June 2009.

Tags [Schema Compare](https://blog.devart.com/tag/schema-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/compare-and-find-data-differences-between-two-tables-in-sql-server-with-ssis-and-dbforge-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Compare and Find Data Differences Between Two Tables in SQL Server With SSIS and dbForge Tools By [dbForge Team](https://blog.devart.com/author/dbforge) June 25, 2020

Systems often work with distributed databases that contain distributed tables. Distribution is supported by various mechanisms, including replication. In this case, a particular data segment must be kept constantly in sync, and the synchronization itself must be verified. This is where the need to compare data in two tables arises.
Table of contents:
- Comparing database schemas using SQL Server Data Tools
- Comparing database schemas with the help of dbForge Schema Compare
- Comparing database data using SSIS
- Comparing database data with the help of dbForge Data Compare
- Conclusion

Before comparing data in two tables, you need to make sure that the schemas of the compared tables are either the same or acceptably different. By "acceptably different" we mean a difference in the definitions of the two tables that still lets the data be compared correctly. For example, the types of corresponding columns in the compared tables should map without data loss. Let us compare the SQL Server schemas of the two Employee tables from the two databases JobEmpl and JobEmplDB. For further work, recall the definitions of the Employee table in the JobEmpl and JobEmplDB databases:

```sql
USE [JobEmpl]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Employee](
	[EmployeeID] [int] IDENTITY(1,1) NOT NULL,
	[FirstName] [nvarchar](255) NOT NULL,
	[LastName] [nvarchar](255) NOT NULL,
	[Address] [nvarchar](max) NULL,
	[CheckSumVal] AS (checksum((coalesce(CONVERT([nvarchar](max),[FirstName]),N'')+coalesce(CONVERT([nvarchar](max),[LastName]),N''))+coalesce(CONVERT([nvarchar](max),[Address]),N''))),
	[REPL_GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL,
 CONSTRAINT [PK_Employee_EmployeeID] PRIMARY KEY CLUSTERED
(
	[EmployeeID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [dbo].[Employee] ADD CONSTRAINT [Employee_DEF_REPL_GUID] DEFAULT (newsequentialid()) FOR [REPL_GUID]
GO
```

and

```sql
USE [JobEmplDB]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Employee](
	[EmployeeID] [int] IDENTITY(1,1) NOT NULL,
	[FirstName] [nvarchar](255) NOT NULL,
	[LastName] [nvarchar](255) NOT NULL,
	[Address] [nvarchar](max) NULL,
 CONSTRAINT [PK_Employee_EmployeeID] PRIMARY KEY CLUSTERED
(
	[EmployeeID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
```

Comparing database schemas using SQL Server Data Tools

With the help of Visual Studio and [SSDT](https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-ver15), you can compare database schemas. To do this, create a new project named JobEmployee: Img.1. Creating the JobEmployee project. Then import the database: right-click the project and, on the shortcut menu, select Import > Database: Img.2. Selecting database import. Next, click Select connection and, on the Browse tab, set up the connection to the JobEmpl database as follows: Img.3. Configuring the JobEmpl database import. Next, click Start to begin the import of the JobEmpl database: Img.4. Starting the database import process. A window then shows the progress of the database import: Img.5. The database import progress window. When the database import is complete, click Finish: Img.6. Finishing the database import. Once it is finished, the JobEmployee project will contain directories, subdirectories, and database object definitions in the following form: Img.7. The JobEmployee project after importing the JobEmpl database. In the same way, we create a similar JobEmployeeDB project and import the JobEmplDB database into it: Img.8. The JobEmployeeDB project after the JobEmplDB database import. Now, right-click the JobEmployee project and, on the shortcut menu, select Schema Compare: Img.9. Opening the database schema compare window. This will bring up the database schema compare window.
In the window, select the two projects as source and target, and then click Compare to start the comparison: Img.10. Database schema comparison window. We can see that, despite the differences between the definitions of the Employee tables in the two databases, the columns we need for comparison have identical data types. This means the difference in the schemas of the Employee tables is acceptable, so we can compare the data in these two tables. We can also use other tools to compare database schemas, such as dbForge Schema Compare for SQL Server.

Comparing database schemas with the help of dbForge Schema Compare

Now, to compare database table schemas, we use an [SQL diff tool](https://www.devart.com/dbforge/sql/schemacompare/), dbForge Schema Compare for SQL Server, which is also included in [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/). For this, in [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15), right-click the first database and, on the shortcut menu, select Schema Compare > Set as Source: Img.11. Selecting the source database for schema comparison. We then transfer JobEmplDB, the second database, to the Target area and click the green arrow between source and target: Img.12. Setting up the database schema comparison. In the database schema comparison project that opens, simply click Next: Img.13. Choosing the source and target for schema comparison. Leave the following settings at their defaults and click Next: Img.14. Schema comparison options. On the Schema Mapping tab, we also leave everything at its defaults and click Next: Img.15. The Schema Mapping tab. On the Table Mapping tab, select the required Employee table and, to the right of the table name, click the ellipsis: Img.16. Selecting the Employee table. The table mapping window opens up: Img.17.
Column mapping of two tables. In our case, only four fields are mapped, because the last two fields exist only in the JobEmpl database and are absent from the JobEmplDB database. This setting is useful when column names in the source and target tables do not match. The Column details table displays the column definition details for the two tables: on the left, from the source database; on the right, from the target database. Now, click OK: Img.18. Column mapping of two tables-2. To start the database schema comparison, click Compare: Img.19. Starting the schema comparison. A progress bar appears: Img.20. Schema comparison progress. We then select the required Employee table: Img.21. The comparison of the Employee table schema. At the bottom, you can see the table definition code of the source database on the left and of the target database on the right. As before, the definitions of the Employee table in the JobEmpl and JobEmplDB databases show an admissible difference, which is why we can compare the data in these two tables. Let us now move on to comparing the data in the two tables itself. By the way, to learn how to involve dbForge Schema Compare in the CI process, feel free to watch [this video](https://youtu.be/hllTzoXvoO8).

Comparing database data using SSIS

Let's first make a comparison using SSIS. For this, you need to have SSDT installed. We create an Integration Services project in [Visual Studio](https://visualstudio.microsoft.com/) and name it IntegrationServicesProject: Img.22. Creating an Integration Services project. We then create three connections:
- to the source JobEmpl database
- to the target JobEmplDB database
- to the JobEmplDiff database, where the table of differences will be stored

Img.23. Making a database connection. The new connections will then be displayed in the project as follows: Img.24.
Displaying the created connections. Then, in the project, on the Control Flow tab, we create a data flow task and name it "data flow task": Img.25. Creating a data flow task. Let us now switch to the data flow and create an OLE DB Source element: Img.26. Creating a data source. On the Columns tab, we select the fields required for comparison: Img.27. Selecting the required fields. Next, we leave the Error Output tab at its defaults and click OK: Img.28. The "Error Output" tab. Now, right-click the created data source and, on the shortcut menu, select Show Advanced Editor: Img.29. Selecting "Show Advanced Editor…". On the Input and Output Properties tab, set IsSorted to True in the properties of the OLE DB source output and the OLE DB source error output: Img.30. Setting the IsSorted property to True. Next, in each Output Columns group, set the SortKeyPosition property of the EmployeeID column to 1. That is, we sort by the EmployeeID field value in ascending order: Img.31. Choosing to sort by the EmployeeID field value in ascending order. Similarly, we create and configure the data source for the JobEmplDB database, which gives us two data sources in the data flow task: Img.32. Created data sources. Now, we create a Merge Join element: Img.33. Creating "Merge Join". Please note that we merge the tables using a full outer join. We then connect our sources to the created Merge Join element as follows: Img.34. Connecting sources to "Merge Join". We make JobEmpl the left input and JobEmplDB the right input. In fact, the order is not that important; it can also be done the other way around.
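One detail worth stressing: setting IsSorted to True only declares to SSIS that the rows arrive sorted; it does not sort anything itself. A table-backed OLE DB source does not guarantee row order, so a safer setup is to use a SQL command as the source and enforce the order explicitly. A minimal sketch of such a source query:

```sql
-- Used as the SQL command text of each OLE DB source so that the rows
-- really arrive ordered by EmployeeID, matching SortKeyPosition = 1.
SELECT [EmployeeID], [FirstName], [LastName], [Address]
FROM [dbo].[Employee]
ORDER BY [EmployeeID] ASC;
```

With both inputs genuinely sorted this way, the Merge Join element can pair the rows correctly.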
In the JobEmplDiff database, we create a table called EmployeeDiff, where we are going to put the data differences:

```sql
USE [JobEmplDiff]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[EmployeeDiff](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[EmployeeID] [int] NULL,
	[EmployeeID_2] [int] NULL,
	[FirstName] [nvarchar](255) NULL,
	[FirstName_2] [nvarchar](255) NULL,
	[LastName] [nvarchar](255) NULL,
	[LastName_2] [nvarchar](255) NULL,
	[Address] [nvarchar](max) NULL,
	[Address_2] [nvarchar](max) NULL,
 CONSTRAINT [PK_EmployeeDiff_1] PRIMARY KEY CLUSTERED
(
	[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
```

Now, let us get back to our project and, in the data flow task, create a Conditional Split element: Img.35. Creating a conditional split. In the Condition field for NotMatch, type the following expression:

```
(ISNULL(EmployeeID) || ISNULL(EmployeeID_2))
|| (REPLACENULL(FirstName,"") != REPLACENULL(FirstName_2,""))
|| (REPLACENULL(LastName,"") != REPLACENULL(LastName_2,""))
|| (((Address != Address_2) && (!ISNULL(Address)) && (!ISNULL(Address_2))) || (ISNULL(Address) != ISNULL(Address_2)))
```

This expression is true when, for the same EmployeeID value, the fields do not match (taking NULL values into account). It is also true when an EmployeeID value from one table has no match in the other, that is, when a row with that EmployeeID exists in only one of the two tables.
You can obtain a similar result as a selection with the following T-SQL query:

```sql
SELECT e1.[EmployeeID] AS [EmployeeID],
	e2.[EmployeeID] AS [EmployeeID_2],
	e1.[FirstName] AS [FirstName],
	e2.[FirstName] AS [FirstName_2],
	e1.[LastName] AS [LastName],
	e2.[LastName] AS [LastName_2],
	e1.[Address] AS [Address],
	e2.[Address] AS [Address_2]
FROM [JobEmpl].[dbo].[Employee] AS e1
FULL OUTER JOIN [JobEmplDB].[dbo].[Employee] AS e2
	ON e1.[EmployeeID] = e2.[EmployeeID]
WHERE (e1.[EmployeeID] IS NULL)
OR (e2.[EmployeeID] IS NULL)
OR (COALESCE(e1.[FirstName], N'') <> COALESCE(e2.[FirstName], N''))
OR COALESCE(e1.[LastName], N'') <> COALESCE(e2.[LastName], N'')
OR COALESCE(e1.[Address], N'') <> COALESCE(e2.[Address], N'');
```

Now, let us connect the Merge Join and Conditional Split elements: Img.36. Connecting the "Merge Join" and "Conditional Split" elements. Next, we create an OLE DB destination element: Img.37. Creating a destination element. Now, we map the columns: Img.38. Column mapping. We leave the Error Output tab at its defaults: Img.39. The "Error Output" tab-2. We can now join the Conditional Split and OLE DB JobEmplDiff elements. As a result, we get a complete data flow: Img.40. A complete data flow. Let us run the resulting package: Img.41. The data flow at work. Upon successful completion, all the package's elements turn into green circles: Img.42. A package processed without errors. If errors occur, they are displayed as red circles instead of green ones; to resolve the issues, you need to read the log files. To analyze the data differences, we retrieve the necessary data from the EmployeeDiff table of the JobEmplDiff database:

```sql
SELECT [ID]
      ,[EmployeeID]
      ,[EmployeeID_2]
      ,[FirstName]
      ,[FirstName_2]
      ,[LastName]
      ,[LastName_2]
      ,[Address]
      ,[Address_2]
FROM [JobEmplDiff].[dbo].[EmployeeDiff]
```

Img.43.
Data difference-1. Here you can see rows of the Employee table from the JobEmpl database where Address is not set and where the FirstName and LastName values are mixed up in some columns. There is also a large number of rows that exist in JobEmpl but are missing from JobEmplDB: Img.44. Data difference-2.

Comparing database data with the help of dbForge Data Compare

Let's now compare the data with the help of [dbForge Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/), which is also included in SQL Tools. For this, in SSMS, right-click the JobEmpl database and, on the shortcut menu, choose Data Compare > Set as Source: Img.45. Choosing a data source for data comparison. Choose the second database, JobEmplDB, as Target and click the green arrow between source and target: Img.46. Starting the database data comparison setup. In the database comparison project that opens, click Next: Img.47. Defining the source and target for data comparison. We leave the following settings at their defaults and click Next: Img.48. Data comparison options. On the Mapping tab, we choose the desired Employee table and click the ellipsis: Img.49. Choosing the Employee table. The field mapping window opens: Img.50. Field mapping. In our case, only four fields are mapped, because the last two fields exist only in the JobEmpl database and are absent from the JobEmplDB database. This setting is quite convenient when the column names in the source and target tables do not match. Column details displays the column definitions from the two tables: on the left, from the source database; on the right, from the target database. Let's click OK. Note that in Comparison Key you can set another key, including a custom one. Now, click Compare to start comparing the data in the databases: Img.51. Starting data comparison. After that, the progress window appears: Img.52.
Data comparison progress. Here we can see that 2,047,000 records present in the Employee table of the JobEmpl database are missing from the Employee table of the JobEmplDB database, and that the remaining 1,000 records differ because they have no Address value in the source and because the FirstName and LastName field values are swapped in the source table. It is also possible to set the visibility and sorting of fields: Img.53. Data difference-3. Img.54. Setting the visibility and sorting of columns. There is also a Find and Replace option: Img.55. Selecting the Find and Replace option. Img.56. Find and Replace. You can use [dbForge Search](https://www.devart.com/dbforge/sql/search/) to search through the selected data. See also: [How to compare two MySQL tables](https://www.devart.com/dbforge/mysql/studio/mysql-data-comparison.html)

Conclusion

To sum up, we have studied possible ways of comparing the schemas and data of two databases. To illustrate this, we compared the schemas and data of the Employee table in the JobEmpl and JobEmplDB databases. dbForge Schema Compare and Data Compare are [Devart tools](https://www.devart.com/), included in SQL Tools, that let us compare database schemas without first creating two projects for the compared databases. They also let users compare database data without creating any extra tables.
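As a closing aside, the CheckSumVal computed column defined on the JobEmpl table at the beginning of this article suggests a lightweight way to pre-filter differing rows without comparing every column. The sketch below is an illustration under assumptions (both databases on one instance, rows matched on EmployeeID), not part of the tools described above; note also that CHECKSUM can collide, so it is only a pre-filter, not a proof of equality:

```sql
-- EmployeeIDs whose per-row checksums differ between the two tables.
-- JobEmplDB has no CheckSumVal column, so we recompute the same expression.
SELECT e1.[EmployeeID]
FROM [JobEmpl].[dbo].[Employee] AS e1
INNER JOIN [JobEmplDB].[dbo].[Employee] AS e2
        ON e1.[EmployeeID] = e2.[EmployeeID]
WHERE e1.[CheckSumVal] <> CHECKSUM(
      COALESCE(CONVERT(nvarchar(max), e2.[FirstName]), N'')
    + COALESCE(CONVERT(nvarchar(max), e2.[LastName]), N'')
    + COALESCE(CONVERT(nvarchar(max), e2.[Address]), N''));
```

Rows flagged here would still need a full column-by-column comparison to confirm the difference.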
[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Comparing Diffs Between Two Oracle Database Schemas By [dbForge Team](https://blog.devart.com/author/dbforge) February 28, 2011

When developing databases, we frequently need to check what changes were made to a database's schema and whether any mistakes were made; if there are any, we should roll back to the working version saved in the source control system. To do this, it is enough to compare the schemas of versions 1.1 and 1.2 and generate an update script using any database diff tool, including our [dbForge Schema Compare for Oracle](https://www.devart.com/dbforge/oracle/schemacompare/). So what functionality should be available in such a tool? Firstly, the tool should be easy to use and user-friendly. A user should not be obliged to study a pile of topics in the tool documentation before using it. It should be enough to perform the following steps: 1) Select the source and target 2) Select a schema or several schemas (if the tool provides such a possibility) for comparison 3) Tune the comparison process by checking the needed options (an optional step) 4) View the comparison results 5) Tune the script generation options (an optional step) 6) Generate an update script and/or execute it directly against the target database. Secondly, the tool should support comparison of all object types you might have in your schemas; otherwise, how is the tool supposed to find differences in them? And finally, the tool should be fast.
This is especially important when several schemas are compared at once or when the project itself is large. Now let's look at Oracle SQL Developer and its alternative, [dbForge Schema Compare for Oracle](https://www.devart.com/dbforge/oracle/schemacompare/). Both offer schema comparison functionality.

Oracle SQL Developer

The Schema Comparison Wizard of [Oracle SQL Developer](https://www.oracle.com/database/technologies/appdev/sql-developer.html) can be opened through the Tools -> Database Diff menu. After you click this menu item, you get a message telling you that the feature uses Oracle Change Management, a paid option of the Oracle database, and you have to acknowledge that you have a proper license to use it. Oracle SQL Developer: Database Diff requires the Oracle Change Management Pack to be licensed. After that, you choose a Source Connection and a Target Connection from the existing connections. You can also choose the object types to compare during this step. The available object types are Tables, Views, Indexes, Package Spec, Package Body, Procedure, Functions, Triggers, Types, Sequence, Materialized View, Materialized View Logs, Synonyms, and Database Link. Oracle SQL Developer: Diff Wizard Source and Destination page. Press Next to open the Specify Objects page, where you can select objects for comparison and filter them by type. Press Next again to open the Diff Summary page, where the Finish button becomes active, and press Finish. After the comparison completes (it did not take much time), the result is shown as a Diff Report. Oracle SQL Developer: Schema Diff Report. Oracle SQL Developer managed to compare the test schemas, but it did not meet the requirements we specified at the beginning of the article. These are: 1. The tool requires a Change Management license for your Oracle database.
The price depends on your Oracle Database license, but it is quite high if you only need to compare a couple of schemas using the free SQL Developer. 2. There is no way to compare several schemas at once. 3. There are no comparison options as such, so you cannot tune the comparison process. 4. The Diff Report displays only the number of differences; they cannot be viewed in a side-by-side DDL diff. Also, the Diff Report provides an update script only for the single object selected in the grid and does not allow viewing the generated update script for several objects. 5. There is no Synchronization wizard with options for tuning the synchronization process. As we can see, there are quite a few requirements the tool does not meet. We did our best to correct all these shortcomings in our own schema comparison tool.

dbForge Schema Compare for Oracle

Let's run dbForge Schema Compare for Oracle, an [Oracle SQL Developer alternative](https://www.devart.com/dbforge/oracle/studio/alternative-to-oracle-sql-developer.html), and press New Schema Comparison on the Start Page. After this, we create connections for the source and target (which cannot be done from within Oracle SQL Developer's wizard). Schemas with identical names are mapped automatically; schemas can also be mapped manually, and several schemas can be included in the comparison. dbForge Schema Compare for Oracle: Choose Source and Target schemas. On the next two pages, you can fine-tune the comparison process with the help of options and select the needed object types for comparison of their DDL: Tables, Views, Packages, Procedures, Functions, Triggers, Array Types, Object Types, Table Types, Sequences, Materialized Views, Materialized View Logs, Synonyms, Database Links, and XML Schemas. Unlike in Oracle SQL Developer, the Compare button is available on the first page of the wizard.
dbForge Schema Compare for Oracle: Comparison Options and Object Filter. After the comparison completes, a user-friendly document with the comparison results is shown. In this document, you can inspect the differences visually, view the update script (which can be generated for several objects selected in the grid at once), include and exclude objects from synchronization, and filter or group objects. To perform synchronization, it is enough to call the corresponding Synchronization wizard. dbForge Schema Compare for Oracle: Schema Comparison Document. The Synchronization wizard allows selecting the path for saving the generated script to disk, executing the script directly against the target database, fine-tuning the synchronization process, generating the update script, and viewing warnings and the action plan. dbForge Schema Compare for Oracle: Synchronization Script Output and Sync Options. Well, let's sum everything up. The Oracle SQL Developer alternative developed by Devart meets the requirements we specified at the beginning. However, dbForge Schema Compare for Oracle cannot yet be used in a continuous database development process for [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/), because it does not allow running the comparison and generating the update script from the command line. This is true of the first version, but we have taken it into account, and it will be corrected in the next version of the product.
2 COMMENTS

Krish, November 5, 2013 at 10:37 am: Hi, while comparing different schemas in Oracle SQL Developer, it shows differences for objects (Tables, Views, Indexes, Package Spec, Package Body, Procedure, Functions, Triggers, Types) even when only white space was added. Is there any way to ignore white spaces when taking the difference?
triples, March 28, 2018 at 5:05 pm: Hi, is there a way to export the differences from the comparison to an Excel/CSV file to get the data instead of a SQL delta?

Comments are closed.

[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [Uncategorized](https://blog.devart.com/category/uncategorized) Comparing the Best Database Management Systems By [Anna Bilchenko](https://blog.devart.com/author/annabil) February 25, 2025

With the ever-growing list of databases, SQL and NoSQL, each with its own strengths and use cases, it's essential to understand how these systems differ and which one is right for your specific requirements. In this article, we'll break down their features, advantages, and real-world use cases, helping you make the best choice for your project.

Contents
- What is a Database Management System (DBMS)?
- List of the Best Relational Database Management Systems (RDBMS)
- Most Popular Non-Relational Databases (NoSQL)
- Lightweight and Embedded Databases
- Best Cloud-Based Databases
- How to Choose the Right Database for Your Application
- Conclusion

What is a Database Management System (DBMS)?

A Database Management System (DBMS) is software designed to store, manage, and retrieve data efficiently. It sits at the core of most modern applications and is indispensable for ensuring that your data is well organized, easily accessible, and fully secure.
A DBMS allows you to create, read, update, and delete data in a well-organized way, often through a query language such as SQL. Scalability, performance, and security are the three cornerstones for those searching for the best database management software. With proper scalability, your system will be able to handle growing amounts of data and user traffic. Performance matters most when you need fast and efficient data retrieval. Security, in turn, is all about protecting sensitive information and controlling who can access the data. Developers must thoroughly understand these key elements before choosing the right DBMS type for their project.

Different Types of DBMS: SQL vs. NoSQL

The [types of database management systems](https://www.devart.com/difference-between-rdbms-and-dbms/) generally split into two main classes: SQL (relational) and NoSQL (non-relational). Each offers unique features suited to different types of applications.

SQL Databases (Relational):
- Store structured data in tables with predefined schemas. If you're just getting started, it might help to understand the [MySQL meaning](https://blog.devart.com/what-is-mysql-definition-features-benefits-explained.html), as it's one of the most widely used relational databases globally.
- Use SQL for querying and data management, which promotes ACID compliance
- Ideal for applications that need complex relationships and data integrity to function, e.g., financial systems
- Can have scalability and flexibility issues when handling large or dynamic datasets

NoSQL Databases (Non-relational):
- Store unstructured or semi-structured data, with flexible schemas
- Support diverse models such as document, key-value, or graph databases
- Perfect for high-speed read/write operations and handling big data
- May sacrifice ACID compliance when improved scalability and performance are required

Hybrid approaches find the middle ground, combining features of [both systems mentioned above](https://blog.devart.com/sql-vs-nosql.html). Thanks to that, they can prove to be a more flexible solution if your organization prefers a blend of both.

dbForge Edge is compatible with [35+ database and cloud services](https://www.devart.com/dbforge/edge/database-connections.html). Explore how it can amplify your workflow! [Overview](https://www.devart.com/dbforge/edge/) [Try for free](https://www.devart.com/dbforge/edge/download.html)

List of the Best Relational Database Management Systems (RDBMS)

[Relational Database Management Systems](https://www.devart.com/what-is-rdbms/) (RDBMS) are a cornerstone of database management, specifically for applications that require structured data storage and consistency. With their use of tables, rows, and columns, RDBMS offer reliable and scalable options best suited for businesses and applications working with complex data relationships.

Oracle — Enterprise-Grade Performance

Oracle is a popular database software lauded for its scalability, security features, and performance. It is trusted by top-tier companies worldwide, designed to handle large data volumes, and supports demanding transactional operations.
Key features and use cases:
- Enterprise-grade scalability combined with in-memory processing
- AI Vector Search enables semantic and value-based search of structured and unstructured data
- Suited for large-scale applications (e.g., financial systems)

Pros:
- Exceptional performance and scalability with Real Application Clusters (RAC)
- Robust security with encryption and data masking

Cons:
- High licensing costs
- Complex management requiring in-depth knowledge

Boost your Oracle development with [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), an advanced IDE designed specifically to accelerate PL/SQL coding.

MySQL — Reliable and Easy to Use

MySQL is one of the most popular database management systems: reliable and open-source. Known for its simplicity and dependable transaction handling, MySQL is an ideal choice for businesses looking for a cost-effective database.

Key features and use cases:
- Open-source with ACID compliance, ensuring reliable transactions
- Perfect for small to medium-scale web apps, e-commerce platforms, and content management systems
- Offers support for JSON and vector data types

Pros:
- Easy setup, reinforced with a user-centric interface
- Strong community support and detailed documentation
- High performance for read-heavy workloads

Cons:
- Limited scalability for large-scale applications
- Lacks some advanced features found in PostgreSQL or Oracle

Looking to improve your MySQL development? [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a comprehensive IDE for MySQL and MariaDB, designed to simplify every aspect of database management.

Microsoft SQL Server — Seamless Microsoft Integration

Microsoft SQL Server is a relational database management system offering seamless integration with Microsoft's ecosystem, making it the go-to choice for businesses heavily invested in Microsoft technologies.
Key features and use cases:

- Advanced security features and integration with Microsoft tools
- Robust business intelligence (BI) and reporting capabilities
- Well-suited for large-scale enterprise applications built on the Microsoft ecosystem

Pros:

- Strong support and seamless integration with other Microsoft products
- Optimized for analytics, business intelligence, and reporting

Cons:

- High licensing costs and total cost of ownership
- Limited cross-platform flexibility compared with open-source databases

Interested in testing out SQL Server tools before committing? With Devart's free trial, you can explore fully functional versions of our software products [for 30 days at no cost](https://www.devart.com/dbforge/sql/).

PostgreSQL — Advanced and Highly Extensible

PostgreSQL is a popular DBMS known for its robustness, complex query support, JSON handling, and customizable extensions, which make it suitable for businesses requiring high data integrity and flexibility.

Key features and use cases:

- Advanced querying capabilities, robust JSON support, and extensibility through plugins
- Well-suited for applications requiring complex data analytics, reporting, and high data integrity

Pros:

- High data integrity and consistency with ACID compliance
- Active community and an extensive ecosystem of developer tools, including custom extensions

Cons:

- Slower performance for simple, high-transaction queries
- Steeper learning curve for newcomers and developers unfamiliar with its advanced features

Enhance your PostgreSQL development with [dbForge for PostgreSQL](https://www.devart.com/dbforge/postgresql/), offering powerful tools such as Studio, Data Compare, and Schema Compare.

IBM Db2 — Optimized for Analytics and AI

IBM Db2 is a robust, enterprise-grade RDBMS optimized for advanced analytics, AI, and machine learning workloads.
Its support for hybrid cloud environments makes it a preferred choice for data-intensive enterprises.

Key features and use cases:

- Excellent optimization for AI-driven and machine learning applications
- Hybrid cloud support for seamless scalability and adaptability
- Suitable for enterprises focused on robust data analytics and business intelligence

Pros:

- Outstanding performance for complex analytics workloads
- Flexible deployment options across on-premises and cloud platforms

Cons:

- High cost and resource-intensive for smaller businesses
- Requires specialized expertise for effective management

MariaDB — Improved Scalability Over MySQL

MariaDB is an open-source relational database. Developed as a fork of MySQL, it offers improved scalability and performance for modern applications and is well-suited to enterprise-level web applications and cloud-native deployments.

Key features and use cases:

- Improved performance and scalability compared to MySQL, with support for distributed SQL and advanced storage engines
- Ideal for enterprise applications, ranking high on DBMS software lists

Pros:

- Open-source with enhanced security and scalability features
- Active development community ensuring rapid updates and new functionality

Cons:

- New features and enhancements may require additional learning for MySQL users
- Some MySQL-native GUI tools or third-party integrations may support MySQL better
- Upgrading from MySQL to MariaDB is usually straightforward, but reverting to MySQL can be more complex due to feature differences

The dbForge MySQL product line is fully compatible with MariaDB. [Try it now!](https://www.devart.com/dbforge/mysql/)

Most Popular Non-Relational Databases (NoSQL)

Non-relational databases, or NoSQL databases, are designed to handle unstructured or semi-structured data, offering flexibility and speed.
They are well-suited to modern use cases such as real-time analytics, big data processing, and applications requiring horizontal scalability and high-speed read/write operations.

MongoDB — Flexible and Developer-Friendly

MongoDB is a flexible, developer-oriented NoSQL database built to store data in a document-oriented format using JSON-like structures. Its adaptability and ease of use make it ideal for dynamic applications and agile development environments.

Key features and use cases:

- Document-focused design with JSON-like storage for unstructured data
- Excellent for real-time analytics, Internet of Things (IoT) applications, and rapid prototyping

Pros:

- Flexible schema design that accommodates evolving data needs
- Horizontal scalability for large, distributed datasets

Cons:

- Less suitable for workloads that demand strict data consistency
- Performance can be less favorable for highly complex queries

Elasticsearch — Lightning-Fast Full-Text Search

Elasticsearch is a full-text search engine designed for near-instant data indexing and retrieval. It powers applications that require precise, high-speed searches across vast datasets.

Key features and use cases:

- Full-text search capabilities with extremely fast indexing and query execution
- Perfect for logging, monitoring, and deep data search applications

Pros:

- Exceptional search performance and scalability, handling big data volumes effortlessly
- Distributed, scalable architecture that supports fault tolerance and high availability

Cons:

- High memory usage, which can grow with large-scale deployments
- Complex setup and maintenance requiring expertise to reach optimal performance

Redis — Ultra-Fast In-Memory Data Store

Redis is an exceptionally fast in-memory data store known for its speed and versatility.
It supports a broad range of use cases, including caching, real-time analytics, and pub/sub messaging, which makes it a go-to choice for performance-critical applications.

Key features and use cases:

- In-memory architecture suited for low-latency data access
- Perfect for session storage, caching, and leaderboard systems

Pros:

- Lightning-fast, ideal for high-performance workloads
- Simple to implement as a caching layer to boost application performance

Cons:

- Limited data persistence options that reduce long-term storage reliability
- High memory costs for large datasets

Cassandra — Highly Scalable for Big Data

Cassandra is a highly scalable wide-column store database created for big data applications. Noted for its fault tolerance and consistent availability, it is well-suited for distributed systems handling massive data volumes across numerous locations.

Key features and use cases:

- Wide-column storage model that excels in big data scenarios
- Ideal for distributed applications that require high write throughput and fault tolerance

Pros:

- Exceptional write performance for large-scale systems
- Resilient architecture with no single point of failure

Cons:

- Limited querying flexibility compared to traditional SQL databases
- Steep learning curve for configuration and maintenance

OrientDB — Multi-Model Flexibility

OrientDB is a versatile multi-model database that supports both graph and document structures, offering flexibility for diverse application needs. Its ability to manage complex relationships makes it ideal for graph-based applications such as social networks or fraud detection systems.
Key features and use cases:

- Supports both graph and document data models for hybrid applications
- Perfect for applications requiring relationship analysis and flexible schema design

Pros:

- Combines graph and document capabilities in one platform
- Highly extensible for tailored use cases

Cons:

- Smaller community compared to larger NoSQL solutions
- Performance may vary depending on workload and configuration

Lightweight and Embedded Databases

Lightweight and embedded databases are the go-to option when you need minimal resource usage, whether in constrained environments or specific embedded systems. Compared to other types, they prioritize the efficiency, simplicity, and reliable data management that mobile apps, IoT devices, and standalone software solutions require.

SQLite — Lightweight and Serverless

SQLite is a self-contained, serverless database management system that stands out for its simplicity and efficiency. It is designed to minimize resource usage, which makes it ideal for embedded systems and lightweight applications.

Key features and use cases:

- Serverless architecture that requires no setup
- Perfect for mobile apps, IoT devices, and local desktop applications

Pros:

- Lightweight and easy to integrate
- Requires no configuration or server management

Cons:

- Lacks scalability for larger applications or high-concurrency needs
- In most cases, limited to single-user operations

Best Cloud-Based Databases

Cloud-based databases are scalable, flexible, and highly available, which is a big advantage for modern applications because no on-premises infrastructure is needed. They offer easy management, high reliability, and seamless access to data from anywhere.
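SQLite's serverless, zero-configuration nature described above is easy to demonstrate with Python's built-in sqlite3 module. This is a minimal sketch (the table and file names are illustrative only): the database is just a file, created on first connect and readable later with nothing but its path.

```python
import os
import sqlite3
import tempfile

# No server process to install or start: connecting creates the database file.
path = os.path.join(tempfile.mkdtemp(), "app.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO settings VALUES ('theme', 'dark')")
conn.commit()
conn.close()

# Reopen later, or from another process, using only the file path.
conn = sqlite3.connect(path)
theme = conn.execute("SELECT value FROM settings WHERE key = 'theme'").fetchone()[0]
print(theme)  # dark
conn.close()
```

This self-contained model is exactly why SQLite fits embedded and mobile scenarios, and also why it struggles with many concurrent writers.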
Amazon DynamoDB — Fully Managed and Scalable

Amazon DynamoDB is a fully managed NoSQL database with seamless scaling and low-latency performance, which makes it an ideal fit for serverless and real-time applications.

Key features and use cases:

- Fully managed with automatic scaling for high availability
- Optimized for high-velocity workloads, real-time analytics, and IoT
- Convenient pay-as-you-go pricing model

Pros:

- Effortless scaling and managed infrastructure
- Cost-effective pricing suited to variable workloads

Cons:

- Limited querying capabilities compared to SQL databases
- Higher costs for demanding workloads

Google Cloud — Comprehensive Database Solutions

Google Cloud offers a comprehensive suite of database solutions, including Cloud SQL, Firestore, Bigtable, and Spanner. It caters to a wide range of workloads, from analytics to enterprise applications, and its robust integration with Google's ecosystem ensures seamless performance in multi-cloud and hybrid environments.

Key features and use cases:

- Wide range of database options for both relational and NoSQL needs
- Ideal for analytics, AI/ML-powered applications, and enterprise scalability

Pros:

- Strong integration with Google's ecosystem (e.g., BigQuery, AI tools)
- Flexible and scalable database services

Cons:

- Complex pricing structure
- Steeper learning curve outside Google environments

Amazon Redshift — Powerful Data Warehousing

Amazon Redshift is a powerful, fully managed data warehousing solution designed for large-scale analytics and business intelligence workloads. Its seamless integration with the AWS ecosystem makes it a top choice for enterprises managing massive datasets.
Key features and use cases:

- Optimized for big data analytics and complex queries
- Ideal for business intelligence and data-driven decision-making

Pros:

- Excellent scalability for processing massive datasets
- Strong compatibility with AWS tools and third-party BI platforms

Cons:

- Query optimization requires specialized expertise
- Higher operational costs for underutilized clusters

Microsoft Azure — Versatile Database Options

Microsoft Azure provides versatile database solutions, including Azure SQL Database, Azure Database for MySQL, and Azure Database for PostgreSQL. Its enterprise-grade capabilities make it ideal for hybrid cloud solutions and large-scale applications.

Key features and use cases:

- Supports both relational and non-relational databases for diverse workloads
- Tailored for enterprise-scale and hybrid cloud implementations

Pros:

- Robust security and compliance features for sensitive data
- Seamless integration with Microsoft's ecosystem, enhancing productivity

Cons:

- Increased complexity when managing multi-cloud setups
- Performance may fluctuate under heavy, large-scale workloads

Supabase — Open-Source PostgreSQL Backend

Supabase offers an open-source PostgreSQL backend tailored to developers who need simplicity and scalability. Its fully managed infrastructure, combined with integrated APIs, makes it an excellent option for modern web applications and startups.
Key features and use cases:

- Turnkey PostgreSQL solution with real-time capabilities and authentication
- Optimized for Jamstack projects, serverless workflows, and rapid development

Pros:

- Open-source product with an engaged and supportive community
- Easy integration with modern development architectures

Cons:

- Feature set is still expanding as the platform evolves
- Fewer advanced configuration options than standard PostgreSQL setups

Heroku — Simplified Application Deployment

Heroku provides a developer-focused platform for simplified application deployment, including managed, easy-to-use PostgreSQL databases. Its intuitive tools and integrations benefit small to medium-sized applications, enabling quick setup and scaling.

Key features and use cases:

- Managed PostgreSQL with built-in scaling and monitoring tools
- Best suited for startups, web apps, and rapidly developing projects

Pros:

- Developer-friendly interface with flexible third-party integrations
- Efficient application management and deployment

Cons:

- Scaling can be costlier than on other platforms
- Less suitable for high-volume business workloads

How to Choose the Right Database for Your Application

When you set out to select the best database system for your application, carefully evaluate these key factors:

Type of Data

Determine the type of data your application will handle. Relational options like PostgreSQL or MySQL are your top pick for structured data, while NoSQL options (MongoDB or Cassandra) are best for unstructured or semi-structured data.

Scalability and Performance Needs

Next, consider the scalability and performance demands. Distributed databases (Amazon DynamoDB or Cassandra) are generally good for large-capacity or high-traffic environments.

Consistency vs.
Availability

Then, evaluate whether your application needs stronger consistency or higher availability. ACID-compliant databases excel at data integrity, while BASE models prioritize availability and scalability.

Budget and Licensing Constraints

Finally, factor in budget and licensing costs. Open-source solutions (MariaDB or SQLite) can keep costs down, but keep in mind that enterprise tools (Oracle or Microsoft SQL Server) justify their price with advanced features for larger-scale applications.

Conclusion

Choosing the right database management system (DBMS) is a crucial decision that directly impacts your project's performance and scalability. With a variety of options — SQL Server, MySQL, PostgreSQL, MongoDB, Oracle, and more — each offering unique features and capabilities, it's important to carefully match the type of [database software](https://www.devart.com/database-software/) to your specific needs. Consider factors like the type of data you are working with, your traffic expectations, and your budget before making the choice. Every project is different, so take the time to select the DBMS that will fully support your goals.

dbForge Edge is a powerful database management tool supporting SQL Server, MySQL, PostgreSQL, and Oracle. It offers a unified workspace for seamless development and administration with multi-database connectivity, advanced query tools, and automation features. [Try it free for 30 days](https://www.devart.com/dbforge/edge/download.html) and boost your productivity!
Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbms](https://blog.devart.com/tag/dbms) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [SQL Server](https://blog.devart.com/tag/sql-server) [Anna Bilchenko](https://blog.devart.com/author/annabil) Always curious about how data flows and functions, I dive deep into the design and management of databases, from the first table sketch to fine-tuned performance.
[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Data and Schema Comparison Log Delivery to Email By [dbForge Team](https://blog.devart.com/author/dbforge) November 28, 2022

In this article, we will demonstrate how to set up automatic email delivery of the log file in both cases: when SQL data and schema comparison succeeds and when it fails. The task will be completed by means of [dbForge Data Compare](https://www.devart.com/dbforge/sql/datacompare/) and [Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/). Specifically, we will use the tools' command-line functionality to configure the automatic log delivery.

Create the script

There are three crucial steps to setting up automatic email notifications:

1. Specify Source and Target in the corresponding .txt files.
2. Lay out the mail settings in a .ps1 file.
3. Create a .bat file to run both the comparison and the mail delivery.

Set Source and Target

To begin with, let us specify what exactly we are going to compare:

1. Open Notepad and specify the source server(s) and database(s), using commas as separators:

```
DBFSQLSRV\SQL2016, BicycleStoreDev
DBFSQLSRV\SQL2019, TestDatabaseDev
```

2. Save the .txt file.

3. Repeat the same procedure for the Target:

```
DBFSQLSRV\SQL2016, BicycleStoreDemo
DBFSQLSRV\SQL2019, TestDatabaseDemo
```

As you can see, in our case we are going to compare the development and production versions of the databases: TestDatabaseDev and BicycleStoreDev in the red corner, and TestDatabaseDemo and BicycleStoreDemo in the blue corner of the ring.

Note: To better illustrate the failed comparison scenario, we have intentionally included non-existent databases in the comparison.
Spoiler: TestDatabaseDev and TestDatabaseDemo are the imposters.

Configure the email settings

The next item on our agenda is to configure the sender and the recipient of the comparison results. For this, we need to create a Windows PowerShell script file (.ps1) that contains a series of lines written in the PowerShell scripting language.

1. In Notepad, type in the following code:

```powershell
$emailFrom = "email_from@test.com"
$emailTo = "email_to@test.com"
$subj = "email_subject"
$body = ""
$file = "path_to_file"
$smtpServer = ""

$att = new-object Net.Mail.Attachment($file)
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg = new-object Net.Mail.MailMessage

$msg.From = $emailFrom
$msg.To.Add($emailTo)
$msg.Subject = $subj
$msg.Body = $body
$msg.Attachments.Add($att)

$smtp.Send($msg)
$att.Dispose()
```

2. To adapt it to your particular needs, replace the quoted values of the following parameters with your data:

- $emailFrom = "email_from@test.com" – specify the sender's email address.
- $emailTo = "email_to@test.com" – specify the recipient's email address.
- $subj = "email_subject" – specify the mail subject.
- $body = "" – specify any text for the mail body, if required.
- $file = "path_to_file" – specify the path to the log file.
- $smtpServer = "" – specify the SMTP server of your mail service.

3. Save the file as .ps1.

Create an executable file

The last step in our scenario is assembling all the previously prepared pieces into a single .bat file. This way, the task can be performed from the command line.

1.
In Notepad, type in the following code:

```bat
Set Compare="path_to_app"
Set Sender=powershell.exe

FOR /F "eol=; tokens=1,2* delims=, " %%e in (Source_Servers_and_DBs.txt) do (

FOR /F "eol=; tokens=1,2* delims=, " %%g in (Target_Servers_and_DBs.txt) do (

%compare% /comparison_command /source connection:"Data Source=%%e;Initial Catalog=%%f;Integrated Security=False;User ID=sa" /target connection:"Data Source=%%g;Initial Catalog=%%h;Integrated Security=False;User ID=sa" /log:Compare_result.log

(
if %ERRORLEVEL%==0 %Sender% -File D:\temp\sync_to_mail\PowerShell\send_email_script.ps1
cd.>Compare_result.log
)

)
)

pause
```

2. In the syntax above, mind the following:

- Set Compare="path_to_app" – depending on what exactly you are comparing (data or schemas), specify either "C:\Program Files\Devart\dbForge Compare Bundle for SQL Server\dbForge Data Compare for SQL Server\datacompare.com" or "C:\Program Files\Devart\dbForge Compare Bundle for SQL Server\dbForge Schema Compare for SQL Server\schemacompare.com".
- Source_Servers_and_DBs.txt is the name of the previously created file containing source connections.
- Target_Servers_and_DBs.txt is the name of the previously created file containing target connections.
- %compare% /comparison_command – again, depending on what you are comparing, replace comparison_command with either datacompare or schemacompare.
- D:\temp\sync_to_mail\PowerShell\send_email_script.ps1 is the location and name of the script with the email settings.

For more information regarding the command-line syntax, refer to the [dbForge Data](https://docs.devart.com/data-compare-for-sql-server/working-with-particular-cases/compare-data-in-multiple-databases-from-the-command-line.html) and [Schema Compare](https://docs.devart.com/schema-compare-for-sql-server/working-with-particular-cases/compare-schemas-in-multiple-databases-from-the-command-line.html) documentation.

3.
Save the file as .bat.

Execute the script

Finally, as we have gathered all the "infinity stones", we can proceed with our objective and run the created .bat file.

Scenario 1

The first example demonstrates a successful comparison. The differences have been located, and the log file has been emailed: Data Compare / Schema Compare

The other three scenarios feature a non-existent database in either Source or Target (or both). Therefore, all of the following comparisons result in an error. Regardless, the corresponding log files are still generated and sent to the specified email address.

Scenario 2: Data Compare / Schema Compare
Scenario 3: Data Compare / Schema Compare
Scenario 4: Data Compare / Schema Compare

Email Notifications

It is now time to open our mailbox and see whether there are any notifications. As expected, there are eight emails: one per tool for each of the four scenarios above. To see the results, open an email and look for the attached .txt file with the detailed comparison report. Upon opening the attached log file, you will see a comprehensive report detailing how the comparison process was conducted and its outcome:

Comparison log #1: Data Compare / Schema Compare
Comparison log #2: Data Compare / Schema Compare
Comparison log #3: Data Compare / Schema Compare
Comparison log #4: Data Compare / Schema Compare

Conclusion

In this article, we focused on how to get automatic email notifications with data and schema comparison log files after comparing multiple SQL Server databases. For this purpose, we created a .bat file that can complete the task in a single click. To go even further and make the process completely automatic, you can create a synchronization task in Windows Scheduler. [dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/) provides a range of command-line options for customizing data and schema comparison to your specific needs.
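For reference, the pairing that the nested FOR loops in the .bat file perform is a cross join of every source connection with every target connection. Here is a rough sketch of that logic in Python; the file format matches the .txt files created earlier, while the helper names and the simplified command string are hypothetical illustrations, not part of the dbForge tooling:

```python
def read_connections(path):
    """Each line of the file holds 'SERVER, DATABASE' (comma-separated)."""
    with open(path) as f:
        return [tuple(part.strip() for part in line.split(",", 1))
                for line in f if line.strip()]

def build_commands(compare_exe, source_file, target_file):
    """Pair every source with every target, mirroring the nested FOR loops."""
    commands = []
    for src_server, src_db in read_connections(source_file):
        for tgt_server, tgt_db in read_connections(target_file):
            commands.append(
                f'{compare_exe} /datacompare '
                f'/source connection:"Data Source={src_server};Initial Catalog={src_db}" '
                f'/target connection:"Data Source={tgt_server};Initial Catalog={tgt_db}" '
                f'/log:Compare_result.log')
    return commands

# Two source lines and two target lines yield 2 x 2 = 4 comparisons,
# which is why each tool produced four log emails in the scenarios above.
```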
Try a [30-day free trial](https://www.devart.com/dbforge/sql/compare-bundle/download.html) and check how it can help you in your day-to-day database tasks.

Tags [automatic email notifications](https://blog.devart.com/tag/automatic-email-notifications) [cmd](https://blog.devart.com/tag/cmd) [command line](https://blog.devart.com/tag/command-line) [data comparison](https://blog.devart.com/tag/data-comparison) [schema comparison](https://blog.devart.com/tag/schema-comparison) [SQL Server Tools](https://blog.devart.com/tag/sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
[How To](https://blog.devart.com/category/how-to) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) Component Collections Support in Entity Developer By [dotConnect Team](https://blog.devart.com/author/dotconnect) May 24, 2012

This article explains and gives a practical example of how support for component collections is implemented in [Entity Developer for NHibernate](http://devart.com/entitydeveloper/nhibernate-designer.html). You can take a closer look at this type of mapping in the NHibernate documentation [here](http://www.nhforge.org/doc/nh/en/index.html#components-incollections). This article contains mapping examples for the following associations:

- Classic mapping of component collections;
- A special case of component collections mapping for a many-to-many association.

Classic mapping of component collections

Classic mapping of component collections is much like mapping a one-to-many association between two entities of a model. The major difference is that the model does not have an entity corresponding to the details table; instead, the model contains a complex type consisting of a set of properties similar to a part of the details table's field set. The association is built between an entity corresponding to the master table and a complex type corresponding to part of the details table's field set. The master entity contains a navigation property that references a collection of the details complex type. Within the database, one-to-many mapping is arranged as a table with a primary key on the one side, and a table containing a field that references the primary key of the master table on the other side.
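The master-entity/complex-type split described above can be sketched outside of NHibernate as well. Here is a minimal, language-neutral Python analogue (not Entity Developer output; the class names follow the example that comes next): the master entity owns a collection of value objects that have no identity or table of their own.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass(frozen=True)
class PartialEmpType:
    """Complex type: a slice of the details table's fields, with no identity."""
    ename: str
    job: str
    mgr: Optional[int]

@dataclass
class Dept:
    """Master entity with its own primary key and a collection of details."""
    deptno: int
    dname: str
    partial_emp_types: List[PartialEmpType] = field(default_factory=list)

dept = Dept(10, "ACCOUNTING")
dept.partial_emp_types.append(PartialEmpType("KING", "PRESIDENT", None))
print(len(dept.partial_emp_types))  # 1
```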
Example: The database contains the Dept and Emp tables, which have a foreign key constraint between them. The DEPTNO field in the Emp table receives the ID value of the associated record in the Dept table. We perform the following sequence of operations:

- create an NHibernate model;
- add the Dept and Emp tables to the model;
- select a part of the Emp entity's properties (e.g. ENAME, JOB, MGR), detach them by dragging and dropping into a separate complex type, call it PartialEmpType, and then delete the Emp entity;
- add an association between the Dept master entity and the PartialEmpType details complex type and customize it as displayed in the screenshot below:

In the Association Editor dialog box, besides the regular parameters of a one-to-many association (collection type, navigation property names, etc.), the details table name is specified in the Join Table input field, and the name of the schema it belongs to is entered in the Schema field. In the dialog box it is necessary to set the name of the foreign key column of the details table, which references the primary key of the master table. This dialog box also makes it possible to define private mapping of the details complex type properties to the details table fields, if this association mapping differs from the general mapping of complex type properties. By default, for this association type the Generate related property checkbox is off for End 1 of the navigation property, but if you set it, a parent reference to the master entity becomes available on the details complex type. As a result, we have the following model:

Note: If you define the collection type of a navigation property as Set, it is very important to implement Equals() and GetHashCode() correctly for the classes corresponding to model complex types.
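The note above applies in any language whose sets hash their members: without correct equality and hashing, a set cannot deduplicate value objects. As a quick illustration of the principle (in Python rather than C#, and not tied to Entity Developer), a frozen dataclass derives both operations from its fields:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)   # frozen=True generates both __eq__ and __hash__
class PartialEmp:
    ename: str
    job: str
    mgr: Optional[int]

a = PartialEmp("SMITH", "CLERK", 7902)
b = PartialEmp("SMITH", "CLERK", 7902)
print(a == b)       # True: equality compares field by field
print(len({a, b}))  # 1: equal values collapse to a single set member
```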
To generate these methods with an Entity Developer template, select the template node in the object tree of the Model Explorer tool window, select the Implement Equals template property in the Properties tool window, and set it to True. The code generated for the model will be as follows:

```csharp
/// <summary>
/// There are no comments for Dept in the schema.
/// </summary>
public partial class Dept {

    private int _DEPTNO;

    private string _DNAME;

    private string _LOC;

    private Iesi.Collections.Generic.ISet<PartialEmpType> _PartialEmpTypes;

    #region Extensibility Method Definitions

    partial void OnCreated();

    public override bool Equals(object obj)
    {
        Dept toCompare = obj as Dept;
        if (toCompare == null)
        {
            return false;
        }

        if (!Object.Equals(this.DEPTNO, toCompare.DEPTNO))
            return false;

        return true;
    }

    public override int GetHashCode()
    {
        int hashCode = 13;
        hashCode = (hashCode * 7) + DEPTNO.GetHashCode();
        return hashCode;
    }

    #endregion

    public Dept()
    {
        this._PartialEmpTypes = new Iesi.Collections.Generic.HashedSet<PartialEmpType>();
        OnCreated();
    }

    /// <summary>
    /// There are no comments for DEPTNO in the schema.
    /// </summary>
    public virtual int DEPTNO
    {
        get { return this._DEPTNO; }
        set { this._DEPTNO = value; }
    }

    /// <summary>
    /// There are no comments for DNAME in the schema.
    /// </summary>
    public virtual string DNAME
    {
        get { return this._DNAME; }
        set { this._DNAME = value; }
    }

    /// <summary>
    /// There are no comments for LOC in the schema.
    /// </summary>
    public virtual string LOC
    {
        get { return this._LOC; }
        set { this._LOC = value; }
    }

    /// <summary>
    /// There are no comments for PartialEmpTypes in the schema.
    /// </summary>
    public virtual Iesi.Collections.Generic.ISet<PartialEmpType> PartialEmpTypes
    {
        get { return this._PartialEmpTypes; }
        set { this._PartialEmpTypes = value; }
    }
}

/// <summary>
/// There are no comments for PartialEmpType in the schema.
/// </summary>
public partial class PartialEmpType {

    private string _ENAME;

    private string _JOB;

    private System.Nullable<int> _MGR;

    #region Extensibility Method Definitions

    partial void OnCreated();

    public override bool Equals(object obj)
    {
        PartialEmpType toCompare = obj as PartialEmpType;
        if (toCompare == null)
        {
            return false;
        }

        if (!Object.Equals(this.ENAME, toCompare.ENAME))
            return false;
        if (!Object.Equals(this.JOB, toCompare.JOB))
            return false;
        if (!Object.Equals(this.MGR, toCompare.MGR))
            return false;

        return true;
    }

    public override int GetHashCode()
    {
        int hashCode = 13;
        return hashCode;
    }

    #endregion

    public PartialEmpType()
    {
        OnCreated();
    }

    /// <summary>
    /// There are no comments for ENAME in the schema.
    /// </summary>
    public virtual string ENAME
    {
        get { return this._ENAME; }
        set { this._ENAME = value; }
    }

    /// <summary>
    /// There are no comments for JOB in the schema.
    /// </summary>
    public virtual string JOB
    {
        get { return this._JOB; }
        set { this._JOB = value; }
    }

    /// <summary>
    /// There are no comments for MGR in the schema.
    /// </summary>
    public virtual System.Nullable<int> MGR
    {
        get { return this._MGR; }
        set { this._MGR = value; }
    }
}
```

As can be seen from the code above, a property is generated in the master entity class containing a reference to the collection of instances of the details complex type class. Corresponding XML mapping is also generated for the Dept entity. The NHibernate model for Entity Developer considered in this example can be downloaded [here](https://blog.devart.com/wp-content/uploads/2012/05/DataModel1.zip).

Mapping of component collections for a many-to-many association

Many-to-many associations use an intermediate table that has foreign keys to the tables of both associated entities. An object uses several objects of another type, and each of those objects in turn refers to several objects of the first type.
As is always the case for many-to-many mappings in relational databases, we need a third table that provides references for the many-to-many relationship. Mapping like this allows you to map extra columns of a many-to-many association join table to a complex type.

Example: The database contains the Employees, Territories, and EmployeeTerritories tables; the last one provides references for the many-to-many relationship. The EmployeeTerritories table contains one extra column, EmployeeTerritoriesDescription. We perform the following sequence of operations: create an NHibernate model; add the Employees, Territories, and EmployeeTerritories tables; select the EmployeeTerritoriesDescription property of the EmployeeTerritories entity and, by dragging and dropping, detach it into a separate complex type named EmployeeTerritoriesExtra; delete the EmployeeTerritories entity; then add a many-to-many association between the Employee and Territory entities and customize it as displayed in the screenshot.

In the Association Editor dialog box, besides the regular parameters of a many-to-many association (such as collection type, navigation property names, etc.), the name of the join table providing references for the many-to-many relationship is specified in the Join Table field, and the name of the schema it resides in is entered in the Schema field. In the dialog box it is necessary to set the name of the foreign key column of the join table, which provides references for the many-to-many relationship. To specify mapping for the join table extra fields of a many-to-many association, set the required complex type, containing the set of properties corresponding to the set of the join table extra fields, in the Component field of the Association Editor dialog box.
This dialog box also makes it possible to define private mapping of complex type properties to join table extra fields, if this association mapping differs from the general mapping of the complex type properties. As a result, we have the following model.

Note: If you define the collection type of the navigation property as Set, it is very important to implement Equals() and GetHashCode() correctly for the classes corresponding to model complex types. To generate these methods with an Entity Developer template, select the template node in the object tree of the Model Explorer tool window, then select the Implement Equals template property in the Properties tool window and set it to True. The code generated for the model will be as follows:

```csharp
/// <summary>
/// There are no comments for Territory in the schema.
/// </summary>
public partial class Territory {

    private string _TerritoryID;

    private string _TerritoryDescription;

    private Iesi.Collections.Generic.ISet<EmployeeTerritoriesExtra> _Employees;

    #region Extensibility Method Definitions

    partial void OnCreated();

    public override bool Equals(object obj)
    {
        Territory toCompare = obj as Territory;
        if (toCompare == null)
        {
            return false;
        }

        if (!Object.Equals(this.TerritoryID, toCompare.TerritoryID))
            return false;

        return true;
    }

    public override int GetHashCode()
    {
        int hashCode = 13;
        hashCode = (hashCode * 7) + TerritoryID.GetHashCode();
        return hashCode;
    }

    #endregion

    public Territory()
    {
        this._Employees = new Iesi.Collections.Generic.HashedSet<EmployeeTerritoriesExtra>();
        OnCreated();
    }

    /// <summary>
    /// There are no comments for TerritoryID in the schema.
    /// </summary>
    public virtual string TerritoryID
    {
        get { return this._TerritoryID; }
        set { this._TerritoryID = value; }
    }

    /// <summary>
    /// There are no comments for TerritoryDescription in the schema.
    /// </summary>
    public virtual string TerritoryDescription
    {
        get { return this._TerritoryDescription; }
        set { this._TerritoryDescription = value; }
    }

    /// <summary>
    /// There are no comments for Employees in the schema.
    /// </summary>
    public virtual Iesi.Collections.Generic.ISet<EmployeeTerritoriesExtra> Employees
    {
        get { return this._Employees; }
        set { this._Employees = value; }
    }
}

/// <summary>
/// There are no comments for Employee in the schema.
/// </summary>
public partial class Employee {

    private int _EmployeeID;

    private string _LastName;

    private string _FirstName;

    private System.Nullable<System.DateTime> _BirthDate;

    private System.Nullable<System.DateTime> _HireDate;

    private string _Address;

    private Iesi.Collections.Generic.ISet<EmployeeTerritoriesExtra> _Territories;

    #region Extensibility Method Definitions

    partial void OnCreated();

    public override bool Equals(object obj)
    {
        Employee toCompare = obj as Employee;
        if (toCompare == null)
        {
            return false;
        }

        if (!Object.Equals(this.EmployeeID, toCompare.EmployeeID))
            return false;

        return true;
    }

    public override int GetHashCode()
    {
        int hashCode = 13;
        hashCode = (hashCode * 7) + EmployeeID.GetHashCode();
        return hashCode;
    }

    #endregion

    public Employee()
    {
        this._Territories = new Iesi.Collections.Generic.HashedSet<EmployeeTerritoriesExtra>();
        OnCreated();
    }

    /// <summary>
    /// There are no comments for EmployeeID in the schema.
    /// </summary>
    public virtual int EmployeeID
    {
        get { return this._EmployeeID; }
        set { this._EmployeeID = value; }
    }

    /// <summary>
    /// There are no comments for LastName in the schema.
    /// </summary>
    public virtual string LastName
    {
        get { return this._LastName; }
        set { this._LastName = value; }
    }

    /// <summary>
    /// There are no comments for FirstName in the schema.
    /// </summary>
    public virtual string FirstName
    {
        get { return this._FirstName; }
        set { this._FirstName = value; }
    }

    /// <summary>
    /// There are no comments for BirthDate in the schema.
    /// </summary>
    public virtual System.Nullable<System.DateTime> BirthDate
    {
        get { return this._BirthDate; }
        set { this._BirthDate = value; }
    }

    /// <summary>
    /// There are no comments for HireDate in the schema.
    /// </summary>
    public virtual System.Nullable<System.DateTime> HireDate
    {
        get { return this._HireDate; }
        set { this._HireDate = value; }
    }

    /// <summary>
    /// There are no comments for Address in the schema.
    /// </summary>
    public virtual string Address
    {
        get { return this._Address; }
        set { this._Address = value; }
    }

    /// <summary>
    /// There are no comments for Territories in the schema.
    /// </summary>
    public virtual Iesi.Collections.Generic.ISet<EmployeeTerritoriesExtra> Territories
    {
        get { return this._Territories; }
        set { this._Territories = value; }
    }
}

/// <summary>
/// There are no comments for EmployeeTerritoriesExtra in the schema.
/// </summary>
public partial class EmployeeTerritoriesExtra {

    private string _EmployeeTerritoriesDescription;

    #region Extensibility Method Definitions

    partial void OnCreated();

    public override bool Equals(object obj)
    {
        EmployeeTerritoriesExtra toCompare = obj as EmployeeTerritoriesExtra;
        if (toCompare == null)
        {
            return false;
        }

        if (!Object.Equals(this.EmployeeTerritoriesDescription, toCompare.EmployeeTerritoriesDescription))
            return false;

        return true;
    }

    public override int GetHashCode()
    {
        int hashCode = 13;
        return hashCode;
    }

    #endregion

    public EmployeeTerritoriesExtra()
    {
        OnCreated();
    }

    /// <summary>
    /// There are no comments for EmployeeTerritoriesDescription in the schema.
    /// </summary>
    public virtual string EmployeeTerritoriesDescription
    {
        get { return this._EmployeeTerritoriesDescription; }
        set { this._EmployeeTerritoriesDescription = value; }
    }

    #region Ends of the many-to-many association 'Employee_Territory'

    /// <summary>
    /// There are no comments for Employees in the schema.
    /// </summary>
    public Employee Employees
    {
        get;
        set;
    }

    /// <summary>
    /// There are no comments for Territories in the schema.
    /// </summary>
    public Territory Territories
    {
        get;
        set;
    }

    #endregion
}
```

As can be seen from the code above, in classes corresponding to entities
connected with a many-to-many association, a property is generated that contains a reference to the collection of instances of the complex type class, which in its turn consists of the set of properties corresponding to the join table extra fields. In the complex type class, in turn, references are generated to the classes corresponding to the entities connected by the many-to-many association. Therefore, when a many-to-many association is organized in this way, we can access both the values of the join table extra columns, by referring to the complex type fields themselves, and the instances of the related classes, by referring to the respective complex type reference fields. The mapping generated for the Employee and Territory entities, along with the complete NHibernate model for Entity Developer considered in this example, can be downloaded [here](https://blog.devart.com/wp-content/uploads/2012/05/DataModel2.zip).

Tags [nhibernate](https://blog.devart.com/tag/nhibernate) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
How to Configure PostgreSQL for Remote Connections: A Beginner’s Guide
By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander), February 22, 2023

Configuring PostgreSQL for
remote connections is essential for accessing data from different locations. By default, PostgreSQL only allows connections from the local machine, which can be a limitation in many situations. Remote access to a PostgreSQL database is necessary for applications that access data from different geographical locations or for teams working in different parts of the world. It can also be crucial for troubleshooting issues that cannot be resolved locally, or when the database administrator is working remotely. Configuring PostgreSQL for remote access enables greater flexibility, accessibility, and collaboration, which can improve productivity and efficiency in multiple use cases.

In this article, we will walk through the step-by-step process of configuring PostgreSQL for remote access, including setting up the PostgreSQL server, configuring the firewall, modifying PostgreSQL configuration files, and testing the connection with the help of an advanced Postgres GUI tool – [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/). Whether you are a database administrator, developer, or user, this article will provide you with the knowledge and tools to enable remote access to your PostgreSQL database.

Contents
- Introduction to remote database connections
- Securing remote connections to PostgreSQL
- How to allow remote connections to PostgreSQL on Linux
  - Prerequisites
  - Enable remote access to Postgres
- How to allow remote connections to PostgreSQL on Windows
- Connect to a remote PostgreSQL server from dbForge Studio for PostgreSQL
- Advantages of using dbForge Studio for PostgreSQL

Introduction to remote database connections

Remote database connections allow users to access data from a database server that is located on a different machine or network.
Remote access is crucial for organizations with distributed teams or with users in different geographic locations who need to access and work with the same data. It is also important for businesses that require real-time data access for critical decision-making. Remote database connections can be established through different protocols and technologies, such as TCP/IP, SSH, or VPN, and can be configured for various database management systems, including PostgreSQL. In this era of global connectivity, remote access to databases is becoming increasingly important for organizations of all sizes.

Ports play a vital role in establishing remote database connections. A port is a communication endpoint that enables data to be sent and received between different devices. When setting up a remote connection, the database server needs to listen on a specific port to accept incoming connections from remote clients. The default port used for PostgreSQL is 5432. Other database management systems, such as MySQL and Microsoft SQL Server, have their own default port numbers that can be changed to ensure secure remote access. Ports can also be used to configure firewalls to allow or block incoming traffic to the database server. When configuring remote database connections, it is essential to ensure that the appropriate ports are open and secured to prevent unauthorized access or data breaches.

Securing remote connections to PostgreSQL

Securing remote connections to a PostgreSQL database is essential to prevent unauthorized access or attacks. Here are some ways to secure remote connections to PostgreSQL:

- Use SSL/TLS: PostgreSQL supports SSL/TLS encryption, which provides secure communication between the client and the server. Enabling SSL/TLS for remote connections can prevent unauthorized access and protect sensitive data.
- Implement IP whitelisting: Configure the PostgreSQL server to accept connections only from trusted IP addresses or networks.
This can be done by configuring the firewall or by using the pg_hba.conf file in PostgreSQL.
- Use a VPN: Use a Virtual Private Network (VPN) to establish a secure connection between the client and the server. VPNs provide an additional layer of security by encrypting the entire communication channel.
- Use SSH tunnels: SSH tunnels can be used to encrypt and forward remote connections to a PostgreSQL server. This provides secure access to the server without exposing it to the public network.
- Use strong authentication: Use strong passwords and implement multi-factor authentication for all user accounts. This can prevent brute-force attacks and unauthorized access to the server.

By implementing these security measures, remote connections to a PostgreSQL database can be secured, ensuring the safety and privacy of sensitive data.

How to allow remote connections to PostgreSQL on Linux

Prerequisites

In this guide, we will discuss how to enable remote connections to a PostgreSQL server installed on Ubuntu. To follow along with the steps provided, you will need the following installed on your system:

- Ubuntu 20.04
- PostgreSQL server version 12

Please ensure that you have these requirements in place before proceeding with the guide.

Enable remote access to Postgres

To allow remote access to a PostgreSQL 12 server on Ubuntu 20.04, follow the steps below:

1. Modify the PostgreSQL configuration file

Open the PostgreSQL configuration file postgresql.conf in your preferred text editor. The file is typically located in the /etc/postgresql/12/main directory. To open it from the Linux Terminal, execute:

```
sudo nano /etc/postgresql/12/main/postgresql.conf
```

Then, find the line `#listen_addresses = 'localhost'` and uncomment it (remove the # character at the beginning of the line). Next, change the value of listen_addresses to '*'. This allows PostgreSQL to listen on all available IP addresses.
Alternatively, you can specify a specific IP address or a range of IP addresses that are allowed to connect to the server.

2. Modify the pg_hba.conf file

Open the pg_hba.conf file in your preferred text editor. The file is typically located in the /etc/postgresql/12/main directory. To open it from the Linux Terminal, execute:

```
sudo nano /etc/postgresql/12/main/pg_hba.conf
```

Take the following section:

```
# IPv4 local connections:
host    all             all             127.0.0.1/32            md5
```

And modify it this way:

```
# IPv4 local connections:
host    all             all             0.0.0.0/0               md5
```

3. Allow port 5432 through the firewall

To enable traffic on port 5432 through the firewall, execute the following command:

```
sudo ufw allow 5432/tcp
```

4. Restart PostgreSQL

Run the following command to restart PostgreSQL:

```
sudo service postgresql restart
```

After completing these steps, you should be able to connect to the PostgreSQL server from a remote machine using a PostgreSQL client. However, please note that allowing remote access to a PostgreSQL server can pose a security risk, so it is recommended to use secure passwords, encryption, and firewall rules to protect your system.

How to allow remote connections to PostgreSQL on Windows

By default, when [installing PostgreSQL](https://blog.devart.com/download-install-postgresql-on-windows.html), you get it configured to accept connections only from the local machine. This is a security measure to prevent unauthorized access to the database server. However, in some cases, you may need to allow connections from remote hosts. Let us look at how to do this.

To allow remote connections to PostgreSQL on Windows, follow these steps:

1. Edit the PostgreSQL configuration file

1.1 Open the PostgreSQL installation directory and locate the postgresql.conf file. By default, the configuration file for PostgreSQL (version 14) is located at C:\Program Files\PostgreSQL\14\data\postgresql.conf.
1.2 Make a backup of the file before making any changes.
1.3 Open the postgresql.conf file in a text editor.
1.4 Find the line that reads `#listen_addresses = 'localhost'` and uncomment it if it is commented (remove the '#' character at the beginning of the line). Next, to ensure that PostgreSQL is configured to accept connections from any address, check the value of listen_addresses – it should be set to '*'. Note: You can also use a specific IP address for the PostgreSQL server to listen on.

2. Edit the pg_hba.conf file to allow remote connections

2.1 Open the PostgreSQL installation directory and locate the pg_hba.conf file. By default, it is located at C:\Program Files\PostgreSQL\14\data\pg_hba.conf (for PostgreSQL 14).
2.2 Make a backup of the file before making any changes.
2.3 Open the pg_hba.conf file in a text editor.
2.4 Add a new line at the end of the file to allow remote connections. The line should have the following format:

```
host    all    all    0.0.0.0/0    md5
```

This line allows connections from any IP address (0.0.0.0/0) and requires a password for authentication (md5).
2.5 Restart PostgreSQL (for example, from the Computer Management console).

3. Configure the Windows Firewall to allow incoming connections to PostgreSQL

3.1 Launch the Windows Control Panel.
3.2 Open Windows Defender Firewall.
3.3 Click Advanced settings on the left-hand side of the window.
3.4 Click Inbound Rules on the left-hand side of the window.
3.5 Click New Rule on the right-hand side of the window.
3.6 Select Port as the type of rule and click Next.
3.7 Select TCP as the protocol and enter 5432 as the port number. Click Next.
3.8 Select Allow the connection and click Next.
3.9 Select the network types for which the rule should apply (Domain, Private, or Public). Click Next.
3.10 Enter a name and description for the rule and click Finish.

After completing these steps, remote connections to PostgreSQL should be allowed on the Windows machine.
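Before reaching for a full database client, you can confirm that the server's port accepts TCP connections from the client machine. Below is a minimal sketch using Python's standard socket module; the host address in the comment is a placeholder you would replace with your server's actual address:

```python
import socket

def is_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address): is_port_open("192.0.2.10", 5432)
# True means the PostgreSQL port is reachable; False means it is closed,
# filtered by a firewall, or the server is not listening on that interface.
```

A False result here points to a networking problem (listen_addresses, firewall, routing) rather than to authentication, which narrows down troubleshooting considerably.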
Connect to a remote PostgreSQL server from dbForge Studio for PostgreSQL

Let us look at how you can [connect to the PostgreSQL server](https://blog.devart.com/connect-to-postgresql-database.html) and then [manage connections](https://www.devart.com/dbforge/postgresql/studio/database-explorer.html) in one of the best PostgreSQL clients – dbForge Studio for PostgreSQL.

1. On the Database menu, select New Connection.
2. On the Database Connection Properties > General tab, specify the connection details:
   - Host: Provide the host name.
   - Port: Provide the port number.
   - User and Password: Enter the respective user credentials. By default, the password is saved automatically. If you don't want to save the password, clear the Allow saving password checkbox.
   - Database: Enter the name of a PostgreSQL database you want to connect to or select it from the drop-down list.
   - Optional: Connection Name: The connection name is generated automatically from the host name. If you want to create a distinctive name for the connection, type the new name.
3. Click Test Connection to verify the connection details you have provided.
4. Click Connect to connect to a PostgreSQL server.

Advantages of using dbForge Studio for PostgreSQL

dbForge Studio is a powerful and feature-rich [PostgreSQL GUI client](https://www.devart.com/dbforge/postgresql/studio/) for database development and administration. Here are some of the advantages of using it in daily work:

- User-friendly interface: dbForge Studio for PostgreSQL has a clean interface that is easy to use, even for those who are new to PostgreSQL database management.
- Advanced PostgreSQL functionality: The Studio provides a wide range of advanced features, including code completion, database management, PostgreSQL data export and import, SQL editing, debugging, and profiling.
- Efficient data and schema comparison and sync: The IDE allows its users to easily compare PostgreSQL databases, get comprehensive information on all differences, and generate clear and accurate SQL synchronization scripts to deploy changes.
- Easy database management: With dbForge Studio for PostgreSQL, you can easily create, edit, and delete tables, views, procedures, and other database objects.
- Advanced SQL editing: The tool boasts a powerful [PostgreSQL editor](https://www.devart.com/dbforge/postgresql/studio/data-editor.html) with advanced features, including code highlighting, code formatting, and auto-completion, which help to improve the efficiency and quality of your work.
- Instant test data generation: dbForge Studio incorporates a powerful Data Generator that allows users to create realistic test data in just a few clicks.
- Efficient query execution: dbForge Studio for PostgreSQL helps optimize the execution of queries by providing detailed information about query execution plans.
- Customizable code snippets: The IDE provides a library of customizable code snippets for frequently used SQL commands, which can help reduce the time and effort required to write code.

Overall, dbForge Studio for PostgreSQL is a powerful and versatile tool that provides many benefits, which makes it a popular choice for database developers. To see for yourself how dbForge Studio for PostgreSQL can help you manage your databases more efficiently, we invite you to [download a free trial of dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/download.html). With the free trial, you'll have full access to all the features of the Studio for 30 days, so you can try out all the functionality and see how it fits into your workflow. We're confident that you'll love the product, and we look forward to hearing your feedback.
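As a closing aside, the pg_hba.conf entries used earlier (127.0.0.1/32, 0.0.0.0/0) match clients by CIDR network. The matching logic can be sketched with Python's standard ipaddress module; the client addresses below are illustrative:

```python
import ipaddress

def client_allowed(client_ip, allowed_cidr):
    """Return True if client_ip falls within the allowed CIDR network,
    mirroring how a pg_hba.conf 'host' rule matches a client address."""
    return ipaddress.ip_address(client_ip) in ipaddress.ip_network(allowed_cidr)

print(client_allowed("127.0.0.1", "127.0.0.1/32"))    # True  (local-only rule)
print(client_allowed("203.0.113.7", "127.0.0.1/32"))  # False (remote client rejected)
print(client_allowed("203.0.113.7", "0.0.0.0/0"))     # True  (any-address rule)
```

This is why 0.0.0.0/0 opens the server to every IPv4 client, while a narrower prefix such as 203.0.113.0/24 restricts access to one trusted network.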
Tags [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [PostgreSQL Tutorial](https://blog.devart.com/tag/postgresql-tutorial) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.

Configuring an ODBC Driver Manager on Windows, macOS, and Linux
By [DAC Team](https://blog.devart.com/author/dac), June 30, 2022

This article explains how to install and configure the ODBC Driver Manager on Windows, macOS, and Linux. Also, it shows how to configure the [ODBC](https://blog.devart.com/oledb-vs-odbc-which-driver-to-choose.html) data source name.

Contents
- What is an ODBC Driver Manager?
- How to Install an ODBC Driver Manager
- Installing and Configuring an ODBC Driver Manager on Windows
- Installing and Configuring an ODBC Driver Manager on Linux
- Installing and Configuring an ODBC Driver Manager on macOS

What is an ODBC Driver Manager?

To explain what this driver manager is, take a look at the ODBC architecture in Figure 1.

Figure 1. ODBC architecture.

The ODBC driver manager manages communication between apps and ODBC drivers. To put it simply, this is what happens when your application starts connecting to your database:

1. First, the app connects to a certain database using a connection string or an ODBC data source name.
2. Then, the driver manager loads the correct ODBC driver based on the connection string.
3. When the driver is loaded, the manager calls an SQLConnect or SQLDriverConnect function in the ODBC driver.
4. The connect function in the driver establishes a connection to the database. The security credentials are passed from the connection string used by the application.
Finally, the database responds by returning connection success or failure. The application can now submit SQL statements through the driver manager. They are passed from the driver manager to the ODBC driver and, finally, to the database. The database then returns the results.

So, the ODBC driver manager performs the following:

- Provides an easy way to do common database tasks from your app, uniformly, regardless of the database product.
- Loads and unloads ODBC drivers.
- Performs basic error handling.
- Calls ODBC driver functions.

How to Install an ODBC Driver Manager

You can start using the ODBC driver manager after installing and configuring it. Let's consider how to install the ODBC driver manager depending on the operating system.

Installing and Configuring an ODBC Driver Manager on Windows

An ODBC driver manager already exists on modern Windows machines. Starting with Windows XP and Windows Server 2003, the driver manager for Windows is preinstalled. You don't need to download and install it unless you're using an older Windows operating system. The driver manager for Windows is also pre-configured. So, all you need to do is install ODBC drivers and configure data sources.

To connect to the ODBC data source, we use the ODBC Data Source Administrator. Here, we can see our ODBC driver manager in the About tab.

Figure 2. The ODBC driver manager (odbc32.dll) in Windows.

That's the driver manager in Windows, in a file called odbc32.dll. We can also see installed ODBC drivers in this app, as shown in Figure 3 below.

Figure 3. The list of ODBC drivers is displayed in the ODBC Data Source Administrator.

This list has a counterpart in the Windows registry, as seen in Figure 4 below.

Figure 4. The list of ODBC drivers and DSNs is stored in the Windows registry.

If you install another driver, an entry is added to the registry under ODBCINST.INI. And we use these drivers to set up data source names, or DSNs.
When you configure a DSN, it is also stored in the Windows registry, under the ODBC.INI key shown in Figure 4. Note that there are also separate odbcinst.ini and odbc.ini text files in C:\Windows, but for convenience it is better to use the ODBC Data Source Administrator.

Installing and Configuring an ODBC Driver Manager on Linux

Unlike Windows, Linux does not ship with an ODBC driver manager; you need to install one yourself. In this walkthrough, we use the unixODBC driver manager on Ubuntu 20.04 LTS, so Debian packages will be used to install the ODBC drivers later. What we need:

- The Terminal for installing unixODBC.
- Your favorite text editor for editing configuration files.
- Working installations of MySQL, PostgreSQL, and SQL Server.
- Devart ODBC drivers for MySQL, PostgreSQL, and SQL Server.

Don't be intimidated by the Terminal: it exists on virtually every platform for a reason, and sometimes typing commands is faster than clicking buttons. So, let's begin.

6 Painless Steps to Install and Configure an ODBC Driver Manager on Linux

Open the Terminal

There are a few ways to open the Terminal in Ubuntu. One is to click the Show Applications icon in the dock, search for Terminal, and click its icon (or press Enter when it appears in the search results). Another is to press Ctrl+Alt+T, which is quicker.

Download and Install unixODBC

Download the unixODBC source code. Note that it may need compilation; if you want a precompiled version, check this [Linux RPM file](https://www.ibm.com/links?url=http%3A%2F%2Frpmfind.net%2Flinux%2Frpm2html%2Fsearch.php%3Fquery%3DunixODBC). To install it, run the following command in the Terminal:

sudo apt-get install unixodbc unixodbc-dev

Note that you can also install a compiled version of the unixODBC driver manager using the package manager suited to your operating system and distribution.
For instance, on Red Hat Linux you can use the yum command:

yum install unixODBC unixODBC-devel

Wait until the installation completes. The driver manager ships with isql, a command-line tool for testing DSNs, and odbcinst, a tool that helps with configuration; you will see both in action later.

unixODBC stores the list of installed drivers in a text file called odbcinst.ini, and all DSN configurations in another file called odbc.ini. There is no registry in Linux, so these files are the counterpart of the Windows registry entries you saw in Figure 4 earlier. After installation, the driver manager is still unusable until you install ODBC drivers.

Install ODBC Drivers

Open your browser and download three Debian packages for the following ODBC drivers:

- [MySQL](https://www.devart.com/odbc/mysql/download.html). Select the ODBC Driver for MySQL 4.2 DEB x64 (the 64-bit driver for MySQL).
- [PostgreSQL](https://www.devart.com/odbc/postgresql/download.html). Select the ODBC Driver for PostgreSQL 4.2 DEB x64 (the 64-bit driver for PostgreSQL).
- [SQL Server](https://www.devart.com/odbc/sqlserver/download.html). Select the ODBC Driver for SQL Server 4.2 DEB x64 (the 64-bit driver for SQL Server).

Note the folder where you downloaded the packages. Go to that folder and double-click one of the DEB packages. You will see something like Figure 5.

Figure 5. Sample ODBC driver installer (shown here for the PostgreSQL ODBC driver).

To install the driver, click the Install button and wait until installation finishes; the Install button changes to Remove when the installation completes successfully. Then install the remaining two packages. Note that this installs a trial version; you can buy a license later if the driver suits you.

Verify Installation

Next, verify the installation of unixODBC and the ODBC drivers. To verify the installed unixODBC, open the Terminal again.
Then, type the following command:

odbcinst -j

You will see a configuration similar to Figure 6 below.

Figure 6. unixODBC configuration shown using the odbcinst command-line tool.

In Figure 6, you can see the version of unixODBC and the paths of the ODBC INI files for drivers and data sources. As in Windows, there are System, User, and File data sources. You will need the paths of these files when you configure data sources later. The configuration file paths are set during the installation of unixODBC; we chose not to change them.

To list all the drivers installed earlier, run the following command in the Terminal:

odbcinst -q -d

See the result in Figure 7.

Figure 7. Installed ODBC drivers verified using the odbcinst command-line tool.

The drivers listed in Figure 7 also appear in the odbcinst.ini file. To check that, view the file in your favorite text editor; in our case, that is gedit. Run it in the Terminal with the full path you saw in Figure 6 earlier:

gedit /etc/odbcinst.ini

See a screenshot of the output in Figure 8.

Figure 8. The odbcinst.ini file where installed drivers are listed.

In Figure 8, you can see the names of the drivers we installed earlier and the driver library files. All these entries are inserted into the file during installation; you don't need to configure them manually. At this point, the unixODBC driver manager is installed and configured, and since the ODBC drivers are also installed, it's time to test connectivity to data sources.

Configure Data Source Names

Upon installation of the Devart ODBC drivers, three data source templates were added to the odbc.ini file. Check out Figure 9 to see what it looks like.

Figure 9. odbc.ini after the installation of the three Devart ODBC drivers.

The blank data source configurations in Figure 9 are a good starting point for configuring data sources.
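For reference, the two files fit together roughly like this. The driver name follows the Devart convention used above, but the library path and exact section contents are illustrative; your installer writes the real values.

```ini
# /etc/odbcinst.ini -- one section per installed driver (written by the installer)
[Devart ODBC Driver for MySQL]
Driver=/usr/local/lib/libdevartodbcmysql.so

# /etc/odbc.ini -- one section per DSN; Driver refers to a section name above
[MySQL-Test-DSN]
Driver=Devart ODBC Driver for MySQL
```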
Configuring Data Sources Using a Text Editor

To configure data sources, you need to fill in the blanks. Of course, you also need working installations of MySQL, PostgreSQL, and SQL Server. Figure 10 shows the configuration we did with the gedit text editor.

Figure 10. Modified odbc.ini with data source configurations.

Let's dissect each of the options:

- Data Source Name (DSN) – The first entry in a data source configuration. In Figure 10, the DSNs are MySQL-Test-DSN, PostgreSQL-Test-DSN, and MSSQL-Test-DSN.
- Driver – Defines which ODBC driver the DSN uses.
- Data Source – The database server, or where the database is installed. It can be a hostname or an IP address. (We used IP addresses in Figure 10, so we hid them.)
- Database or Initial Catalog – The name of the database you want to access. The term 'Initial Catalog' is used for SQL Server; the rest use 'Database'.
- User ID and Password – The security credentials used to access the database.
- Port – The port number used by each database product. In Figure 10, the port values are the defaults for each database product.
- Schema – Specifies the schema to use for the PostgreSQL connection.

To verify that these data sources are recognized by unixODBC, run this command in the Terminal:

odbcinst -q -s

And check the output in Figure 11.

Figure 11. Verifying data source names using odbcinst.

All three DSNs are recognized by unixODBC.

Testing a DSN Using the isql Command-Line Tool

To test a DSN, use the isql command-line tool that comes with unixODBC. The syntax is:

isql <data-source-name>

Let's try this with the MySQL DSN. In the Terminal, run the command isql MySQL-Test-DSN. Once connected, you will see a response, and you can try running a simple query. See all this in Figure 12 below.

Figure 12. Using the isql command-line tool to test a DSN and run a simple query.

That's it. You can also try it with PostgreSQL and SQL Server.
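Because odbc.ini uses plain INI syntax, you can sanity-check a DSN entry with a short script before reaching for isql. This is an illustrative sketch using only the Python standard library; the required keys and the sample section are assumptions based on the options dissected above, not a rule unixODBC enforces.

```python
import configparser

# Keys we expect every DSN section to define (per the options above).
REQUIRED_KEYS = {"Driver", "User ID", "Password", "Port"}

def check_dsns(odbc_ini_text):
    """Return a dict mapping each DSN name to the set of missing keys."""
    parser = configparser.ConfigParser()
    parser.optionxform = str  # keep key case exactly as written in odbc.ini
    parser.read_string(odbc_ini_text)
    return {dsn: REQUIRED_KEYS - set(parser[dsn].keys())
            for dsn in parser.sections()}

sample = """
[MySQL-Test-DSN]
Driver = Devart ODBC Driver for MySQL
Data Source = 192.0.2.10
Database = sakila
User ID = test
Password = secret
Port = 3306
"""

print(check_dsns(sample))  # an empty set means the DSN defines all required keys
```

A DSN that passes this check can still fail in isql (wrong host, bad credentials), but it rules out the most common typo-level mistakes.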
So, if the data sources are working, your Linux ODBC setup is complete. Learn more methods to [test an ODBC connection](https://blog.devart.com/4-ways-to-test-an-odbc-connection.html) to ensure the stability of your architecture and the reliability of your data flow.

Installing and Configuring an ODBC Driver Manager on macOS

Now let's consider how to install and configure an ODBC driver manager on macOS.

5 Trouble-free Steps to Install and Configure an ODBC Driver Manager on macOS

Installing an ODBC driver manager on macOS with iODBC is straightforward. Try the steps below; the workstation used in the following examples runs Mac OS X Big Sur.

Download the iODBC Driver Manager

Get the latest [iODBC Driver Manager](https://www.iodbc.org/dataspace/doc/iodbc/wiki/iodbcWiki/Downloads). The version used in the following examples is 3.52.15. Download the file mxkozzzz.dmg, a disk image containing the package installer for iODBC.

Install the iODBC Driver Manager

Double-click the disk image file mxkozzzz.dmg. A window will open showing the iODBC-SDK.pkg package file. Open it to start the installation.

NOTE: Mac OS X will not allow you to open the package initially; you will see something similar to Figure 13 below.

Figure 13. iODBC installation blocked in Mac OS X Big Sur.

To get around this, right-click iODBC-SDK.pkg, select Open With from the context menu, then select Installer (default). You will still see the blocking message from Figure 13, but now with an option to open the package anyway. Click Open, then follow the on-screen instructions until the installation is complete.

Install ODBC Drivers

To test the driver manager, you need to configure data sources, but first you need to install ODBC drivers. In this example, we install the ODBC Driver for PostgreSQL. Note that you need a Devart account to download the package installers.
- [PostgreSQL package (devartodbcpostgresql.pkg)](https://www.devart.com/odbc/postgresql/download.html)
- [MySQL package (devartodbcmysql.pkg)](https://www.devart.com/odbc/mysql/download.html)

After the download completes, open the installer package of the ODBC driver for PostgreSQL by right-clicking it and clicking Open. A warning will ask whether you want to install the package; click Allow to proceed. Then follow the on-screen instructions: agree to the software license and proceed with the installation. Do the same with the ODBC driver for MySQL.

Figure 14. ODBC driver package installer in Mac OS X.

At the end of the installation, you will see a success message.

Figure 15. ODBC driver successful installation message on Mac OS X.

Configuring Data Sources

The best part of iODBC compared to unixODBC is its GUI, which is almost the same as in Windows. And that's not all: System DSN templates were included along with the installation of the ODBC drivers. See Figure 16 below.

Figure 16. iODBC Driver Manager and the System DSNs.

Two DSN templates were added by the installation of the Devart ODBC drivers, and you can start from them to configure a MySQL connection. To do that, select the DEVART_MYSQL DSN and click Configure. You will see that the User ID, Password, and other fields are blank; only the MySQL default port (3306) is filled in. Add your credentials and test them against the sakila sample database.

Figure 17. A modified DEVART_MYSQL System DSN.

Then do the same for DEVART_POSTGRESQL. At this point, you are ready to test database connectivity.

Test Connectivity in the iODBC Data Source Administrator

To test the connection to the MySQL data source, select the DEVART_MYSQL System DSN and click Test. You will be asked for a username and password allowed in the MySQL database. If your credentials are correct, a success message will appear.

Figure 18.
Success message after testing the DEVART_MYSQL System DSN.

Then do the same for the DEVART_POSTGRESQL System DSN. At this point, the iODBC Driver Manager is installed and configured correctly.

ODBC Driver Manager Installation in a Nutshell

Regardless of the operating system you're using, installing and configuring an ODBC driver manager is a straightforward process. Windows users benefit from a preinstalled driver manager that requires minimal setup, while Linux and macOS users can quickly install and configure their respective ODBC driver managers with just a few commands or clicks.

| Operating System | Installation | Configuration Tools |
| --- | --- | --- |
| Windows | Pre-installed since XP; no manual installation needed. | ODBC Data Source Administrator |
| Linux | Install unixODBC via the Terminal using `sudo apt-get install unixodbc unixodbc-dev`. | Text editor (e.g., gedit), Terminal tools (isql, odbcinst) |
| macOS | Download the iODBC Driver Manager (mxkozzzz.dmg) and install via the GUI. | iODBC Data Source Administrator |

Whether using the Windows ODBC Data Source Administrator, unixODBC on Linux, or iODBC on macOS, the setup remains intuitive, allowing seamless database connectivity with minimal effort.
How to Connect Excel, Power BI, Tableau, and Python to Redshift Using an ODBC Driver

By [Max Remskyi](https://blog.devart.com/author/max-remskyi) January 30, 2023

An [ODBC driver](https://www.devart.com/odbc/what-is-an-odbc-driver.html) is a software component that enables an application to interact with a database management system (DBMS) through the ODBC interface. An ODBC driver provides a standard method of accessing data stored in a variety of DBMSs, regardless of the platform or programming language being used. This allows developers to write a single application that works with multiple databases, rather than writing separate code for each one. Additionally, ODBC drivers can provide extra functionality such as connection pooling, statement caching, and data encryption.

It is important to note that an ODBC driver is not the only way to connect these tools to Redshift; JDBC, OLE DB, and native connectors can also be used, depending on the specific requirements and the capabilities of the tool and the data source. But now let's focus on how to connect to Redshift using an ODBC driver.

Table of Contents

- Introduction to Amazon Redshift
- Advantages of Using an ODBC Driver for Amazon Redshift
- Connecting to AWS Redshift via an ODBC Driver
- Using Third-Party Tools
- How to Connect Excel to Amazon Redshift
- How to Connect Tableau to Amazon Redshift
- How to Connect Power BI to Amazon Redshift
- How to Connect Python to Amazon Redshift
- How to Connect SQL Server to Amazon Redshift
- Summary

Introduction to Amazon Redshift

Amazon Redshift is a fully managed, petabyte-scale data warehouse service offered by Amazon Web Services (AWS).
It allows customers to store and analyze large amounts of data using a variety of tools, including SQL. Redshift distributes data and queries across multiple nodes, making it a popular choice for businesses that need to process and analyze large datasets in real time. A cloud data warehouse like Amazon Redshift can store, manage, and analyze structured and semi-structured data coming from a variety of sources, such as transactional systems, log files, and social media platforms. Once the data is in the warehouse, it can be queried and analyzed using SQL or other business intelligence tools, allowing businesses to gain insights from their data and make informed decisions. Integration with third-party services typically involves using APIs or connectors to import data from these services into the data warehouse. For example, Amazon Redshift has built-in connectors for popular services like Amazon S3 and Amazon RDS, as well as connectors for databases like MySQL and PostgreSQL. It also has a JDBC driver that allows it to connect to any data source that supports JDBC. Additionally, many third-party data integration and ETL tools like Talend, Alteryx, and Informatica support Amazon Redshift as a destination, so customers can easily move and transform data from various sources into Redshift for analysis. Once the data is loaded into Redshift, it can be queried, joined, and transformed using SQL, or further processed using other AWS services like Amazon EMR and Amazon SageMaker for machine learning and advanced analytics.

Advantages of Using an ODBC Driver for Amazon Redshift

There are several advantages to using an [ODBC driver for Amazon Redshift](https://www.devart.com/odbc/redshift/):

Compatibility: ODBC is a widely used open standard for connecting to databases, which means it is supported by many different tools and platforms.
This makes it easy to connect to Amazon Redshift from a wide variety of programming languages and business intelligence tools. Flexibility: ODBC allows for a wide range of configuration options, such as specifying the level of data compression and the number of connections to the cluster, as well as filtering data. This allows users to fine-tune their connection to match the specific needs of their application or use case. Ease of Use: ODBC drivers provide a simple, consistent interface for connecting to databases, making it easy to connect to Amazon Redshift even for developers who may not be familiar with the underlying protocol. Performance: The Amazon Redshift ODBC driver uses advanced performance-enhancing features such as connection pooling and multi-threading to maximize performance when querying data. Security: The Amazon Redshift ODBC driver supports all the security features offered by Amazon Redshift such as SSL/TLS encryption for data in transit, and encrypting data at rest using keys managed by AWS Key Management Service (KMS). Support: The Amazon Redshift ODBC driver is officially supported by Amazon and is regularly updated with the latest features and bug fixes. About Devart ODBC Drivers Devart, a leading provider of database tools and connectivity solutions, offers robust ODBC drivers that stand out for their rich features and numerous advantages. In this exploration, we delve into the key features and benefits that make Devart ODBC Drivers a compelling choice for organizations seeking optimal database connectivity. ODBC Key Features Cross-Database Compatibility Devart ODBC Drivers support a wide array of databases, making them a versatile choice for organizations with diverse database environments. Whether you’re working with Oracle, SQL Server, MySQL, PostgreSQL, or another popular database system, Devart ODBC Drivers ensure compatibility and consistent performance across various platforms. 
High Performance Performance is a critical factor in database connectivity, and Devart ODBC Drivers excel in this regard. Leveraging advanced optimization techniques, these drivers deliver high-speed data access and retrieval, minimizing latency and ensuring a smooth user experience. This is particularly beneficial for applications and services that require real-time or near-real-time data interactions. Secure Connectivity Security is paramount in today’s data-driven landscape, and Devart ODBC Drivers prioritize the protection of sensitive information. The drivers support industry-standard authentication and encryption protocols, ensuring that data transfers between the application and the database remain secure. This feature is crucial for compliance with data protection regulations and safeguarding against potential security threats. Easy Configuration and Deployment Devart ODBC Drivers are designed with convenience for the end user in mind. The straightforward installation process and user-friendly configuration options make it easy for developers and database administrators to set up and deploy the required driver. This simplicity accelerates the integration of the driver into existing systems, reducing implementation time and effort. Unicode Support As businesses operate on a global scale, the importance of Unicode support cannot be overstated. Devart ODBC Drivers fully embrace Unicode standards, enabling seamless handling of multilingual data. This feature is particularly valuable for organizations dealing with diverse data sources and catering to an international user base. ODBC Advantages Improved Application Performance By leveraging the high-performance capabilities of Devart ODBC Drivers, applications can access and retrieve data from databases more efficiently. This results in overall improved application performance, enhancing user satisfaction and productivity. 
Whether it’s a business intelligence tool, reporting software, or custom application, the speed and reliability of Devart ODBC Drivers contribute to a superior user experience. Reduced Development Time The simplicity of configuration and deployment significantly reduces the time required for integrating an ODBC driver into an application. Developers can focus on the application logic and functionality rather than spend excessive time on database connectivity issues. This streamlined development process is particularly advantageous in fast-paced environments where time-to-market is a critical factor. Scalability and Flexibility The cross-database compatibility of Devart ODBC Drivers ensures scalability and flexibility for organizations with evolving database requirements. As data needs grow or change, these drivers seamlessly adapt to new database environments without requiring extensive modifications to the existing codebase. This adaptability is crucial for businesses experiencing dynamic data scenarios and expanding data infrastructure. Enhanced Data Security and Compliance The robust security features of Devart ODBC Drivers provide organizations with the confidence that their data is secure and compliant with regulatory standards. This is particularly important in industries such as healthcare, finance, and government, where data protection and compliance are paramount. The drivers’ adherence to security best practices helps organizations maintain the integrity and confidentiality of their data assets. Devart ODBC Drivers emerge as reliable and feature-rich solutions for organizations seeking seamless database connectivity. With cross-database compatibility, high performance, secure connectivity, ease of configuration, and support for Unicode, every ODBC driver addresses the diverse needs of modern applications and data environments. 
By choosing Devart ODBC Drivers, organizations can unlock the full potential of their data connectivity, paving the way for enhanced performance, reduced development time, and heightened data security. Connecting Amazon Redshift to Diverse Data Sources Let’s delve deeper into the advantages and applications of establishing connections between Amazon Redshift and different data sources. Connecting Excel to Amazon Redshift Excel remains a ubiquitous tool in business, and its integration with Amazon Redshift opens up new possibilities for data analysis and reporting. By connecting Excel to Redshift, users can leverage the familiar Excel interface while tapping into the high-performance capabilities of Redshift. This integration streamlines data manipulation, analysis, and visualization, allowing organizations to make informed decisions with real-time data. One of the key benefits of connecting Excel to Amazon Redshift is the ability to handle large datasets effortlessly. Redshift’s parallel processing architecture complements Excel’s capabilities, ensuring smooth handling of massive datasets without compromising performance. Users can execute complex queries and calculations directly in Excel, with the underlying data residing in Amazon Redshift, promoting efficiency and reducing the load on local resources. Moreover, this connection facilitates seamless collaboration among team members. Multiple users can work on the same Excel file concurrently, accessing and updating data stored in Amazon Redshift. This collaborative approach enhances data accuracy and promotes a more cohesive decision-making process within organizations. You also might be interested in reading how to [import leads and contacts from Salesforce to Excel](https://blog.devart.com/import-leads-and-contacts-from-salesforce-to-excel.html) . Connecting Python to Amazon Redshift Python’s versatility and extensive libraries make it a preferred language for data analysis and manipulation. 
Connecting Python to Amazon Redshift extends the capabilities of both tools, offering a powerful solution for data scientists, analysts, and developers. Through the integration of Python with Amazon Redshift, users can perform advanced analytics, machine learning, and statistical modeling directly on the data stored in Redshift. This eliminates the need for time-consuming data extraction and processing, as the analysis can be conducted in real time within the Redshift environment. The seamless connection empowers data scientists to derive valuable insights more efficiently, accelerating the pace of innovation. Furthermore, Python’s support for various data visualization libraries, such as Matplotlib and Seaborn, enhances the presentation of analytical results. Users can create dynamic and interactive visualizations directly from Amazon Redshift data, providing a more engaging and insightful representation of information. Connecting Power BI to Amazon Redshift Power BI has emerged as a dominant player in the business intelligence landscape, and its integration with Amazon Redshift offers a potent solution for organizations seeking dynamic and interactive reporting capabilities. By connecting Power BI to Redshift, users can create visually appealing dashboards and reports that get updated in real time based on the latest data from Redshift. This integration facilitates a seamless data flow from Amazon Redshift to Power BI, enabling users to leverage Power BI’s rich visualization features without compromising on performance. The direct connectivity ensures that insights are derived from the most up-to-date data, providing a more accurate and reliable foundation for decision-making. Additionally, Power BI’s cloud-based nature aligns well with Amazon Redshift’s cloud infrastructure, promoting scalability and flexibility. Organizations can scale their analytics efforts effortlessly, accommodating growing data volumes and evolving business requirements. 
Connecting Tableau to Amazon Redshift Tableau, renowned for its intuitive and interactive data visualization capabilities, becomes even more powerful when connected to Amazon Redshift. The integration of Tableau and Redshift empowers users to create compelling visualizations and interactive dashboards that can be shared across the organization. One notable advantage of connecting Tableau to Amazon Redshift is the ability to handle complex queries and aggregate large datasets. Redshift’s parallel processing architecture complements Tableau’s ability to process and visualize vast amounts of data, ensuring a smooth and responsive user experience even with extensive datasets. Furthermore, the direct connection between Tableau and Redshift streamlines the data preparation process. Users can easily extract, transform, and load (ETL) data from Redshift into Tableau, eliminating the need for manual data manipulation. This accelerates the time-to-insight and allows organizations to focus on deriving meaningful conclusions from their data. Connecting Amazon Redshift to diverse data sources such as Excel, Python, Power BI, and Tableau opens up a world of possibilities for organizations seeking to extract maximum value from their data. The seamless integration between these tools and Amazon Redshift enhances data analysis, visualization, and collaboration, providing a robust foundation for data-driven decision-making. As businesses continue to generate and accumulate vast amounts of data, the ability to connect, analyze, and derive actionable insights from diverse data sources becomes a strategic advantage. The combination of Amazon Redshift and these popular tools ensures that organizations can harness the full potential of their data, driving innovation, and staying ahead in the competitive landscape of modern analytics. 
In addition, we suggest you check our article that explains how to [connect to a HubSpot database and retrieve its data from Tableau, Excel, and Power BI using the ODBC driver for HubSpot](https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html).

Connecting to AWS Redshift via an ODBC Driver

To set up a connection to Amazon Redshift using an ODBC driver, follow these basic steps:

1. Install the Amazon Redshift ODBC driver on your machine. The driver can be downloaded from the Amazon Redshift website.
2. Create a new Amazon Redshift cluster if you do not have one already. Make sure to note down the endpoint, port, and cluster name.
3. Create a new Data Source Name (DSN) using the ODBC Data Source Administrator. The DSN stores all the information needed to connect to your cluster, such as the endpoint, port, and database name.
4. Configure the DSN with your cluster's credentials, such as the master username and password.
5. Test the connection using the Test button on the DSN configuration page. If the test is successful, you will be able to connect to your Amazon Redshift cluster using the DSN.

Once the DSN is set up, you can use it to connect to the Amazon Redshift cluster from various tools and applications that support ODBC connections, such as Tableau, Microsoft Excel, or Python. Please note that you will also need to open your firewall to allow connections on the port you use to connect to Redshift. Depending on the tool, you may also need to configure additional settings, such as the SSL mode and the CA certificate path.

Using Third-Party Tools

In addition to the Amazon Redshift ODBC driver, there are several other drivers that use the ODBC standard to connect to a variety of data sources.
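As a rough illustration of what the DSN encodes, here is a minimal Python sketch that assembles the equivalent DSN-less connection string. The driver name and endpoint are placeholders (use the exact name your installed Redshift ODBC driver registers under), and a real deployment would add the SSL settings mentioned above.

```python
def redshift_conn_str(server, database, user, password, port=5439):
    """Assemble a DSN-less ODBC connection string for Redshift.

    The driver name below is a placeholder; substitute the name your
    installed Redshift ODBC driver actually registers under.
    """
    parts = {
        "Driver": "{Amazon Redshift (x64)}",
        "Server": server,
        "Database": database,
        "UID": user,
        "PWD": password,
        "Port": str(port),
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

# Placeholder endpoint and credentials; an ODBC-aware tool would hand
# this string to the driver manager, which loads the Redshift driver.
cs = redshift_conn_str("example.redshift.amazonaws.com", "dev", "awsuser", "secret")
print(cs)
```

Configuring a DSN in the ODBC Data Source Administrator stores the same attributes under a name, so tools only need `DSN=<name>` plus credentials.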
Some examples include:

- Microsoft [ODBC Driver for SQL Server](https://docs.devart.com/odbc/sqlserver/overview.htm): connect to a SQL Server database using ODBC.
- [MySQL Connector/ODBC](https://docs.devart.com/odbc/mysql/overview.htm): connect to a [MySQL database using ODBC](https://blog.devart.com/connect-excel-to-mysql-and-import-data.html).
- [PostgreSQL ODBC Driver](https://docs.devart.com/odbc/postgresql/overview.htm): connect to a PostgreSQL database using ODBC.
- [Oracle ODBC Driver](https://docs.devart.com/odbc/oracle/overview.htm): connect to an Oracle database using ODBC.
- IBM DB2 ODBC Driver: connect to an IBM DB2 database using ODBC.
- [SQLite ODBC Driver](https://docs.devart.com/odbc/sqlite/overview.htm): connect to an SQLite database using ODBC.
- Teradata ODBC Driver: connect to a Teradata database using ODBC.

These are just a few of the many drivers available for [easy connection to various data sources](https://www.devart.com/odbc/third-party-tools.html) using ODBC. Others include drivers for NoSQL databases like MongoDB and Cassandra, and for data sources like Salesforce, Google BigQuery, and Amazon Athena.

How to Connect Excel to Amazon Redshift

There are a few different ways to connect Microsoft Excel to Amazon Redshift or [SQL Server](https://blog.devart.com/import-from-excel-to-sql-server-html.html), but one common method is to use the Microsoft Power Query add-in. Here are the general steps:

1. Install the Power Query add-in for Excel if you haven’t already.
2. Open Excel and go to the Data tab.
3. Click From Other Sources and select From Data Connection Wizard.
4. Select Amazon Redshift and enter your Amazon Redshift connection details, including the server name, port, database name, and credentials.
5. Select the tables you want to import and click Finish.

The data from Amazon Redshift will be [imported into Microsoft Excel](https://blog.devart.com/how-to-import-data-to-excel-via-odbc.html) and can be used for further analysis and reporting. Please note that you need to have an Amazon Redshift cluster and credentials set up before you can connect via Power Query. You might also check [how to connect an Oracle database to Excel](https://blog.devart.com/connect-oracle-database-to-excel-import-your-data-in-minutes.html). Besides, the Excel Add-in for Freshdesk now allows you to [connect via Freshdesk API v2](https://blog.devart.com/support-for-excel-2019-and-freshdesk-api-v2-in-devart-excel-add-ins-2-0.html), providing access to more Freshdesk objects.

How to Connect Tableau to Amazon Redshift

1. Open Tableau and click Connect to Data.
2. Select Amazon Redshift from the list of data sources.
3. In the Amazon Redshift connection dialog box, enter the server name and port number for your Amazon Redshift cluster.
4. Enter your database name and credentials, then click Sign In.
5. Select the schema and tables you want to connect to, then click Connect.

Once connected, you can start building visualizations and analyzing data in Tableau.

How to Connect Power BI to Amazon Redshift

1. Open Power BI Desktop and click Get Data in the Home ribbon.
2. Select Amazon Redshift from the list of data sources.
3. In the Amazon Redshift connection dialog box, enter the server name and port number for your Amazon Redshift cluster.
4. Enter your database name and credentials, then click OK.
5. Select the schema and tables you want to connect to, then click Load.

Once connected, you can start building visualizations and analyzing the data in Power BI.

How to Connect Python to Amazon Redshift

Install the necessary libraries: psycopg2 and sqlalchemy.
Then:

- Import the libraries and create a connection string with the necessary details, such as the server name, port, database name, and credentials.
- Use the create_engine function from the sqlalchemy library to create an engine for the Amazon Redshift database using the connection string.
- Use the engine's connect method to establish a connection to the database.
- Use the connection's execute method to run SQL queries against the connected database.

You can connect either directly with the psycopg2 library or through the sqlalchemy engine described above. Please make sure that the appropriate ports are open for your Redshift cluster and that the IAM user has the necessary permissions to connect.

How to Connect SQL Server to Amazon Redshift

1. Use SQL Server Management Studio (SSMS) to connect to your SQL Server instance.
2. In SSMS, open the Object Explorer and navigate to the Server Objects node.
3. Right-click the Linked Servers node and select New Linked Server.
4. In the New Linked Server dialog box, enter the name of the Amazon Redshift server you want to connect to and select Other data source in the Provider dropdown.
5. On the Security tab, configure the security context under which the linked server runs. You can use either Be made using the login’s current security context or Be made using this security context and provide the username and password of a Redshift user.
6. Click OK to create the linked server.

Once the linked server is created, you can query it using the OPENQUERY or OPENROWSET function. Please note that you need to have an Amazon Redshift cluster and credentials set up before you can connect via SQL Server. You should also have the SQL Server Native Client installed on the SQL Server machine, make sure that the appropriate ports are open for your Redshift cluster, and confirm that the IAM user has the necessary permissions to connect.
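The Python steps above can be sketched as follows. This is a minimal example, not a definitive implementation: the cluster endpoint, database name, and credentials are placeholders, and the helper function names are our own. The generic PostgreSQL dialect is used for the SQLAlchemy URL here; the separate sqlalchemy-redshift package also provides a Redshift-specific dialect.

```python
# Sketch: connecting Python to Amazon Redshift with psycopg2 or sqlalchemy.
# All endpoints and credentials below are placeholder values.

def redshift_dsn(host, port, dbname, user, password):
    """Build a libpq-style DSN string accepted by psycopg2.connect()."""
    return f"host={host} port={port} dbname={dbname} user={user} password={password}"

def redshift_url(host, port, dbname, user, password):
    """Build a SQLAlchemy database URL (psycopg2 acts as the DBAPI driver)."""
    return f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{dbname}"

def query_with_psycopg2(dsn, sql):
    """Run a query over a live connection (needs `pip install psycopg2-binary`
    and network access to the cluster)."""
    import psycopg2
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchall()

def query_with_sqlalchemy(url, sql):
    """Same query through SQLAlchemy's create_engine / connect / execute flow."""
    from sqlalchemy import create_engine, text
    engine = create_engine(url)
    with engine.connect() as conn:
        return conn.execute(text(sql)).fetchall()

# Placeholder endpoint and credentials:
dsn = redshift_dsn("examplecluster.abc123.us-east-1.redshift.amazonaws.com",
                   5439, "dev", "awsuser", "my_password")
url = redshift_url("examplecluster.abc123.us-east-1.redshift.amazonaws.com",
                   5439, "dev", "awsuser", "my_password")
# rows = query_with_psycopg2(dsn, "SELECT current_date")
# rows = query_with_sqlalchemy(url, "SELECT current_date")
```

The two query helpers are interchangeable; sqlalchemy adds connection pooling and an ORM layer on top of the same psycopg2 driver.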
Summary

Connecting Microsoft Excel, Power BI, Tableau, and Python to a Redshift database using an ODBC driver is relatively easy. Excel, Power BI, and Tableau all have built-in support for connecting to data sources through an ODBC driver, so you can simply select ODBC as the connection type and enter the appropriate connection details (such as the hostname and database name) in the corresponding fields. Once connected, you can create reports and visualizations based on the data in the Redshift database. Python also supports ODBC data sources through libraries such as pyodbc, which provides a Python interface for interacting with databases that support ODBC. With pyodbc, you can connect to Redshift using a simple connection string and then use standard Python commands to execute queries and retrieve data.

In summary, using an ODBC driver to connect Excel, Power BI, Tableau, and Python to Redshift is a straightforward and reliable option. Besides, there are many other [ODBC Drivers by Devart](https://www.devart.com/odbc/) for the most popular services, so don’t hesitate to check them out!
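As a sketch of the pyodbc approach mentioned in the summary: the DSN name and credentials below are placeholders, and the DSN itself must already exist in the ODBC Data Source Administrator, as described earlier.

```python
# Sketch: querying Redshift through a pre-configured ODBC DSN with pyodbc.
# "RedshiftDSN" and the credentials are placeholders for your own setup.

def odbc_connection_string(dsn, uid, pwd):
    """Build a pyodbc connection string that references a configured DSN."""
    return f"DSN={dsn};UID={uid};PWD={pwd}"

def fetch_rows(conn_str, sql):
    """Open the ODBC connection and run a query (needs `pip install pyodbc`)."""
    import pyodbc
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.execute(sql)
        return cur.fetchall()

conn_str = odbc_connection_string("RedshiftDSN", "awsuser", "my_password")
# rows = fetch_rows(conn_str, "SELECT * FROM sales LIMIT 10")
```

Because the DSN already stores the endpoint, port, and database name, the connection string only needs to name the DSN and supply credentials.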
By [Max Remskyi](https://blog.devart.com/author/max-remskyi), DAC Team

How to Connect MySQL to Excel – A Guide to Importing MySQL Data Into Excel

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), July 25, 2023

Connect Excel to MySQL and Take Control of Your Data Import

Microsoft Excel is a powerful tool for data analysis, and MySQL is a popular open-source relational database management system. Integrating MySQL with Excel can offer quite a few advantages for data analysis and reporting. This article will show several methods to connect MySQL to Excel and transfer data between them.

Introduction to Microsoft Excel

Microsoft Excel is a powerful data analysis, manipulation, and visualization tool. It allows users to organize and analyze data in various ways, including charts, graphs, and pivot tables. Excel is widely used in businesses and industries worldwide to store, analyze, and present data meaningfully. Here are some of the specialists who commonly use Microsoft Excel:

- Accountants: Excel is a go-to tool for accountants, as it provides powerful data analysis capabilities, supports complex calculations, and allows for the creation of financial statements, budgets, and forecasts.
- Business analysts: Business analysts use Excel to analyze data and create reports that help businesses make informed decisions. They use Excel to identify trends, patterns, and insights from large datasets, create charts and graphs, and develop visualizations.
- Data analysts and scientists: Excel is often the first tool used by data analysts and scientists to clean, organize, and analyze data.
Excel’s formulae, data filters, conditional formatting, and pivot tables make it easy to manipulate and analyze data quickly.

- Project managers: Project managers use Excel to track budgets, create schedules, and manage resources. Excel’s project management templates help managers stay organized and handle tasks, milestones, and deadlines.
- Sales and marketing professionals: Excel can be used to track customer data, create sales reports, and analyze market trends. It can also be used to create charts and graphs that visually represent data and to create marketing plans.
- Educators: Educators use Excel to manage student data, track grades, and create lesson plans. Excel is useful for organizing and analyzing student performance and progress data.

Advantages of Integrating MySQL and Excel

[Integrating MySQL and Excel](https://blog.devart.com/how-to-import-data-to-excel-via-odbc.html) can be beneficial for several reasons:

- Data analysis and reporting: By integrating MySQL and Excel, users can analyze data and create comprehensive data reports. Excel’s pivot tables, charts, and graphs can be used to visualize data from MySQL databases, making it easy to analyze and identify patterns and trends in data.
- Streamlined data management: Integrating MySQL and Excel provides a streamlined approach to data management. Instead of manually importing data from MySQL into Excel, users can establish a connection between the two systems and automate the transfer of data. This saves time and reduces the risk of errors.
- Data collaboration: Integrating MySQL and Excel makes collaboration on data analysis projects easier. Multiple users can access the same MySQL database from different locations and work on Excel reports simultaneously, facilitating communication and collaboration among team members.
Besides, the [Excel Add-in for Freshdesk now allows you to connect via Freshdesk API v2](https://blog.devart.com/support-for-excel-2019-and-freshdesk-api-v2-in-devart-excel-add-ins-2-0.html), providing access to more Freshdesk objects.

- Complex calculations: Excel’s formulae and functions can be used to perform complex calculations on MySQL data. This is useful for financial modeling, forecasting, and statistical analysis.
- Automated reporting: Integrating MySQL and Excel enables users to automate reporting tasks. Users can create Excel templates that automatically pull data from MySQL databases and generate reports.

Let’s sum it up: [integrating MySQL and Excel](https://blog.devart.com/import-from-excel-to-sql-server-html.html) can offer several benefits, including data analysis, streamlined data management, collaboration, complex calculations, and automated reporting. By taking full advantage of both systems, users can create powerful data-driven reports that help them make quick business decisions.

How to Connect Excel to MySQL: Best Methods

There are several methods of connecting Excel to MySQL, including the Devart ODBC Driver for MySQL, cloud connectors such as Skyvia, and the native driver for MySQL. You might also be interested in [the system requirements for connecting Oracle Database to Excel](https://blog.devart.com/connect-oracle-database-to-excel-import-your-data-in-minutes.html).

Using Devart ODBC Driver for MySQL

[Devart ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) is a powerful tool that enables users to connect Excel to MySQL. The driver is easy to install and provides a range of features that simplify transferring data from MySQL into Excel.

1. Download and install the Devart ODBC Driver for MySQL on your computer.
2. Open Excel and go to the Data tab.
3. Click From Other Sources and select From Microsoft Query.
4. In the Choose Data Source window, select Devart ODBC and click OK.
5. In the Devart ODBC MySQL Connection window, enter your MySQL server name, port number, and database name.
6. Click Test Connection to verify the connection.
7. Once the connection is established, select the data you want to import from MySQL into Excel and click OK to import it.

Using a third-party ODBC driver like the Devart ODBC Driver for MySQL with Excel has several advantages:

- Improved performance: the Devart ODBC Driver is designed to optimize performance and improve query execution time compared to the native ODBC driver. This can be especially beneficial for users with large datasets and complex queries.
- Greater compatibility: the Devart ODBC Driver supports many MySQL versions and configurations, ensuring greater compatibility with different MySQL database setups. In contrast, the native ODBC driver may have limitations in terms of compatibility with certain MySQL versions.
- Advanced features: the Devart ODBC Driver offers several advanced features unavailable with the native ODBC driver. For example, it supports a wider range of SQL commands, has better error reporting, and provides more detailed logging and tracing options.
- Easy to install and configure: the Devart ODBC Driver is easy to install and configure, with a simple and intuitive user interface. It comes with clear instructions and documentation, making it easy for users to get it up and running quickly.
- Better support: Devart offers dedicated technical support for its ODBC Driver, with a team of experts available to assist users with any issues. This gives users greater peace of mind and helps them quickly resolve technical issues.

You can download the driver for free and check the advantages yourself!

Using Cloud Connectors

[MySQL connectors](https://skyvia.com/connectors/mysql) such as Skyvia offer an easy and convenient way to connect Excel to MySQL. Skyvia is a cloud-based platform that provides a range of data integration services.
Let’s see how you can connect Excel to MySQL with its help.

1. Sign up for a Skyvia account and create a new data integration project.
2. Select MySQL as the source data connector and enter your MySQL server name, port number, username, and password.
3. Select Excel as the destination data connector and enter your Excel file and worksheet names.
4. Map the fields you want to import from MySQL into Excel and click Run to start the import process.

Cloud connectors have a few advantages, such as providing a convenient and easy way to establish an [Excel cloud connection](https://skyvia.com/excel-add-in/). They provide better security and data management features compared to other methods. Also, they offer more advanced features, such as data transformation and mapping, which can be useful in complex data integration scenarios.

Using the Native Driver for MySQL

Excel also works with the native MySQL ODBC driver, allowing users to connect to a MySQL database directly from Excel. This driver is easy to use and requires no additional software beyond the MySQL connector itself.

1. Open Excel and go to the Data tab.
2. Click From Other Sources and select From Microsoft Query.
3. In the Choose Data Source window, select MySQL ODBC 5.3 ANSI Driver and click OK.
4. In the MySQL Connector/ODBC Data Source Configuration window, enter your MySQL server name, port number, username, and password.
5. Click Test to verify the connection.
6. Once the connection is established, select the data you want to import from MySQL into Excel and click OK to import it.

Exporting and Importing Data From MySQL to Excel

Once you have established a connection between MySQL and Excel, you can easily transfer data between the two systems. Here’s how you can do it using the Devart ODBC Driver for MySQL:

To export data

1. Open Excel and go to the Data tab.
2. Click From Other Sources and select From Microsoft Query from the drop-down menu.
3. Select Devart ODBC Driver for MySQL from the list of drivers and click Connect.
4. Enter the necessary connection details, such as the server name, username, and password, in the Data Source Configuration Wizard. Then click Test Connection to ensure the connection works properly.
5. Once the connection is established, select the database and table to export data from.
6. In the Microsoft Query window, select the columns you want to export by selecting the checkbox next to each column name.
7. Click Return Data and select the Table option to export the data to Excel.
8. Choose the location where you want to save the Excel file and click OK to export your data.

To import data

1. Open Excel and go to the Data tab.
2. Click From Other Sources and select From Microsoft Query from the drop-down menu.
3. Select Devart ODBC Driver for MySQL from the list of drivers and click Connect.
4. Enter the necessary connection details, such as the server name, username, and password, in the Data Source Configuration Wizard and click Test Connection to ensure the connection works properly.
5. Once the connection is established, select the database and table that you want to import your data to.
6. In the Microsoft Query window, click the SQL button to open the SQL editor.
7. Enter the SQL command to insert data into the MySQL table. For example, if you want to insert data into a table named customers, you can enter the following SQL command: INSERT INTO customers (name, email, phone) VALUES (?, ?, ?). Note that the question marks represent parameters that will be filled in with the actual data from the Excel file.
8. Click Parameters and enter the cell references for the data you want to import from Excel. For example, if you want to get data from cells A2, B2, and C2, enter the following cell references: A2, B2, C2.
9. Click OK to save the parameters. Then click OK again to execute the SQL command and import your data from Excel to MySQL.
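The same parameterized INSERT can be sketched outside Microsoft Query as well, for instance from Python with pyodbc, which uses the identical ? placeholder style. This is an illustrative sketch: the table name, columns, sample row, and DSN name are examples, not part of the article's walkthrough.

```python
# Sketch: inserting spreadsheet-style rows into a MySQL table using the
# same "?" parameter placeholders the article shows. Names are examples.

def build_insert(table, columns):
    """Build a parameterized INSERT statement with one "?" per column."""
    placeholders = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"

def insert_rows(conn_str, table, columns, rows):
    """Execute the INSERT for every row (needs `pip install pyodbc` and a
    MySQL ODBC driver referenced by the connection string)."""
    import pyodbc
    sql = build_insert(table, columns)
    with pyodbc.connect(conn_str) as conn:
        conn.cursor().executemany(sql, rows)
        conn.commit()

sql = build_insert("customers", ["name", "email", "phone"])
# -> INSERT INTO customers (name, email, phone) VALUES (?, ?, ?)

rows = [("Ann", "ann@example.com", "555-0100")]  # e.g. values read from cells A2, B2, C2
# insert_rows("DSN=MySQLDSN;UID=user;PWD=pass", "customers",
#             ["name", "email", "phone"], rows)
```

Passing the values as parameters rather than concatenating them into the SQL string is what makes this safe against malformed data and SQL injection.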
You can also check this guide that will help you [import leads and contacts from Salesforce to Excel](https://blog.devart.com/import-leads-and-contacts-from-salesforce-to-excel.html) without unforeseen challenges. Besides, you might be interested in this article that explains [how to connect to a HubSpot database and retrieve its data using Tableau, Excel, Power BI, and the ODBC Driver for HubSpot](https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html).

Unveil the Power of Excel Add-ins

A seamless connection between MySQL and Excel has become crucial for users seeking efficient data integration. While there are various methods to achieve this connection, one powerful approach involves the use of Excel Add-ins. Let’s explore the capabilities of connecting MySQL to Excel through Excel Add-ins and highlight the advantages they offer over the traditional ODBC connection.

A Gateway to Effortless Integration

The [Excel Add-in for MySQL](https://www.devart.com/excel-addins/mysql/) serves as a valuable extension that enhances Excel’s functionality by enabling users to connect directly to external data sources, such as MySQL databases. This method eliminates the need for complex coding or manual data entry, streamlining the data import process and enhancing overall productivity.

Key Features of Excel Add-ins for MySQL Connectivity:

- User-Friendly Interface: Excel Add-ins provide a user-friendly interface that simplifies the process of connecting to MySQL databases. Users can navigate through the setup with ease.
- Real-Time Data Updates: One notable advantage of using Excel Add-ins is the ability to retrieve data from MySQL databases in real time. This ensures that users are working with the latest information, eliminating the need for manual updates.
- Data Mapping and Transformation: Excel Add-ins often come equipped with features for data mapping and transformation.
This allows users to customize how data is imported and displayed in Excel, providing greater control over the integration process.

- Query Building Capabilities: Users can build and execute SQL queries directly within Excel, facilitating advanced data retrieval and analysis. This feature empowers users to tailor queries to their specific requirements, enhancing the flexibility of data extraction.

ODBC vs. Excel Add-ins

While both ODBC and Excel Add-ins facilitate MySQL-Excel connectivity, the latter presents distinct advantages that set it apart.

Advantages of Excel Add-ins:

- Simplified Configuration: Excel Add-ins typically offer a more straightforward setup compared to ODBC connections. This simplicity reduces the learning curve for users and accelerates the integration process.
- Enhanced Security: Excel Add-ins often leverage secure authentication methods, enhancing the overall security of data transfers between MySQL and Excel. This is particularly crucial when handling sensitive information.
- Dynamic Data Refresh: Unlike ODBC connections that may require manual refreshing, Excel Add-ins can be configured for dynamic data refresh, ensuring that the Excel spreadsheet reflects real-time changes in the connected MySQL database.
- Advanced Functionality within Excel: Excel Add-ins seamlessly integrate with Excel’s native functions and features, allowing users to leverage the full spectrum of Excel’s capabilities for data analysis, visualization, and reporting.

While ODBC connections offer a viable method for connecting MySQL to Excel, the advantages presented by Excel Add-ins make them a compelling choice. The streamlined setup, enhanced security, and seamless integration with Excel’s features position Excel Add-ins as a powerful tool for users seeking efficient and dynamic MySQL-Excel connectivity.

Perform a Last Check-Up

With the right tools and techniques, you can easily establish a connection between MySQL and Excel.
One of the most efficient ways to achieve this is by using Devart ODBC drivers, which offer a range of features designed to simplify the integration process and maximize productivity. Here are some valuable tips to help you connect MySQL to Excel quickly and easily using Devart ODBC drivers:

Ensure Compatibility

Before getting started, make sure that your MySQL server version is compatible with the Devart ODBC driver you intend to use. Devart provides detailed documentation outlining the supported MySQL versions for each driver, ensuring a smooth integration process.

Download and Install the Driver

Begin by downloading the appropriate Devart ODBC driver for MySQL from the official website. The installation process is straightforward and typically involves following a few simple steps in the installation wizard. Once installed, you’re ready to establish a connection between Excel and MySQL.

Configure the Connection

Open Excel and navigate to the Data tab. From there, select From Other Sources and choose From Microsoft Query. In the Choose Data Source window, select the Devart ODBC driver you installed and configure the connection settings: enter the MySQL server name, port number, database name, username, and password. Click Test Connection to ensure that the connection is established successfully.

Optimize Performance

Devart ODBC drivers are optimized for performance, ensuring efficient data retrieval and query execution. To further improve performance, consider optimizing your MySQL database by indexing frequently queried columns and minimizing the use of complex joins and subqueries where possible. This will help enhance overall query performance and streamline data retrieval.

Utilize Advanced Features

Devart ODBC drivers offer a range of advanced features that can further enhance your data integration process.
Take advantage of features such as bulk updates, inserts, and support for SSL encryption to optimize data transfer security and efficiency. Also, explore the driver’s documentation to leverage any features tailored to your integration requirements.

Stay Updated

Devart regularly releases updates and enhancements to its ODBC drivers, ensuring compatibility with the latest MySQL versions and addressing any potential issues or bugs. Stay informed about new releases and updates by subscribing to the Devart newsletter or checking the website regularly. Updating your ODBC driver to the latest version will ensure optimal performance and compatibility with your MySQL environment.

Seek Support When Needed

If you encounter any difficulties or have questions during the integration process, don’t hesitate to reach out to the Devart tech support team. They’re available to provide assistance and guidance to help resolve any issues and ensure a successful integration between MySQL and Excel.

By following these valuable tips and leveraging the power of Devart ODBC drivers, you can connect MySQL to Excel quickly and easily, empowering you to unlock the full potential of your data and streamline your data analysis workflow. Whether you’re a seasoned data analyst or a novice user, Devart ODBC drivers offer the tools and support you need to achieve seamless integration and drive actionable insights from your data.

Conclusion

Connecting Excel to MySQL may seem daunting at first, but it can be relatively straightforward with the right tools and instructions. Various methods are available to connect Excel to MySQL, including the corresponding Devart ODBC Driver, cloud connectors like Skyvia, and the native MySQL driver. While each method has its advantages and disadvantages, the [Devart ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) is popular due to its compatibility, ease of use, and advanced features.
By following the instructions provided for each method, users can easily establish a connection between Excel and MySQL and transfer data between them. In addition to the Devart ODBC Driver for MySQL, there are several other Devart ODBC drivers available for different database management systems. Devart specializes in providing database management solutions, and its ODBC drivers are designed to facilitate easy and efficient data access between various database systems and applications. The rich range of Devart ODBC drivers covers the most popular database systems:

- Devart ODBC Driver for Oracle: allows users to connect Excel to Oracle databases and provides advanced features such as support for Oracle Advanced Security and SSL connections.
- Devart ODBC Driver for SQL Server: enables users to connect Excel to SQL Server databases and provides features such as support for bulk updates and inserts and enhanced performance through query optimization.
- Devart ODBC Driver for PostgreSQL: allows users to connect Excel to PostgreSQL databases and provides features such as support for SSL encryption and PostgreSQL-specific data types.
- Devart ODBC Driver for SQLite: enables users to connect Excel to SQLite databases and provides features such as support for SQLite-specific data types and compatibility with SQLite encryption extensions.

By providing a range of ODBC drivers for different database systems, Devart aims to make it easier for users to access and manage their data across different platforms and applications.
Connect MongoDB Databases in the Cloud Using Devart ODBC Driver for MongoDB

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), July 1, 2024
Contents: Objective · Technology Background · Prerequisites · Architecture · Register MongoDB Atlas Account · Install Devart ODBC for MongoDB · Get MongoDB Connection in MongoDB Atlas · ODBC Driver Configuration · Create ODBC Connection in Power BI Desktop · Conclusion

Objective

This tutorial explains how to build credit card analytics with Power BI Desktop, the Devart ODBC Driver for MongoDB, and a MongoDB database. It does not teach you how to work with Power BI Desktop or MongoDB themselves; instead, it demonstrates how to work with the Devart ODBC Driver for MongoDB.

Technology Background

Power BI Desktop. Microsoft Power BI Desktop is built for the analyst. It combines state-of-the-art interactive visualizations with built-in, industry-leading data query and modeling, lets you create and publish reports to Power BI, and helps you deliver timely, critical insights anytime, anywhere.

MongoDB Database & MongoDB Atlas. MongoDB is a source-available, cross-platform, document-oriented database program. Classified as a NoSQL product, it stores JSON-like documents with optional schemas and offers the scalability and flexibility you want along with the querying and indexing you need. MongoDB Atlas is MongoDB's developer data platform built around a fully managed MongoDB service; it addresses transactional, search, and analytical workloads.

Devart ODBC for MongoDB. [Devart ODBC Driver for MongoDB](https://www.devart.com/odbc/mongodb/) is a high-performance connectivity tool with enterprise-level features for accessing MongoDB databases from ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows, macOS, and Linux. The driver fully supports standard ODBC API functions and data types, enabling easy and secure access to live MongoDB data from anywhere.
Prerequisites

Before connecting to the MongoDB database with the Devart ODBC Driver for MongoDB and building the analytics, download the necessary tools and drivers:

- Download [Power BI Desktop](https://www.microsoft.com/en-us/download/details.aspx?id=58494)
- Download the [Devart ODBC Driver for MongoDB](https://www.devart.com/odbc/mongodb/)
- [Register a MongoDB Atlas account](https://www.mongodb.com/cloud/atlas/register) to create a document database in the cloud

Architecture

Power BI Desktop uses the Devart ODBC Driver for MongoDB to connect to the MongoDB database hosted in the cloud and query data for building analytics.

Register MongoDB Atlas Account

If you do not have a MongoDB Atlas account, use the [registration link](https://www.mongodb.com/cloud/atlas/register) to create one. Once you have the account, you can create a MongoDB database and use the Devart ODBC Driver for MongoDB to connect to it. Below is an example of the MongoDB Atlas dashboard.

Install Devart ODBC for MongoDB

Download and run the installer file, then follow the wizard:

1. Select the destination location.
2. Select Full installation.
3. Click Next to start the installation.

After the installation, we continue with configuring the driver.

Get MongoDB Connection in MongoDB Atlas

To set up the ODBC configuration in the next step, we need the connection details of the MongoDB database in MongoDB Atlas:

1. On the main page of MongoDB Atlas, select Connect -> Drivers.
2. In the drop-down list of drivers, select the required one and copy the DNS name from the connection string. In this case, the DNS name is cluster0.hvkaglr.mongodb.net; we will use it in the ODBC configuration.

Notice that MongoDB databases created in the cloud run in a clustering mechanism.
Consequently, we do not connect to an IP address or server name of any single MongoDB server; instead, we connect via the DNS server name. As the image below shows, Cluster 0 contains three servers: one primary and two secondaries.

On the main page, click Browse Collections to explore the list of databases and their collections. The sample database sample_mflix contains the collections users, movies, theaters, sessions, and embedded_movies. sample_mflix is the database we will use in the ODBC configuration; its collections map to tables in relational database terms.

ODBC Driver Configuration

In this step, we configure the ODBC connection in your environment; in this tutorial, the driver is installed on a Windows machine:

1. Click Start > Run and type C:\Windows\System32\odbcad32.exe to open the ODBC Data Source Administrator (on 64-bit Windows this opens the 64-bit administrator; the 32-bit one is in C:\Windows\SysWOW64).
2. Click the Drivers tab and make sure Devart ODBC Driver for MongoDB is in the list of drivers.
3. Select the User DSN or System DSN tab and click Add. The Create New Data Source dialog will appear.
4. Select Devart ODBC Driver for MongoDB and click Finish. The driver setup dialog will open.
5. Enter the connection information in the appropriate fields:
   - Data Source Name: ODBC_MongoDBAtlas
   - Server: cluster0.hvkaglr.mongodb.net (the DNS name obtained from MongoDB Atlas)
   - Port: 27017 (the default MongoDB port)
   - User ID: mongodb_admin (the user created along with the cluster in MongoDB Atlas)
   - Password: the password of the mongodb_admin user
   - Database: sample_mflix (the database identified in MongoDB Atlas)
   - Connection Format: DNS Seed List. This setting is important because we connect to MongoDB databases created in the cloud in MongoDB Atlas.
6. Click Test Connection to verify the connection.
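For scripted access, the same settings can also be expressed as a DSN-less connection string. Below is a minimal Python sketch; the keyword names mirror the dialog fields above and are illustrative assumptions, so check the Devart driver documentation for the exact keywords it accepts.

```python
# Minimal sketch: assemble the DSN settings above into a DSN-less ODBC
# connection string. Keyword names are illustrative, not verified against
# the Devart driver's documentation.

def build_conn_str(params: dict) -> str:
    """Join key=value pairs into an ODBC-style connection string."""
    return ";".join(f"{key}={value}" for key, value in params.items())

conn_str = build_conn_str({
    "Driver": "{Devart ODBC Driver for MongoDB}",
    "Server": "cluster0.hvkaglr.mongodb.net",
    "Port": 27017,
    "Database": "sample_mflix",
    "User ID": "mongodb_admin",
    "Password": "your_password",  # placeholder credential
})
print(conn_str)
```

A tool such as pyodbc could then be handed this string (e.g., `pyodbc.connect(conn_str)`), assuming the driver is registered under that name on the machine.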
Create ODBC Connection in Power BI Desktop

In this step, we use the Devart ODBC Driver for MongoDB to establish the connection bridge between Power BI Desktop and the MongoDB database in MongoDB Atlas. Make sure that Power BI Desktop is fully installed.

1. Open Power BI Desktop, click Get Data, and in the Get Data window choose Other. The left panel lists the many connectors Power BI supports.
2. Choose ODBC and click Connect.
3. From the Data source name drop-down list, choose ODBC_MongoDBAtlas, the DSN created in the ODBC Driver Configuration step, and click OK.
4. Enter the username and password if they are required, then click Connect.
5. The Navigator window appears with the list of tables in the sample_mflix database. Select the comments and users collections to add them to Power BI Desktop.

We are now ready to build the report with data from the cloud MongoDB database.

Conclusion

Devart ODBC Driver for MongoDB is a high-performance connectivity tool with enterprise-level features for accessing MongoDB databases from ODBC-compliant reporting, analytics, BI, and ETL tools on 32-bit and 64-bit Windows, macOS, and Linux. It fully supports standard ODBC API functions and data types, enabling easy and secure access to live MongoDB data from anywhere.
How to Connect Tableau to MySQL: A Guide for BI Professionals By
[Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), July 25, 2023

Connect Tableau to MySQL and Revolutionize Your Data Analysis

As businesses become more data-driven, the ability to analyze data quickly and effectively has become essential. Business intelligence (BI) tools such as Tableau have grown increasingly popular because they help companies make informed decisions based on their data. To use these tools effectively, however, you need a reliable and efficient way to move data from databases into the BI tool. This is where MySQL, one of the most popular open-source relational database management systems, comes in. This article explores how to connect Tableau to MySQL and share data seamlessly between the two.

What is Tableau?

Tableau is a powerful business intelligence (BI) tool that enables users to visualize and analyze data from multiple sources. It provides a range of features that make it easy to create interactive dashboards, charts, and graphs. Businesses of all sizes and in various industries use Tableau to gain insights into their operations and make data-driven decisions, for purposes including sales and marketing analysis, financial reporting, and operational analytics.

Advantages of Tableau

Tableau has several advantages that make it a popular choice for data analysts and business professionals:

- Easy-to-use interface: a user-friendly interface lets users create visualizations and explore data without programming skills.
- Integration with multiple data sources: Tableau connects to a wide range of data sources, including Excel, SQL Server, Oracle, and more, letting users bring all their data into one place for analysis.
- Interactive dashboards: users can create interactive dashboards and share them across the organization, making it easy to communicate insights and collaborate on data-driven projects.
- Real-time data: Tableau can connect to real-time data sources, such as streaming data feeds, to provide up-to-date insights into business operations.
- Cloud-based deployment: a cloud-based deployment option gives users access to their data anywhere, anytime.

5 Ways to Connect Tableau to MySQL

There are several ways to connect Tableau to MySQL, each with its own advantages and disadvantages:

1. Linking MySQL and Tableau with the [MySQL ODBC Driver](https://www.devart.com/odbc/mysql/)
2. Cloud connection using Skyvia
3. Integrating Tableau with MySQL via the native connection
4. [Exporting MySQL data to CSV](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) and uploading it to Tableau
5. Using a third-party connector such as TIBCO Spotfire

This article focuses on the first three options, which are the most commonly used. You might also be interested in reading how to connect [Excel, Power BI, Tableau, and Python to Redshift using ODBC Driver](https://blog.devart.com/connect-excel-python-power-bi-tableau-to-redshift.html).

Linking MySQL and Tableau with Devart ODBC Driver

The Devart [ODBC driver](https://www.devart.com/odbc/what-is-an-odbc-driver.html) lets you connect Tableau to MySQL over an ODBC connection. Here's how to set it up:

1. Download and install the Devart ODBC Driver for MySQL.
2. Open Tableau and select Connect to Data.
3. In the Connect pane, select Other Databases (ODBC) and choose the Devart ODBC Driver for MySQL.
4. Enter your MySQL server details and click Connect.
5. You can now select your MySQL database and start creating visualizations.
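If you later script the same connection outside Tableau (for example from Python), server details containing `;`, `{`, `}`, or leading/trailing spaces must be brace-quoted so the ODBC connection string still parses. Here is a minimal sketch of the commonly used quoting rule; treat it as an assumption to verify against your driver's documentation.

```python
def quote_odbc_value(value: str) -> str:
    """Brace-quote a connection-string value when it contains characters
    that would break key=value;key=value parsing; a literal '}' inside a
    braced value is escaped by doubling it."""
    if any(ch in value for ch in ";{}") or value != value.strip():
        return "{" + value.replace("}", "}}") + "}"
    return value

# A password containing a semicolon and a closing brace:
print(quote_odbc_value("p;ss}word"))  # -> {p;ss}}word}
print(quote_odbc_value("plain"))      # -> plain
```

Quoting only when needed keeps simple strings readable while still making awkward passwords safe to embed.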
Cloud Connection Using Skyvia

[Skyvia](https://skyvia.com/) is a cloud-based data integration platform that lets users connect and integrate data from various sources. It offers a range of features for managing data integration tasks, including data migration, synchronization, and backup. One benefit of the Skyvia [MySQL cloud connector](https://skyvia.com/connectors/mysql) is its easy-to-use interface for creating data integration workflows: users can drag and drop data sources and targets and use visual mapping tools to map data between different systems. Here's how to set it up:

1. Sign up for a Skyvia account and create a new integration.
2. In the integration settings, select MySQL as the source and Tableau as the target.
3. Enter your MySQL server details and connect to your MySQL database.
4. Connect to your Tableau account and select the Tableau project and dataset you want to use.
5. Set up the integration schedule and start transferring data.

Integrating Tableau with MySQL via a Native Connection

Tableau Desktop also offers a native connector for MySQL, letting users connect to MySQL databases directly without additional drivers or software; this is often the easiest and most straightforward method. Here's how to use it:

1. Open Tableau and select Connect to Data.
2. In the Connect pane, select MySQL from the list of connectors.
3. In the MySQL connection dialog box, enter the server name or IP address, the port number, and the name of the database you want to connect to.
4. If you want to connect with a specific user account, enter its username and password. Otherwise, leave these fields blank, and Tableau will prompt you for credentials when you connect.
Once you have entered all the necessary information, click Sign In to connect to the MySQL database. Once connected, you can select the tables or views to use in your analysis; Tableau generates a preview of the data that you can explore and manipulate using its visualization tools.

Using the native connector requires no additional software or drivers. However, it is essential to ensure that your MySQL server is configured to allow remote connections and that you have the necessary permissions on the databases and tables needed for your analysis. As for other integrations, you might also like our piece on [how to access HubSpot Data Source from Power BI, Tableau, and Excel Using the ODBC Driver for HubSpot](https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html).

Export MySQL Data to CSV and Upload to Tableau

If you don't want a direct connection between MySQL and Tableau, you can export your MySQL data to CSV format and then upload it to Tableau for analysis and visualization. This is a common integration method. Here's how to do it:

1. Open MySQL Workbench and connect to the MySQL database from which you want to export data.
2. In the Navigator pane, select the table or tables you want to export.
3. Right-click a table and select Export Data from the context menu.
4. In the Export Options dialog box, choose CSV as the format and select where to save the exported file.
5. Choose the appropriate settings for the CSV file, such as whether to include headers or enclose fields in quotes.
6. Click Export to save the file in CSV format.
7. Open Tableau and select Connect to Data.
8. In the Connect pane, select Text File as the connector.
9. Navigate to the location where you saved the CSV file and select it.
10. In the Text File Import Wizard dialog box, choose the appropriate settings, such as the delimiter and encoding.
11. Click OK to import the data into Tableau.

Once the data is imported, you can use Tableau's visualization tools to create visualizations and dashboards based on it. Exporting MySQL data to CSV format and importing it into Tableau is simple and effective, and it is especially useful if you only need a subset of the data in your MySQL database or need to perform additional transformations before visualizing it.

Connecting Tableau to MySQL with JDBC Driver

Finally, you can also connect Tableau to MySQL using a JDBC driver. JDBC drivers allow Java applications to connect to databases such as MySQL and extract data for analysis and visualization, and Tableau has built-in support for JDBC. To connect to MySQL with JDBC, first download and install the MySQL Connector/J driver from the MySQL website. Then follow these steps:

1. Open Tableau and select Connect to Data from the start page.
2. Select MySQL from the list of options.
3. In the Server field, enter the name of the MySQL server you want to connect to.
4. Enter the port number for the MySQL server in the Port field.
5. Enter the name of the database in the Database field.
6. Enter your MySQL login credentials in the Username and Password fields.
7. Click Connect to establish the connection.

Once the connection is established, you can start creating visualizations with your MySQL data.

Advantages of Connecting Tableau to MySQL

Connecting Tableau to MySQL has several advantages for BI professionals.
Here are some of them:

- Access to real-time data: by connecting Tableau to MySQL, BI professionals can access real-time data from their databases, so reports and dashboards show the most up-to-date information about the business.
- Better data visualization: Tableau is a powerful data visualization tool that helps BI professionals create compelling visualizations from their MySQL data, including interactive dashboards that let users explore data more deeply.
- Improved decision-making: with Tableau connected to MySQL, BI professionals can make more informed decisions based on the insights they gain from their data, identifying trends, patterns, and outliers that may not be immediately apparent from raw data.

Conclusion

Tableau is a powerful tool for BI professionals who need to visualize and analyze data from multiple sources. By connecting Tableau Desktop to MySQL, users can gain insights into their business operations and make data-driven decisions. With its easy-to-use interface, integration with multiple data sources, and interactive dashboards, Tableau is an essential tool for businesses of all sizes and industries.

Connecting Tableau to MySQL is essential for BI professionals who want to get the most out of their data. Using one of the methods outlined in this article, you can easily establish a connection between MySQL and Tableau and start creating visualizations that help your business make better decisions. Whether you use a third-party tool such as the [Devart ODBC](https://www.devart.com/odbc/) Driver or Tableau's built-in JDBC support, the process is straightforward and can be completed in just a few steps.
Besides, an ODBC driver can also help you [access Snowflake Data Warehouse easily from Power BI or Tableau](https://blog.devart.com/how-to-access-snowflake-using-odbc-driver-by-devart.html) or [configure a DSN and import data from Asana to Excel, Power BI, or Tableau](https://blog.devart.com/odbc-driver-for-asana-configure-a-dsn-and-import-data-from-asana-to-excel-power-bi-or-tableau-with-ease.html) with ease.

The best way to connect Tableau to MySQL depends on your specific needs and requirements. If you need [Tableau and MySQL integration](https://skyvia.com/data-integration/tableau) on a regular basis and want to automate the process, a cloud-based solution such as Skyvia could be a good option: Skyvia lets you create automated workflows that extract data from MySQL, transform it if needed, and load it into Tableau Server or Tableau Online. If you prefer a direct connection between Tableau and MySQL, the native MySQL connector may be the best choice; with it, you don't need to install any additional software or drivers, and you can access MySQL data directly from within Tableau. Alternatively, a dedicated driver such as the Devart ODBC driver is a good option if you want more control over the connection and access to advanced features such as connection pooling and caching.

Overall, the best way to connect Tableau to your MySQL server depends on your needs and on the nature and complexity of your data. Evaluate each option and choose the one that suits you best. So why wait? Start exploring your MySQL data with Tableau today!
How to Connect Oracle Database to Excel: Import Your Data Easily
By [Max Remskyi](https://blog.devart.com/author/max-remskyi), May 31, 2023

Before we begin, it is essential to understand the system requirements for connecting [Oracle Database](https://www.devart.com/dbforge/oracle/all-about-oracle-database/) to Excel. First, you need access to an Oracle Database and administrative privileges to configure it. Second, you need a compatible version of Excel installed on your computer. Finally, you need the necessary drivers installed to establish a connection between Oracle Database and Excel.

Common Mistakes to Avoid

Connecting Oracle Database to Excel can be a relatively straightforward process, especially with an ODBC driver such as the Devart Oracle ODBC driver. However, some common mistakes can make it harder than it needs to be:

- Not meeting the requirements: one of the most common mistakes is not meeting the requirements for connecting to Oracle Database; for example, the correct version of Oracle Database is not installed, or the necessary privileges have not been granted to the user. Review the requirements and make sure they are met before attempting to connect.
- Incorrect connection string: double-check the connection string and make sure it is correctly formatted.
- Incorrect login credentials: double-check that the username and password are correct and that the user has the necessary privileges to access the database.
- Incorrect ODBC driver settings: when using an ODBC driver such as the Devart Oracle ODBC driver, make sure the driver settings, including the port number, database name, and other connection parameters, are correctly configured.
- Firewall or network issues: finally, firewall or network problems can prevent connections to Oracle Database. Check that any necessary ports are open and that the network is configured correctly.

By avoiding these common mistakes and taking the time to ensure that all requirements are met and settings are correctly configured, connecting Oracle Database to Excel can be a straightforward process.

Connecting Oracle Database to Excel: Best Methods

There are several methods to connect Oracle Database to Excel; the most popular are an ODBC driver, Toad, and a macro. Let's explore each in detail.

How to Connect Excel to Oracle Database Using ODBC

The [Devart ODBC](https://www.devart.com/odbc/) driver is a popular way to connect Oracle Database to Excel. Here's a step-by-step guide:

Step 1: Install the ODBC driver on your computer.
Step 2: Open Excel and go to the Data tab.
Step 3: Click the From Other Sources button and select From Data Connection Wizard.
Step 4: In the Data Connection Wizard, select the ODBC DSN option.
Step 5: Select the ODBC driver from the list of available drivers.
Step 6: Enter the connection details for your Oracle Database, including the hostname, port, username, and password.
Step 7: Test the connection and click Finish to complete the connection setup.
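Since a malformed connection string is one of the common mistakes listed above, a quick sanity check before testing the connection can save a debugging round-trip. The Python sketch below is hypothetical: the required key set is illustrative, and the actual Devart Oracle driver may use different keyword names.

```python
# Hypothetical sanity check for an ODBC-style connection string.
# REQUIRED is an illustrative key set, not the Devart driver's actual one.
REQUIRED = {"driver", "host", "port", "uid", "pwd"}

def missing_keys(conn_str: str) -> set:
    """Return the required keys that are absent from the string."""
    present = {part.split("=", 1)[0].strip().lower()
               for part in conn_str.split(";") if "=" in part}
    return REQUIRED - present

bad = "Driver={Devart ODBC Driver for Oracle};Host=db.example.com;UID=scott"
print(sorted(missing_keys(bad)))  # -> ['port', 'pwd']
```

An empty result means every expected key is present; anything else names exactly what to add before retrying the connection.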
Connecting Oracle to Excel Using Toad

Toad is another popular way to connect Oracle Database to Excel. Here's a step-by-step guide:

Step 1: Install Toad on your computer.
Step 2: Open Toad and go to the Database menu.
Step 3: Select the Connection Manager option and click New Connection.
Step 4: Enter the connection details for your Oracle Database, including the hostname, port, username, and password.
Step 5: Test the connection and save the connection details.
Step 6: Open Excel and go to the Data tab.
Step 7: Click the From Other Sources button and select From SQL Server.
Step 8: Enter the connection details for your Oracle Database and select the table or view you want to import.
Step 9: Click OK to import the data.

Connect Oracle to Excel Through a Macro

Using a macro is a simple way to connect Oracle Database to Excel:

Step 1: Open Excel and press Alt + F11 to open the VBA editor.
Step 2: Click Insert and select Module.
Step 3: Enter the VBA code to connect to your Oracle Database.
Step 4: Run the macro to import the data into Excel.

Connecting Excel to Oracle Without ODBC

Wondering how to get data from an Oracle database into an Excel sheet without using ODBC drivers? One of the easiest methods is to employ ActiveX Data Objects (ADO) and VBA (Visual Basic for Applications) to create a connection to the data source. This approach lets you access the database using a connection string, query data from it, and write the results to the worksheet.

Connecting Excel to an Oracle Database Using ADO and VBA

Worksheet preparation

First, navigate to the File menu, choose Options, and then open the Customize Ribbon menu. Here, enable the Developer tab with a check mark and click OK to proceed.
Next, create the TEmployees table as the destination you want to populate with records from the database. Open Excel and go to Sheet1. If Sheet1 does not exist, right-click any sheet tab at the bottom, select Insert, and then choose Worksheet; you can rename it to Sheet1 (right-click the tab and click Rename).

The next preparatory step is to enter the headers. In Sheet1, enter the header for the first column in cell A1, e.g., Emp_ID, and the header for the second column in cell B1, e.g., Name. You can leave the rows below the headers empty; the macro will populate the data later. As a result, your sheet should look as follows:

| A      | B    |
|--------|------|
| Emp_ID | Name |

Table creation

Now, let's create the table you intend to populate:

1. Select the range containing the headers (in our example, A1:B1). If you want to include empty rows for future data, select, e.g., A1:B20 (the number of rows should match the rows you intend to populate; our script will use 20 of them).
2. Go to the Insert tab on the ribbon and click Table in the Tables section.
3. In the Create Table dialog box, ensure the range matches your selection (e.g., =$A$1:$B$1), check the My table has headers box, and click OK.

Since we will export data from the Employees table, let's name the Excel table TEmployees.

Table naming and formatting

After creating the table, it remains selected and the Table Design tab appears in the ribbon. In the Table Name field under Properties, replace Table1 with TEmployees and press Enter to save the name. After that, your table should include the banded-row formatting that is added by default. Now your table is prepared to be populated with data.

Ensure macro-enabled format

To make sure that the workbook format lets you use VBA, navigate to the File menu and proceed to the Save As option.
Choose the Excel Macro-Enabled Workbook (.xlsm) format, name the file, and hit Save.

Set up the connection to the Oracle database using ADO and VBA

Open the Developer tab and click Visual Basic to start creating an application (or use the shortcut Alt + F11 for fast access). In the VBA editor window, go to Insert and click Module to create a new module for your code.

Add a reference to ADO. In the VBA editor, click Tools and navigate to References. In the list, find and check Microsoft ActiveX Data Objects X.X Library (select the latest version, e.g., 6.1). Click OK to proceed.

Write the code for connecting and retrieving data from the database

Insert the following code into the created module:

```vba
Sub ConnectToOracle()

    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset
    Dim mtxData As Variant

    Worksheets("Sheet1").Activate
    ActiveSheet.ListObjects("TEmployees").DataBodyRange.Value = ""

    Set cn = New ADODB.Connection
    Set rs = New ADODB.Recordset

    cn.Open ( _
        "User ID=hr" & _
        ";Password=oracle_test" & _
        ";Data Source=oscorp" & _
        ";Provider=OraOLEDB.Oracle")

    rs.CursorType = adOpenForwardOnly
    rs.Open ("select employee_id, first_name || ' ' || last_name from hr.employees where employee_id < 110 order by employee_id"), cn

    mtxData = Application.Transpose(rs.GetRows)

    Worksheets("Sheet1").Activate
    'ActiveSheet.Range("a1:b20") = mtxData
    ActiveSheet.ListObjects("TEmployees").DataBodyRange.Resize( _
        UBound(mtxData, 1) - LBound(mtxData, 1) + 1, _
        UBound(mtxData, 2) - LBound(mtxData, 2) + 1) = mtxData

    'Cleanup in the end
    Set rs = Nothing
    Set cn = Nothing

End Sub
```

Note that this code contains several variables for the connection, recordset, and data retrieval, so you will have to adjust them to your environment.
Also, the SQL query in this example retrieves employee_id, first_name, and last_name from the hr.employees table; feel free to replace the table and column names with your actual data. Here's a breakdown of the pieces you will have to adjust.

First, we declare the variables:

cn: the ADO Connection object, used to establish a connection to the Oracle database.
rs: the Recordset object, which holds the data retrieved from the database.
mtxData: a Variant array that stores the data extracted from rs before it is placed into the Excel sheet.

Next, we clear the existing table in Excel to prepare it for the data we are about to retrieve:

```vba
Worksheets("Sheet1").Activate
ActiveSheet.ListObjects("TEmployees").DataBodyRange.Value = ""
```

This clears any existing data in the Excel table named TEmployees, ensuring that fresh data is loaded every time the macro runs.

Next, we establish a connection to the Oracle database:

```vba
Set cn = New ADODB.Connection
Set rs = New ADODB.Recordset

cn.Open ( _
    "User ID=hr" & _
    ";Password=oracle_test" & _
    ";Data Source=oscorp" & _
    ";Provider=OraOLEDB.Oracle")
```

Note that to use OraOLEDB.Oracle, you must have the Oracle Client or Oracle Instant Client with OLE DB support installed on your computer. Here's the breakdown:

Set cn = New ADODB.Connection creates a new database connection.
cn.Open(...) opens the connection using the Oracle OLE DB provider (OraOLEDB.Oracle) with the following details:

User ID = hr (Oracle username)
Password = oracle_test (Oracle password)
Data Source = oscorp (the Oracle TNS alias or service name)
Provider = OraOLEDB.Oracle (Oracle's OLE DB driver for ADO)

Next comes the query execution part:

```vba
rs.CursorType = adOpenForwardOnly
rs.Open ("select employee_id, first_name || ' ' || last_name from hr.employees where employee_id < 110 order by employee_id"), cn
```

rs.CursorType = adOpenForwardOnly defines a forward-only cursor, meaning the recordset can only be traversed once (which is more memory-efficient).

rs.Open(...) executes the SQL query, which:

Retrieves employee_id and the full name (first_name || ' ' || last_name) from the hr.employees table.
Filters records to include only those where employee_id is less than 110.
Sorts the results in ascending order by employee_id.

Finally, we transfer the data to Excel and write it to the table:

```vba
mtxData = Application.Transpose(rs.GetRows)

Worksheets("Sheet1").Activate
'ActiveSheet.Range("a1:b20") = mtxData
ActiveSheet.ListObjects("TEmployees").DataBodyRange.Resize( _
    UBound(mtxData, 1) - LBound(mtxData, 1) + 1, _
    UBound(mtxData, 2) - LBound(mtxData, 2) + 1) = mtxData
```

Here's a breakdown of the values used there, which you can replace with your own:

rs.GetRows fetches all records from the recordset into a two-dimensional array.
Application.Transpose(...) transposes the array so that the data aligns correctly when pasted into the Excel table.

The script also dynamically resizes the TEmployees table and fills it with the retrieved data:

UBound(mtxData, 1) - LBound(mtxData, 1) + 1 determines the number of rows.
UBound(mtxData, 2) - LBound(mtxData, 2) + 1 determines the number of columns.

The .Resize(...) call ensures the table adjusts automatically to fit the data.
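The transpose step is worth pausing on: rs.GetRows returns a fields-by-rows array, so it has to be flipped before it fits the worksheet range. A minimal Python illustration of the same reshaping (the sample data is made up):

```python
# GetRows-style result: one inner list per field (fields x rows)
cols = [
    [100, 101, 102],                                  # employee_id column
    ["Steven King", "Neena Kochhar", "Lex De Haan"],  # full-name column
]

# Equivalent of Application.Transpose: flip to rows x fields
rows = [list(r) for r in zip(*cols)]

# rows[0] is now the first worksheet row: [100, 'Steven King']
```

Without this flip, the first worksheet row would receive an entire column of IDs instead of one ID and one name.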
At the end, release the objects to clean up memory:

```vba
Set rs = Nothing
Set cn = Nothing
```

Run the code

To run the code, return to Excel. In the Developer tab, click Macros, select ConnectToOracle, and click Run. The data from the Oracle database should populate the TEmployees table on Sheet1.

The method described above has undeniable advantages: it establishes a direct database connection, so you fetch data straight from the Oracle database without manual exports, and you get real-time data every time the macro runs. This approach also works well with large data sets, as ADO retrieves only the data requested by the SQL query, reducing unnecessary load.

How to Import Oracle Data to Excel

Once you have established a connection between Oracle Database and Excel, importing data is straightforward. Here's a step-by-step guide to importing data from Oracle Database to Excel using the Devart [Oracle ODBC driver](https://www.devart.com/odbc/oracle/):

Step 1: Open Excel and go to the Data tab.
Step 2: Click the From Other Sources button.
Step 3: Select From Data Connection Wizard.
Step 4: In the Data Connection Wizard, select the ODBC DSN option and click Next.
Step 5: Select the Devart Oracle ODBC driver from the list of available drivers and click Next.
Step 6: Enter the connection details for your Oracle Database, including the hostname, port, username, and password.
Step 7: Test the connection and click Finish to complete the connection setup.
Step 8: In the Import Data dialog box, select the table or view you want to import and click OK.
Step 9: Choose the location where you want to import the data and click OK.

The Devart Oracle ODBC driver is a software driver that enables applications to connect to Oracle databases via the Open Database Connectivity (ODBC) interface.
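Any ODBC-capable tool or language can use such a driver. As an illustration, here is a hedged Python sketch that assembles a DSN-less ODBC connection string from the same details the wizard asks for; the driver name and all credentials are placeholders, and the exact connection-string keywords vary by driver, so check your driver's documentation:

```python
def oracle_odbc_conn_str(driver, host, port, service, user, password):
    """Assemble a DSN-less ODBC connection string from its parts.
    Keyword names follow common ODBC conventions; verify them against
    your driver's documentation before use."""
    return (
        f"DRIVER={{{driver}}};"
        f"Server={host}:{port}/{service};"
        f"UID={user};PWD={password}"
    )

conn_str = oracle_odbc_conn_str(
    "Devart ODBC Driver for Oracle",  # assumed driver name; check the ODBC Administrator
    "oscorp-host", 1521, "oscorp", "hr", "oracle_test",
)

# With a driver actually installed, you could then open a connection:
# import pyodbc
# cn = pyodbc.connect(conn_str)
```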
ODBC is a widely used API for accessing databases, and the Devart Oracle ODBC driver provides a way for any ODBC-capable application to access Oracle databases. The driver offers a high-performance and reliable way to connect to Oracle databases. It supports all major Oracle versions, including Oracle 12c, and provides a wide range of features and options for working with Oracle data. Some of its key features include:

Support for Unicode data;
Support for stored procedures and functions;
Support for large objects (LOBs);
Support for Oracle-specific data types;
Support for Oracle Advanced Security features;
Support for Oracle RAC (Real Application Clusters).

The driver also provides a variety of configuration options, allowing users to fine-tune connection settings to optimize performance and security. It supports connection pooling, which allows multiple applications to share a single connection to the database, improving performance and scalability.

In addition to its features and performance, the Devart Oracle ODBC driver is known for its ease of use and reliability. It provides a simple and intuitive interface for configuring connections and accessing Oracle data, and it is backed by a team of experienced developers and support staff who provide ongoing updates and assistance. You can try it now: [Download ODBC for Oracle](https://www.devart.com/odbc/oracle/download.html).

Troubleshooting common connection issues

Even when you set up everything according to the instructions, you might still encounter errors that prevent you from connecting the Oracle database to Excel or that affect the quality of the connection. Let's review some of them in detail.

Fixing ODBC driver errors

Data source name not found

This error means that the Data Source Name (DSN) specified in your connection configuration cannot be found in the Windows registry.
To fix this error, make sure you have specified the DSN correctly: open the ODBC Data Source Administrator and check that the field is filled in correctly.

ODBC connection failed

When you make changes to your server, such as assigning a new name, IP address, or DSN, the driver can no longer establish the connection because it still uses the details you provided previously. Update the connection details whenever you change them.

Resolving authentication and login issues

Some of the most common issues you may encounter when trying to import data from Oracle to Excel are:

Incorrect credentials: Ensure the username and password are entered correctly, considering case sensitivity.
Expired or locked account: The database administrator may need to unlock the account or reset the password.
Insufficient user privileges: Users must have the necessary permissions, including CREATE SESSION and access to the required tables or views.
Database authentication settings: If authentication modes (TNSNAMES or EZCONNECT) are misconfigured, Oracle may reject login attempts.
Network connectivity issues: Firewalls, VPNs, or incorrect server details can prevent authentication.

Let's explore the TNSNAMES and EZCONNECT configurations in detail.

TNSNAMES connection method

Oracle database administrators typically provide users with two essential configuration files:

TNSNAMES.ORA
SQLNET.ORA

These files need to be saved in a specific folder on the user's system, and environment variables must be configured to ensure proper connectivity.

Configuration steps

Install Oracle ODAC Components in C:\Oracle.
Save the TNSNAMES.ORA and SQLNET.ORA files in C:\Oracle\TNS.
Set environment variables: create a system environment variable named TNS_ADMIN with the value C:\Oracle\TNS, and add the paths C:\Oracle and C:\Oracle\bin to the system Path variable.
Reboot the system to apply the changes.
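For reference, a TNSNAMES.ORA entry defining the oscorp alias used in the VBA example earlier might look like the following sketch; the host name is a placeholder, and the port and service name must match your server:

```
OSCORP =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = oscorp))
  )
```

The alias on the left (OSCORP) is what you supply as the Data Source in connection strings.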
EZCONNECT method

EZCONNECT simplifies Oracle connectivity by eliminating the need for TNSNAMES.ORA; only SQLNET.ORA is required.

Configuration steps:

Install Oracle ODAC Components in C:\Oracle.
Save the SQLNET.ORA file in C:\Oracle\TNS.
Set environment variables: create a system environment variable named TNS_ADMIN with the value C:\Oracle\TNS, and add the paths C:\Oracle and C:\Oracle\bin to the system Path variable.
Reboot the system to apply the changes.

Checking these configurations will ensure smooth connectivity between the Oracle data source and Excel.

Dealing with large dataset performance issues

When working with large datasets in Excel, users may experience slow queries and application freezing. This is often due to the volume of data being processed and inefficient queries. Below are some strategies to improve performance.

Common issues and solutions

| Issue | Cause | Solution |
| --- | --- | --- |
| Slow queries | Large queries take longer to execute and return results. | Optimize SQL queries by selecting only the necessary columns, applying WHERE clauses, and avoiding SELECT *. |
| Excel freezing | Excessive data retrieval causes Excel to become unresponsive. | Limit the data import by filtering before retrieval and setting row limits. |
| Memory limitations | Excel has constraints on handling large datasets efficiently. | Use Power Query or Power BI for better performance. |

Optimization strategies

| Strategy | Description |
| --- | --- |
| Optimize SQL queries | Use efficient SELECT statements, avoid SELECT *, and apply filtering at the source. |
| Use indexed views | Implement indexes on frequently queried tables and use materialized views for pre-aggregated data. |
| Limit data import | Import only essential rows using Excel's data import options and apply filters before loading. |
| Enable query caching | Use Oracle's RESULT_CACHE hint to store frequently used query results. |
| Use Power Query or Power BI | These tools handle large datasets better than standard Excel imports. |
By implementing these techniques, users can significantly reduce query execution time and prevent Excel from freezing when handling large Oracle datasets.

Best practices for managing Oracle data in Excel

Follow the strategies below to keep your Excel reports accurate, efficient, and secure when you import data from Oracle to Excel.

Keeping data up to date with automated refresh

For seamless analysis, ensure your Oracle data in Excel stays current. Use ODBC to refresh data when the workbook opens or at set intervals, or use Power Query for a more flexible approach. For full automation, schedule a VBA script that calls ActiveWorkbook.RefreshAll via Windows Task Scheduler.

Handling data formatting issues after import

Imported Oracle data often suffers from formatting inconsistencies. Dates may display incorrectly when their formats differ between the Excel file and the database. Large numbers can also be converted incorrectly, so set the cell format to Number before import. Another thing to consider is that encoding mismatches can corrupt special characters. We advise using advanced data conversion mechanisms that provide bi-directional mapping, such as those available between Oracle and ODBC data types. Check the [ODBC Driver for Oracle from Devart](https://www.devart.com/odbc/oracle/) to explore its data export options in detail.

Avoiding security risks when connecting to Oracle

Looking for ways to minimize security risks while still accessing Oracle data from Excel? To keep the data transfer secure, implement strong access controls by defining user roles and permissions so that users only access the data they need. It's also good practice to monitor database activity through logging and auditing tools to detect suspicious behavior in real time. Additionally, avoid exposing credentials, and protect .odc and .dsn files by restricting access to authorized users only.
Alternative ways to export Oracle data to Excel

As you can see, there are many ways to set up an Excel-to-Oracle database connection; however, sometimes it is easier to export the data just once, especially for a one-time task. Here are several methods you can use for export tasks that won't be repeated.

Exporting Oracle data as CSV for Excel import

Exporting Oracle tables as CSV is a straightforward way to transfer data into Excel for analysis, even though it lacks the flexibility of ODBC and the other methods. To generate a CSV file, run the following in SQL*Plus or SQLcl:

```sql
SET MARKUP CSV ON
SPOOL output.csv
SELECT * FROM your_table;
SPOOL OFF
```

This saves the query results as a CSV file, which can be opened in Excel.

Using Oracle SQL Developer to export data

Oracle SQL Developer simplifies data exports by providing a built-in interface to generate Excel-compatible files. To export a table or query result:

Open SQL Developer and connect to your database.
Run the desired query or open the table.
Right-click the result grid and select Export.
Choose Excel (.xls/.xlsx) or CSV as the format.
Configure the export options and save the file.

Compared to direct Excel connections, this method provides more control over data selection and formatting while avoiding connection setup complexities.

Automating data exports with PL/SQL scripts

For recurring data exports, PL/SQL procedures can automate the process. A scheduled PL/SQL script can write query results to a CSV file and move it to a designated location:

```sql
DECLARE
  file UTL_FILE.FILE_TYPE;
BEGIN
  -- The first argument must name a location configured for UTL_FILE
  -- (an Oracle directory object in recent Oracle versions)
  file := UTL_FILE.FOPEN('/export_path/', 'output.csv', 'W');
  FOR rec IN (SELECT * FROM your_table) LOOP
    UTL_FILE.PUT_LINE(file, rec.column1 || ',' || rec.column2);
  END LOOP;
  UTL_FILE.FCLOSE(file);
END;
/
```

For automation, use DBMS_SCHEDULER to run the script at scheduled intervals.
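Outside the database, the same rows-to-CSV loop can be sketched in a few lines of Python with the standard csv module (the sample rows are made up to mirror the hr.employees query):

```python
import csv
import io

# Rows as they might come back from the hr.employees query
rows = [(100, "Steven King"), (101, "Neena Kochhar"), (102, "Lex De Haan")]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["EMPLOYEE_ID", "NAME"])  # header row for Excel
writer.writerows(rows)

csv_text = buf.getvalue()
```

Using a CSV library instead of manual string concatenation handles quoting automatically when a value happens to contain a comma, which the PUT_LINE loop above does not.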
To load the data into Excel, SQL*Loader or Power Query can be configured to pull from the exported file automatically.

Connection method comparison

Now that you know how to connect an Oracle database to Excel using different methods, you might feel a bit lost about which one is the best option for you. Check this side-by-side comparison to choose the one that best fits your project.

| Method | Ease of setup | Data refresh | Best for |
| --- | --- | --- | --- |
| ODBC driver | Moderate: requires configuring driver and DSN settings | Manual (can be automated with scripts) | Connecting to external databases, integration with various systems |
| Power Query | Easy: user-friendly interface, no coding required | Automated (can be set to refresh on a schedule) | Data extraction, transformation, and visualization in Excel or Power BI |
| VBA | Moderate: requires knowledge of VBA scripting | Manual (can be automated with additional code) | Custom automation within Excel, small datasets, simple tasks |
| SQL Developer | Moderate: requires installation and database connection setup | Manual (can be automated with scheduled tasks) | Query execution, database management, and SQL-based reporting |

Conclusion

Oracle is a powerful and widely used database management system known for its reliability, scalability, and security. It is used by businesses of all sizes across a variety of industries to manage and analyze large volumes of data, and its feature set and active community make it a popular choice for organizations with complex data processing requirements.

Excel is a widely used spreadsheet program that gives users a flexible and customizable tool for organizing, analyzing, and manipulating data. Its ease of use, built-in features, and third-party add-ins make it a powerful tool for a variety of tasks in both personal and professional settings.

Data transfer between different databases and services is an essential aspect of data management today.
Oracle Database and Excel are two widely used tools, and connecting them can make data transfer easier and more efficient. In this article, we explored different methods to connect Oracle Database to Excel, including using an ODBC driver, Toad, and a macro. We also provided a step-by-step guide to importing data from Oracle Database to Excel using the Devart Oracle ODBC driver. By following these methods, you can import data from Oracle Database to Excel in just a few clicks. Besides, Devart provides a range of ODBC drivers for different databases, in addition to the Devart Oracle ODBC driver. These drivers enable applications to connect to various databases using the ODBC interface, providing a consistent and reliable way to access data. Some of the other ODBC drivers offered by Devart include: Devart MySQL ODBC driver: Enables applications to connect to MySQL databases via the ODBC interface. It supports all major MySQL versions, provides support for Unicode data, and supports a wide range of MySQL-specific features and options. Devart SQL Server ODBC driver: Enables applications to connect to Microsoft SQL Server databases via the ODBC interface. It supports all major SQL Server versions, provides support for Unicode data, and supports a wide range of SQL Server-specific features and options. Devart PostgreSQL ODBC driver: Enables applications to connect to PostgreSQL databases via the ODBC interface. It supports all major PostgreSQL versions, provides support for Unicode data, and supports a wide range of PostgreSQL-specific features and options. Devart SQLite ODBC driver: Enables applications to connect to SQLite databases via the ODBC interface. It supports all major SQLite versions, provides support for Unicode data, and supports a wide range of SQLite-specific features and options. All of these drivers are designed to provide high-performance and reliable connectivity to their respective databases. 
They are easy to install and configure, and provide a range of features and options for working with data. They also provide support for Unicode data, making them ideal for international applications. FAQ Can I connect multiple Excel files to the same Oracle database? Yes, you can connect multiple Excel files to the same Oracle database using ODBC drivers. Each file can have its own ODBC connection to the database, allowing them to independently access or interact with the same data. Simply set up an ODBC data source for Oracle and connect each Excel file to it via the Get Data option in Excel. What is the best way to refresh imported data automatically? The best way to refresh imported data automatically in Excel is by using Power Query with automatic refresh settings, or by configuring an ODBC connection to refresh at specified intervals or when opening the workbook. Alternatively, you can use a VBA macro to automate the refresh process. How do I filter data before importing to Excel? To filter data before importing to Excel, you can use Power Query to set up filters during the import process. When connecting to your Oracle database via ODBC or other sources, use Power Query’s built-in filtering options to specify the conditions for the data you want to import. How to get data from Excel to Oracle? To get data from Excel to Oracle, you can export the data as a CSV file and use Oracle SQL Developer to import it, or set up an ODBC connection and use Excel’s data features to send the data directly. Alternatively, Power Query or a VBA macro can be used to execute SQL queries and insert the data into Oracle. 
By [Max Remskyi](https://blog.devart.com/author/max-remskyi), DAC Team
How to Connect PHP With MySQL

By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans), November 8, 2024

When it comes to building interactive, scalable, and data-driven websites and applications, you may consider combining PHP with MySQL. Integrating PHP as a server-side language with MySQL as a relational database allows for dynamic content generation, user authentication, form handling, and easier data management. In this article, we'll explore several methods to connect PHP with a MySQL database, including the mysqli and PDO (PHP Data Objects) extensions. After establishing the connection, we'll validate it using [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), an all-in-one IDE that helps database developers and administrators manage and optimize their MySQL and MariaDB environments, increasing productivity and simplifying database workflows.

Contents

Connecting PHP to MySQL using mysqli
Connecting PHP to MySQL using PDO (PHP Data Objects)
Validating a PHP connection to MySQL using dbForge Studio for MySQL
Using a legacy mysql_connect method

Connecting PHP to MySQL using mysqli

mysqli is the MySQL Improved extension of PHP, used to access MySQL databases and manipulate the data in them. mysql_connect is a PHP function that was previously used to establish a connection to a MySQL database; it belongs to the original MySQL extension that allowed PHP applications to access MySQL databases. mysqli offers improved functionality, security, and performance over the older extension, which was deprecated as of PHP 5.5.0 and has since been completely removed. Why is mysqli recommended?
Support for prepared statements that enhance security by reducing the risk of malicious code execution. Support for object-oriented and procedural programming styles that make it easier for developers to choose an approach that best suits their coding style. Ability to execute multiple statements in a single query that improves performance in complex scenarios. Support for transactional operations that help maintain data integrity and ensure the successful execution of all the related queries before being committed. Compatibility with the MySQL Native Driver (mysqlnd), which reduces memory usage and improves query execution times. Prerequisites Before we connect to a MySQL database, we need to download [XAMPP](https://sourceforge.net/projects/xampp/) . It is a free and open-source software package to set up a local web server environment on your computer for development and testing. The software will help us connect PHP to MySQL using mysqli because it provides a complete and pre-configured environment that includes: Apache (Web Server) : To run PHP scripts, which process server-side code. MySQL (Database Server) : To store and manage data for your PHP applications. PHP (Scripting Language) : To write and execute server-side code that interacts with MySQL. After XAMPP has been installed, launch it and start the Apache and MySQL servers from the Control Panel. In addition, it includes a web-based tool – [phpMyAdmin](https://www.devart.com/dbforge/mysql/studio/alternative-to-phpmyadmin.html) that allows for easier management of MySQL databases. Note that the required ports must be available; otherwise, errors may occur. If an error arises because the port is in use, you must change it to a free one by adjusting the configuration. To do this, select Config next to the required server: For Apache : Select Apache (httpd.conf) from the Config shortcut menu. In the configuration file that opens, locate the port number, update it to an available port, and save the changes. 
For MySQL: Select my.ini from the Config shortcut menu. In the file that opens, update the port number to a free one and save your changes. When the servers are running, they are highlighted in green.

Note that when using XAMPP to host websites or web applications locally, you must save your PHP files to the htdocs folder. To access these .php files, navigate to this folder, copy the file name, and open it in a web browser at http://localhost/filename. For example, if you have an index.php file in the htdocs folder, you can open it at http://localhost/index.php.

After we have prepared the environment, we can create a connection between PHP and MySQL using the mysqli extension.

Creating a connection

Use the following script to connect to a MySQL server database. To proceed, open a text editor and save the following PHP script, which establishes a connection with a MySQL database, to the htdocs folder with a .php extension:

```php
<?php
$con = @mysqli_connect('localhost', 'root', '', 'test', 3307);

if (mysqli_connect_errno()) {
    echo 'Failed to connect to MySQL: (' . mysqli_connect_errno() . ') ' . mysqli_connect_error();
    exit();
}

echo 'Connected successfully!';
?>
```

The script parameters include:

@mysqli_connect(...) is the function that creates a connection to a MySQL database using the mysqli extension.
localhost is the address of the database server.
root is the username for the specified database connection.
'' is the password for the specified database connection. In the example, it is left empty, which is not recommended for security but can be used for local testing.
test is the name of the database to which the script connects.
3307 is the port number for the MySQL server.
mysqli_connect_errno() is the function that checks for connection errors. If an issue occurs, the function returns an error message with the code, and the script exits to prevent further processing. If no error arises, the script confirms a successful connection by displaying 'Connected successfully!'.

Let us now check whether the connection has been created.
To do this, open the file in a browser at http://localhost:80/ConnectToMySQL.php, where ConnectToMySQL.php is the name of the PHP file that creates the MySQL connection. Since port 80 is the default, specifying it in the URL is unnecessary unless Apache has been reconfigured to use a different port. As you can see, the connection is now successfully in place.

Handling connection errors

For testing purposes, you may need to debug a PHP script for possible errors when connecting to a MySQL database using mysqli. In such cases, you can use the mysqli_connect_error() function, which displays the specific error message MySQL returns. To debug and output the connection error message, run the following script:

```php
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: (" . mysqli_connect_errno() . ") " . mysqli_connect_error();
    exit();
}
```

In this script, mysqli_connect_errno() checks whether there was an error when connecting to the MySQL database. Upon a successful connection, it returns 0. However, if an error has been detected, the function outputs an error message providing a clear explanation of the connection failure, and the script stops executing.

For example, if we save a version of the script in which we intentionally change the database name to test1 and open it in a browser, we'll get an error message showing that the script stops executing because it is trying to connect to a database, test1, that does not exist on the specified MySQL server.

Connecting PHP to MySQL using PDO (PHP Data Objects)

What is PDO (PHP Data Objects)? It is a database access layer in PHP that provides a unified interface for interacting with different databases, including MySQL. Unlike the mysqli extension, which only works with MySQL, PDO supports multiple DBMSs, including PostgreSQL, SQLite, and Microsoft SQL Server.
This makes it popular among developers working on projects where migration between different databases is common. PDO also offers advanced features, such as prepared statements and transaction management. Here are some key advantages of using PDO for database interactions:

- Compatibility with different databases: PDO supports multiple database systems, such as MySQL, PostgreSQL, SQLite, Oracle, etc. For example, if you want to migrate your project from MySQL to PostgreSQL or any other supported database, you only need to change the DSN (Data Source Name) line instead of rewriting the entire code.
- Support for prepared statements: Security: they help reduce the risk of SQL injection attacks. Efficiency: they allow compiling a SQL query once and executing it multiple times with different parameters, which improves performance when running similar queries repeatedly.
- Support for an object-oriented interface: This approach allows for better code organization and enhanced code reuse, making database code cleaner and more maintainable.
- Error handling: PDO offers robust error handling through exception-based error reporting. You can manage database-related errors more efficiently by setting the error handling mode to throw exceptions (PDO::ERRMODE_EXCEPTION). This makes debugging and error tracking easier, as PDO raises a clear error message whenever a failure occurs.
- Support for named parameters in prepared statements: Named parameters improve code readability. They allow variables to be associated with query parameters by name rather than by position, which makes code more understandable, especially when working with large queries.
- Support for transactions: Transactions ensure data integrity when running multiple dependent queries. PDO can commit or roll back all changes depending on whether every query succeeds or fails.
- Flexibility in data retrieval: PDO offers different fetch modes to retrieve data in multiple formats, such as an associative array, an indexed array, an object, or even a custom class. This gives flexibility in handling query results and improves code structure.
- Support for parameter binding by data type: PDO allows you to explicitly bind parameters to specific data types (int, string, etc.) in prepared statements to ensure that the data is processed correctly and cast to the correct type. This helps improve security and avoid potential SQL errors caused by type incompatibility.
- Backward compatibility: PDO is available across many PHP versions, from PHP 5.x to PHP 8.x. This keeps your code functional and portable across different environments without significant changes to the database interaction logic.

Creating a connection using PDO with new PDO()

To create a connection to a MySQL database using PDO, you use the new PDO() constructor, which initializes a PDO object with the necessary database credentials and connection options. Here is a step-by-step example. Open a text editor and save the following script with a .php extension to the htdocs folder:

```php
<?php
$dsn = "mysql:host=localhost;port=3307;dbname=test;charset=utf8mb4"; // charset value assumed
$user = "root";
$pass = "";

$options = [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // Throw exceptions on errors
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,       // Set default fetch mode to associative array
    PDO::ATTR_EMULATE_PREPARES   => false,                  // Use real prepared statements
];

$pdo = new PDO($dsn, $user, $pass, $options);

echo "Connection successful!";
?>
```

The variables in the code define the information needed to connect to a MySQL database. To run this code, you initialize a PDO object with $dsn, $user, $pass, and $options for an actual database connection. The DSN string provides connection details, including:

- host is the MySQL server location.
- port is the port number of the MySQL server.
- dbname is the name of the MySQL database you want to connect to.
- charset is the character encoding.
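Since the DSN determines which database driver PDO uses, migrating to another DBMS mostly means editing this one string. A brief sketch; the pgsql DSN below is a hypothetical example, not part of this tutorial's setup:

```php
<?php
// MySQL DSN used throughout this tutorial
$dsn = "mysql:host=localhost;port=3307;dbname=test;charset=utf8mb4";

// A hypothetical PostgreSQL equivalent -- the rest of the PDO code stays unchanged
// $dsn = "pgsql:host=localhost;port=5432;dbname=test";

$pdo = new PDO($dsn, "root", "", [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
echo "Connected via " . $pdo->getAttribute(PDO::ATTR_DRIVER_NAME);
?>
```

Only the driver prefix and connection attributes in the DSN change; queries written with standard SQL and prepared statements carry over.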
PDO options include:

- PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION ensures that any database error triggers an exception for easier error handling.
- PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC sets the default fetch mode to return results as associative arrays.
- PDO::ATTR_EMULATE_PREPARES => false ensures prepared statements are processed directly by the database server.

When we open this file in a browser, we'll see that the connection has been established successfully.

Handling exceptions

After successfully connecting to the MySQL database using PDO, it is recommended to handle potential errors to ensure your application remains stable. In this case, try-catch blocks may be useful. They help you detect and respond to errors such as connection issues or failed queries. Let us explore how to manage these exceptions in PDO. In the editor, create a .php file with the following code and save it to the htdocs folder. As you may have noticed, we have intentionally changed the database name from test to test1, a non-existent database:

```php
<?php
$dsn = "mysql:host=localhost;port=3307;dbname=test1;charset=utf8mb4"; // charset value assumed
$user = "root";
$pass = "";

$options = [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // Throw exceptions on errors
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,       // Set default fetch mode to associative array
    PDO::ATTR_EMULATE_PREPARES   => false,                  // Use real prepared statements
];

try {
    $pdo = new PDO($dsn, $user, $pass, $options); // Create a new PDO instance
    echo "Connection successful!"; // Connection is successful
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage(); // Handle connection error
}
?>
```

In this code, the try block attempts to create a PDO instance and connect to the MySQL database. If an exception is thrown, the catch block catches the PDOException and outputs the error message using $e->getMessage(), which explains why the connection failed. After opening this .php file in the browser, we'll get the following error message: 'Connection failed: Unknown database 'test1''.
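To illustrate the prepared statements, named parameters, and transactions listed among PDO's advantages, here is a minimal sketch; the users table and its columns are hypothetical and not part of this tutorial's test database:

```php
<?php
$pdo = new PDO("mysql:host=localhost;port=3307;dbname=test;charset=utf8mb4", "root", "", [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

try {
    $pdo->beginTransaction();

    // Named parameters (:name, :email) bind values by name, not by position
    $stmt = $pdo->prepare("INSERT INTO users (name, email) VALUES (:name, :email)");
    $stmt->execute([':name' => 'Alice', ':email' => 'alice@example.com']);
    $stmt->execute([':name' => 'Bob',   ':email' => 'bob@example.com']);

    $pdo->commit(); // Both inserts are applied together...
} catch (PDOException $e) {
    $pdo->rollBack(); // ...or neither is, preserving data integrity
    echo "Transaction failed: " . $e->getMessage();
}
?>
```

Note that the statement is prepared once and executed twice with different values, which is exactly the efficiency benefit described above.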
Validating a PHP connection to MySQL using dbForge Studio for MySQL

We have already established a PHP connection to a MySQL database. It is time to verify that we can work with it. To do so, we'll use an advanced [MySQL IDE](https://www.devart.com/dbforge/mysql/studio/) – dbForge Studio for MySQL, which is available on Windows, Linux, and macOS. The Studio is designed to simplify MySQL and MariaDB database development, administration, and management. It offers multiple built-in tools for database-related operations, such as [query building](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html), [database design](https://www.devart.com/dbforge/mysql/studio/database-designer.html), [data editing](https://docs.devart.com/studio-for-mysql/working-with-data-in-data-editor/editing-data-in-grid.html), [debugging](https://www.devart.com/dbforge/mysql/studio/code-debugger.html), [query profiling](https://www.devart.com/dbforge/mysql/studio/query-profiler.html), automation of routine tasks, and performance analysis. Its user-friendly interface and rich functionality boost productivity through visual [data import/export tools](https://www.devart.com/dbforge/mysql/studio/data-export-import.html), [schema comparison](https://www.devart.com/dbforge/mysql/studio/mysql-database-schema-compare.html), and [backup management](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html). To enhance productivity and ensure consistency across SQL scripts, dbForge Studio for MySQL simplifies SQL development with IntelliSense-style [code completion](https://www.devart.com/dbforge/mysql/studio/sql-coding.html), context-aware prompts, and automated [formatting](https://www.devart.com/dbforge/mysql/studio/sqlmanagement.html) for clean, efficient, and error-free code.
Setting up dbForge Studio for MySQL

You can [download](https://www.devart.com/dbforge/mysql/studio/download.html) dbForge Studio from the Devart website and review the [compatibility requirements](https://docs.devart.com/studio-for-mysql/getting-started/requirements.html) before installing the tool. After downloading the Studio, navigate to the installation folder and run the dbforgemysql.exe file. The default installation folder is C:\Program Files\Devart\dbForge Studio for MySQL. Alternatively, you can install the Studio [from the command line](https://docs.devart.com/studio-for-mysql/getting-started/installing-from-the-command-line.html). In the installation wizard that opens, follow the steps and then launch the Studio. That's it!

Creating a connection with dbForge Studio for MySQL

Now, you can connect to the MySQL server used by your PHP scripts with dbForge Studio for MySQL. To do this, open the Database Connection Properties dialog and follow these steps:

1. Open the Studio.
2. Navigate to the Database menu and select New Connection to open the Database Connection Properties dialog. Alternatively, select New Connection on the Database Explorer toolbar.
3. On the General tab of the dialog that opens, specify the connection details:
- Host: Specify the MySQL server you want to connect to. In our example, it is localhost.
- Port: Specify the port number of the server. In our case, it is 3307.
- User: Enter the username for the specified database connection.
- Database: Specify the name of the database to which you want to connect. The sample database is test. If the tool lets you choose the test database from the list, the connection has been established.
4. Select Test Connection to verify that the connection has been created.
If it is successful, the popup window displays the server version, and the Successfully connected message appears.

5. In the popup window, select Connect. The Database Explorer tree will show the connection and a list of databases available for this connection.

Using the legacy mysql_connect method

In PHP, the mysql_connect function was once used to connect PHP to MySQL databases. However, it was deprecated in PHP 5.5.0 and fully removed in PHP 7.0.0. This function is no longer recommended due to a number of limitations and vulnerabilities compared to more advanced alternatives, such as mysqli and PDO.

Why has mysql_connect been deprecated?

- Deprecation in favor of improved extensions: mysql_connect() was removed in PHP 7.0, so any code that uses this function causes fatal errors in PHP 7 and later versions. Developers who maintain code using this function should switch to more modern extensions, such as mysqli or PDO.
- Lack of support for advanced MySQL features: mysql_connect() does not support prepared statements, which are necessary to prevent SQL injection. In contrast, mysqli and PDO can be used with prepared statements, making them much more secure when dealing with dynamic data from users.
- Single database connection: mysql_connect() opens a new database connection each time it is called unless the connection is persistent (via mysql_pconnect()). This can lead to inefficiencies and increased load, especially in high-traffic applications. Modern extensions, such as mysqli and PDO, allow for more efficient management of connections.
- No object-oriented support: mysql_connect() uses a procedural API, which means it lacks the flexibility and code organization capabilities that object-oriented programming (OOP) provides. In contrast, mysqli and PDO support an object-oriented interface that improves code organization, modularity, and reusability.
- No support for multiple queries: mysql_connect() does not support executing multiple queries in a single statement, which limits its use for batch processing or handling complex logic in one call. This feature is available in mysqli, allowing more flexibility in executing multiple queries at once.
- Limited error handling: Error handling in mysql_connect() is minimal. It relies on functions such as mysql_error() and mysql_errno(), which provide only basic error messages. The newer extensions, mysqli and PDO, offer error handling via exceptions, which is much more reliable and easier to manage.
- Lack of encoding support: Setting the character encoding for a MySQL connection with mysql_connect() requires additional steps, such as running a separate SQL query (SET NAMES), which can lead to inconsistencies or errors. In contrast, mysqli and PDO allow developers to specify the encoding directly when connecting, which makes it easier to work with different languages and encodings.

Limitations of mysql_connect()

- Security risks: One of the most critical limitations of mysql_connect() is the lack of support for prepared statements. Developers must manually escape user input, which increases the risk of SQL injection, one of the most common vulnerabilities in web applications.
- Procedural approach: mysql_connect() is not suitable for scaling and maintenance in large projects. Modern development practices rely on object-oriented programming (OOP), which is supported by mysqli and PDO.
- Issues with persistent connections: mysql_pconnect(), the persistent version of mysql_connect(), has its own challenges, including difficulties with connection pool management, which can lead to resource management issues. Persistent connections often don't behave as expected in modern environments and can cause unpredictable results.
- No transaction support: mysql_connect() does not support database-level transactions, which are used to ensure data integrity for complex operations. Without transactions, it is difficult to guarantee that a set of operations will either fully execute or completely roll back. Modern alternatives, such as mysqli and PDO, provide full support for transactions.
- Lack of support for executing multiple queries: Unlike mysqli_multi_query(), which allows executing multiple SQL queries in a single call, mysql_connect() does not support this feature. This limits performance and efficiency when working with batch queries.
- Compatibility issues: Since mysql_connect() is no longer available in recent versions of PHP, any code using this function is not compatible with PHP 7 and later. Developers have to rewrite older codebases to ensure compatibility.

We have already discussed the reasons why mysql_connect() is not recommended for use. Still, we would like to illustrate them with an example.

Connecting with mysql_connect

Open a text editor and save the script as a .php file in the htdocs folder. This PHP script establishes a basic connection to a MySQL database using the mysql_connect() function. Then, open the created file in the browser. The result would be as follows: the error indicates that the mysql_connect() function is called, but PHP cannot find it because it is no longer available – the function was removed in PHP 7.0.

Migration to mysqli or PDO

To fix the problem, we can use either mysqli or PDO instead of mysql_connect(). Let us rewrite the script and validate it using dbForge Studio for MySQL.
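For reference, here is a sketch of the legacy mysql_connect() script discussed above, assuming the same connection parameters used throughout this tutorial; it is shown only to reproduce the fatal error, since the function no longer exists in PHP 7+:

```php
<?php
// Legacy API, removed in PHP 7.0 -- calling it in PHP 7+ raises
// "Fatal error: Uncaught Error: Call to undefined function mysql_connect()"
$conn = mysql_connect("localhost:3307", "root", "");

if (!$conn) {
    die("Could not connect: " . mysql_error());
}

mysql_select_db("test", $conn);
echo "Connected successfully!";
?>
```

On PHP 5.x this would connect (with deprecation warnings from 5.5 on); on PHP 7+ it fails immediately, which is the behavior the article demonstrates.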
In the Studio, open a SQL document and execute the following queries to create the sample city table in the test database and populate it with data:

```sql
USE test;

-- Create the city table in the test database
CREATE TABLE city (
  city_id smallint(5) UNSIGNED NOT NULL AUTO_INCREMENT,
  city varchar(50) NOT NULL,
  country_id smallint(5) UNSIGNED NOT NULL,
  last_update timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (city_id)
);

-- Populate the city table with data
INSERT INTO city (city_id, city, country_id) VALUES
(8, 'New York', 1),
(9, 'Los Angeles', 1),
(10, 'Tokyo', 2);
```

Running a SELECT query shows that the table has been created and contains three records. Next, we rewrote the script using mysqli, saved it to a .php file, and opened it in the browser:

```php
<?php
// The opening part of this script was lost in extraction; the connection, query,
// and loop below are reconstructed to match the surviving closing lines
$conn = @mysqli_connect("localhost", "root", "", "test", 3307);

if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: (" . mysqli_connect_errno() . ") " . mysqli_connect_error();
    exit();
}

$result = mysqli_query($conn, "SELECT city_id, city, country_id FROM city");

while ($row = mysqli_fetch_assoc($result)) {
    echo $row['city_id'] . " " . $row['city'] . " " . $row['country_id'] . "<br>";
}

mysqli_close($conn); // Close the MySQL connection
?>
```

As you can see, the result is the same.

Conclusion

In this article, we covered the main methods for connecting PHP to a MySQL database – the mysqli and PDO extensions – along with dbForge Studio for MySQL. We explained why these options are preferable to the outdated mysql_connect() function and demonstrated how to use each one with specific examples. Any of them can help you build advanced and efficient web applications. You can try a fully functional free trial of dbForge Studio for MySQL for 30 days for a better PHP development experience.

Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [mysqli](https://blog.devart.com/tag/mysqli) [pdo](https://blog.devart.com/tag/pdo) [php](https://blog.devart.com/tag/php) [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools.
She enjoys making complex tech topics clear and useful.

How to Connect Power BI to Oracle Database: The Definitive Guide
By [Max Remskyi](https://blog.devart.com/author/max-remskyi), April 28, 2023

Analytics systems are essential in today's data-driven world, enabling organizations to collect, analyze,
and gain insights from large volumes of data. [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/) is a popular relational database management system that allows organizations to store, manage, and retrieve data. Power BI is a powerful business intelligence tool that enables users to visualize and analyze data. Integrating Oracle data into Power BI can provide organizations with a comprehensive analytics solution. This article discusses the various options for connecting Oracle and Power BI.

[Listen to the Devart podcast](https://blog.devart.com/wp-content/uploads/2024/07/ElevenLabs_2024-07-03T19_30_30_How-to-Connect-Power-BI-to-Oracle-Database_-The-Definitive-Guide-1.mp3) to learn how to connect Power BI to an Oracle database.

Prerequisites

Specific system requirements must be met to set up the integration between Oracle and Power BI. These include:

- Oracle database: You must have access to an Oracle database that contains the data you want to analyze in Power BI. The database should be running on a supported version of Oracle.
- Power BI: You must have a Power BI account to access the Power BI service. The service requires a supported browser and an internet connection. Additionally, you may need to install the Power BI Desktop application to create reports locally.
- On-premises data gateway: If you plan to use the on-premises data gateway to connect to your Oracle database, you must have a supported operating system, such as Windows Server 2012 R2 or later. The server running the gateway must also have a minimum of 8 GB of RAM and a 64-bit processor.
- Azure Data Factory: If you plan to use Azure Data Factory to connect to your Oracle database, you must have an Azure subscription and access to the Azure data integration runtime.
Additionally, you must ensure that your Oracle database is accessible from Azure.

- Third-party connectors: If you plan to use a third-party connector to connect to your Oracle database, you must ensure that the connector is compatible with your Oracle and Power BI versions.

In addition to these requirements, it is important to ensure that your system meets the minimum hardware and software requirements for Oracle and Power BI. This includes having sufficient disk space, memory, and processing power to handle the data and queries you plan to run. Overall, setting up the integration between Oracle and Power BI requires careful planning and consideration of system requirements to ensure a smooth and successful deployment.

Step-by-Step Guide: Connecting Power BI to an Oracle Database

This article provides three step-by-step guides for setting up a connection between Power BI and Oracle, each covering a different connection method. By following them, you can create a comprehensive and powerful analytics solution to gain insights from your Oracle data in Power BI.

Method 1: Using the ODBC driver for Oracle

Here's a step-by-step guide to setting up a connection between Power BI and Oracle using the [Devart ODBC driver](https://www.devart.com/odbc/oracle/).

Step 1: Download and Install the Devart ODBC Driver
First, download and install the Devart ODBC driver for Oracle from the Devart website. Follow the installation wizard to complete the installation.

Step 2: Open Power BI Desktop
Next, open Power BI Desktop and click "Get Data" in the Home tab.
Step 3: Select the "ODBC" Connector
In the "Get Data" window, select "ODBC" as the connector and click "Connect."

Step 4: Select the Devart ODBC Driver
In the "ODBC" window, select "Devart ODBC Driver for Oracle" from the list of available drivers and click "Connect."

Step 5: Enter Connection Details
In the "ODBC" window, enter the connection details for your Oracle database, including the server name, port number, database name, and authentication details. You can also test the connection by clicking "Test Connection."

Step 6: Choose Data to Import
Once the connection is established, you can choose which data to import from your Oracle database. You can select individual tables or views or write a custom SQL query. When you're ready, click "Load" to import the data into Power BI.

Step 7: Create Reports and Visualizations
With the data imported into Power BI, you can create reports and visualizations to gain insights from your Oracle data. You can use the various tools and features in Power BI to create interactive dashboards, charts, graphs, and more.

That's it! By following these steps, you should be able to set up a connection between Power BI and Oracle using one of the [Devart ODBC drivers](https://www.devart.com/odbc/).

Method 2: Using a Universal Cloud Connector

Skyvia is a cloud-based data integration platform that allows users to integrate, back up, and manage their data across various sources and destinations. It offers a range of features that make it a powerful tool for businesses of all sizes, including:

- User-friendly interface: Skyvia's user-friendly interface makes it easy for users to create and manage data integration workflows without specialized technical skills or programming knowledge.
- Wide range of connectors: Skyvia supports many connectors, including popular cloud applications like Salesforce, Microsoft Dynamics, NetSuite, HubSpot, and more. It also supports on-premises databases like Oracle, SQL Server, MySQL, and PostgreSQL.
- Automated workflows: Skyvia allows users to automate their data integration workflows with scheduling options, real-time synchronization, and data replication features.
- Data backup and recovery: Skyvia offers automated backups and data recovery from various sources, ensuring that your data is always protected and available when needed.
- Secure data integration: Skyvia uses the latest security protocols and encryption technologies to ensure that your data is safe and secure during the integration process.
- Data transformation: Skyvia provides a range of data transformation options that allow users to manipulate, filter, and transform their data to fit their specific needs.
- Monitoring and reporting: Skyvia provides users with comprehensive monitoring and reporting tools, including real-time alerts, error tracking, and detailed logs, enabling users to quickly identify and resolve any issues that arise during the integration process.

Overall, Skyvia is a powerful, easy-to-use data integration platform that provides businesses with a range of features to help them manage their data across various sources and destinations. Here's a step-by-step guide to setting up a connection between Power BI and Oracle using Skyvia.

Step 1: Create a Skyvia Account
First, go to the Skyvia website and create a free account. Once you've created an account, sign in to the Skyvia dashboard.

Step 2: Create a New Connection
In the Skyvia dashboard, click "Connections" in the top menu and click "New Connection."

Step 3: Select "Oracle" as the Source
In the "New Connection" window, select "Oracle" as the source and enter the connection details for your Oracle database, including the server name, port number, database name, and authentication details. Click "Create Connection" to save the connection.
Step 4: Create a New Package
In the Skyvia dashboard, click "Packages" in the top menu and click "New Package."

Step 5: Select "Oracle" as the Source and "Power BI" as the Destination
In the "New Package" window, select "Oracle" as the source and "Power BI" as the destination. Select the source and destination connections you created earlier and click "Create Package" to save the package.

Step 6: Map Source and Destination Data
In the "Package Details" window, you can map the source and destination data by selecting the tables or views you want to transfer and configuring the field mappings. You can also add transformations or filters to the data.

Step 7: Run the Package
Once the package is configured, click "Run" to transfer the data from Oracle to Power BI. You can monitor the progress of the transfer in the "Execution Log" tab.

Step 8: Create Reports and Visualizations
With the data imported into Power BI, you can create reports and visualizations to gain insights from your Oracle data. You can use Power BI's various tools and features to create interactive dashboards, charts, graphs, and more.

Method 3: Using the Power BI Connector

The Power BI Connector method establishes a direct connection between Power BI and Oracle databases using Power BI's built-in connector. This method is relatively straightforward and requires no additional tools or software. Here are the steps to connect Power BI to Oracle using the [Power BI Cloud Connector](https://skyvia.com/data-integration/powerbi):

Step 1: Open Power BI
Launch Power BI Desktop and click "Get Data" on the home ribbon.

Step 2: Select Oracle
In the "Get Data" window, select "Oracle" from the list of available connectors and click "Connect."

Step 3: Enter Connection Details
In the "Oracle database" window, enter the connection details for your Oracle database, including the server name, port number, database name, and authentication details.
Step 4: Select Tables or Views
Once you have established the connection, you can select the tables or views you want to import into Power BI. You can also apply filters or transformations to the data at this stage.

Step 5: Load Data into Power BI
After selecting the data, click "Load" to import the data into Power BI. You can then start creating reports, dashboards, and visualizations using the imported data.

One of the benefits of using the Power BI Connector is that it provides a direct, real-time connection to your Oracle database: any changes made to the database are reflected immediately in Power BI. Additionally, this method is easy to set up and does not require any additional software or tools, making it a convenient option for users already familiar with Power BI.

Best Practices: How to Load Oracle Data into Power BI

To load data from Oracle into Power BI using the ODBC driver, follow these steps:

Step 1: Install the ODBC Driver for Oracle
Install the ODBC driver for Oracle on your computer. You can download the driver from the Oracle website or use a third-party driver.

Step 2: Configure the ODBC Data Source
Configure the ODBC data source for your Oracle database. Open the ODBC Data Source Administrator on your computer and create a new data source for your Oracle database. Enter the necessary connection details, such as the server name, port number, database name, and authentication information.

Step 3: Open Power BI
Open Power BI Desktop and click "Get Data" on the home ribbon.

Step 4: Select the ODBC Connector
In the "Get Data" window, select "ODBC" from the list of available connectors and click "Connect."

Step 5: Select the ODBC Data Source
In the "ODBC" window, select the ODBC data source you configured in Step 2 and click "Connect."

Step 6: Select Tables or Views
Once you have established the connection, you can select the tables or views you want to import into Power BI. You can also apply filters or transformations to the data at this stage.
Step 7: Load Data into Power BI
After selecting the data, click "Load" to import the data into Power BI. You can then start creating reports, dashboards, and visualizations using the imported data.

One of the benefits of using the ODBC driver to connect to Oracle is that it provides a flexible and customizable way to load data into Power BI. Users can configure the ODBC data source to suit their specific requirements and apply filters or transformations to the data during the import process. Additionally, the ODBC driver is a widely used and well-supported standard, making it a reliable choice for users who need to connect to various data sources.

Conclusion

In this article, we discussed three methods to connect Power BI to Oracle databases. The first method was to use the Devart ODBC driver, which requires installing the driver and configuring the ODBC data source. You can quickly try it yourself and [download the driver for free](https://www.devart.com/odbc/oracle/download.html). The second method was to use Skyvia, a cloud-based integration platform that allows easy data integration between Oracle and Power BI. The third method was to use Power BI's built-in connector to establish a direct connection to Oracle. All three methods are effective ways to create an [Oracle connection](https://skyvia.com/connectors/oracle) to Power BI. The choice of method depends on the user's specific requirements, technical expertise, and resources. Users who require a high degree of flexibility and customization may prefer the ODBC driver, while those who prefer a cloud-based solution may opt for Skyvia. Users already familiar with Power BI may find the built-in connector the most convenient option. Overall, we highlighted the importance of data integration and analytics in today's business environment and the need for practical tools and platforms to facilitate this process.
By connecting Oracle databases to Power BI, users can gain valuable insights into their data, improve decision-making, and drive business growth.

Connect Power BI to SQL Server: 3 Easy Ways
By [Max Remskyi](https://blog.devart.com/author/max-remskyi), April 28, 2023

How to Connect Power BI to SQL Server
In today’s data-driven business world, analytical systems are becoming increasingly important for organizations to make informed decisions. Analytical systems help organizations collect, store, process, and analyze data, which can provide insights into the performance of the organization and its customers. By analyzing data, organizations can identify patterns, trends, and anomalies that would be impossible to detect by manual analysis.

SQL Server is one of the world’s most widely used relational database management systems. It is used by businesses of all sizes and across all industries to store and manage data. SQL Server provides a range of features and tools that enable businesses to store and manage large volumes of data and perform complex data analytics.

Why Use Analytical Systems
There are several reasons why it is important to use analytical systems, such as Power BI, and how integration with SQL Server or Azure SQL Database can help businesses. Here are a few reasons:

Improved Decision-Making
Analytical systems, such as Power BI, help businesses to make better-informed decisions. By analyzing data from multiple sources, businesses can gain insights into their operations and identify areas for improvement. For example, a retail business can use Power BI to analyze sales data and identify which products are selling well and which are not. This information can help the business to make better decisions about inventory management, pricing, and marketing.
Increased Efficiency
Analytical systems can help businesses to improve their efficiency by automating data collection and analysis. Using tools like Power BI, businesses can collect data from multiple sources, clean and transform it, and perform complex analysis in a fraction of the time it would take to do it manually. This can help businesses to identify problems and opportunities more quickly and respond to them faster.

Competitive Advantage
Analytical systems allow businesses to gain a competitive advantage by identifying trends and opportunities before their competitors. For example, a financial services firm can use Power BI to analyze market trends and identify investment opportunities. By acting on this information before their competitors, the firm can gain an edge in the market.

Improved Customer Experience
Analytical systems can help businesses understand their customers better and provide a more personalized experience. By analyzing customer data, businesses can identify trends in customer behavior and preferences and tailor their products and services to meet their needs. For example, a healthcare provider can use Power BI to analyze patient data and identify patterns in health outcomes. This information can help the provider to develop personalized treatment plans for patients and improve overall health outcomes.

About Power BI
Power BI is a business analytics solution that enables organizations to visualize and analyze their data. It provides users with a range of tools to create interactive dashboards, reports, and visualizations, which can be used to gain insights into their data. Power BI is widely used in business to facilitate data-driven decision-making processes. One of the primary data sources that Power BI users connect to is SQL Server, a relational database management system widely used in organizations to store, manage, and retrieve data.
Connecting Power BI to SQL Server enables users to create data visualizations and reports based on their SQL Server data. In this article, we will explore different methods to connect Power BI to SQL Server. Some of the key advantages of Power BI include:

- Ease of Use: Power BI has a user-friendly interface that makes creating data visualizations and reports easy. Users can create interactive dashboards with just a few clicks without requiring extensive programming or data analysis knowledge.
- Integration With Multiple Data Sources: Power BI can integrate with a wide range of data sources, including Microsoft SQL Server, Excel spreadsheets, and cloud-based services like Azure and Google Analytics. This allows businesses to consolidate data from different sources and gain a holistic view of their data.
- Real-Time Data Analysis: Users can connect to live data sources and perform real-time data analysis. This allows businesses to monitor data and make decisions in real time, which can be critical in fast-paced environments.
- Interactive Dashboards: Users can create interactive dashboards customized to meet their needs. Users can drill down into data, apply filters, and explore data in different ways to gain insights into their business.
- Mobile Access: Power BI offers mobile applications for iOS, Android, and Windows devices, which allow users to access data and reports on the go. This feature is particularly useful for businesses with remote workers or field teams.
- Collaboration: Multiple users can collaborate on the same report or dashboard. They can share data and insights with colleagues, leading to better decision-making and improved business outcomes.
- Security: Power BI offers robust security features, including role-based access control, encryption, and data loss prevention. This ensures that sensitive data is protected and that only authorized users can access it.
How to Connect Power BI to SQL Server Using ODBC
Devart ODBC drivers are high-performance drivers designed to connect various databases (including SQL Server, MySQL, Oracle, PostgreSQL, SQLite, and others) to ODBC-compliant applications. These drivers provide a standard way for applications to communicate with different database systems, regardless of the underlying database technology. ODBC drivers are built upon a high-performance architecture that delivers fast and reliable data access to databases. They are designed to optimize the use of the database’s native protocol, which ensures that the drivers work seamlessly with the database and provide optimal performance. Also, check the info about [SQL database tools with support for SSMS 19](https://blog.devart.com/get-the-new-update-of-sql-tools-with-support-for-ssms-19.html).

Some of the key features of Devart ODBC drivers include:

- High Performance: Devart ODBC drivers are optimized to leverage the native database protocol and deliver fast and reliable access to the database.
- Compatibility: The drivers are compatible with ODBC-compliant applications. This makes integrating them into existing applications easy without modifying the application code.
- Cross-Platform Availability: The supported platforms include Windows, Linux, and macOS. This makes it easy to deploy the ODBC drivers in different environments and use them with different applications.
- Secure Connectivity: The drivers support SSL/TLS encryption and provide secure connectivity to databases. This ensures that data transmitted between an application and a database is protected against unauthorized access.
- Ease of Installation and Use: You get started with a user-friendly installer that guides you through the installation process and a simple configuration wizard that makes it easy to configure the drivers to work with your database.

One of the ways to connect Power BI to Microsoft SQL Server is by using the corresponding Devart ODBC driver.
Follow the steps below to connect using the [Power BI data connector](https://www.devart.com/odbc/powerbi/):

1. Install the Devart ODBC driver on the computer where Power BI is installed. To do this, [download the ODBC driver for SQL Server](https://www.devart.com/odbc/sqlserver/download.html) from the Devart website and follow the installation instructions.
2. Open Power BI on your computer.
3. Click the Get Data button on the Home tab of the Power BI ribbon.
4. In the Get Data window, select ODBC and click Connect.
5. In the ODBC window, select the Devart ODBC driver from the list of available drivers.
6. Enter the connection details for your Microsoft SQL Server database, such as the server name, database name, and authentication details.
7. Click the Connect button to connect to your SQL Server database.
8. Select the data you want to use in your report or visualization from the tables and views in the database.

Using Devart ODBC Driver and Python Connector for SQL Server
Whether you’re a seasoned developer or just getting started with Python and SQL Server, this tutorial will help you harness the power of Devart tools for efficient database connectivity. Let’s dive into the details of integrating these components and enhancing your Python-based SQL Server interactions.

1. Install Devart ODBC Driver
Make sure you have the ODBC driver installed on your system. Follow the installation instructions provided by Devart.

2. Install Devart Python Connector for SQL Server
Similarly, install the [Devart Python Connector](https://www.devart.com/python/sqlserver/download.html) for SQL Server by following the instructions.

3. Import the Required Modules
In your Python script, start with importing the modules you need. This includes the ODBC module for Python and the Devart Python Connector module.

```python
import pyodbc
import dbforge_sql_server
```

4. Establish an ODBC Connection
Use the ODBC driver to establish a connection to your SQL Server. Replace your_odbc_dsn with your actual ODBC Data Source Name.
```python
connection_string = 'DSN=your_odbc_dsn;'
connection = pyodbc.connect(connection_string)
```

5. Create a Cursor
Create a cursor object to execute SQL queries.

```python
cursor = connection.cursor()
```

6. Execute SQL Queries
You can now use the cursor to execute SQL queries on your SQL Server database.

```python
cursor.execute('SELECT * FROM your_table')
rows = cursor.fetchall()

for row in rows:
    print(row)
```

7. Close the Connection
Remember to close the connection when you are done.

```python
cursor.close()
connection.close()
```

Make sure to replace placeholder values such as your_odbc_dsn and your_table with your actual ODBC Data Source Name and table name. Additionally, refer to the documentation provided by Devart for any specific configurations or features offered by Python Connector for SQL Server. The documentation will provide detailed information on using this connector in various scenarios.

Export and Import Data From SQL Server via a GUI Tool
[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is a powerful GUI tool for managing and developing Microsoft SQL Server databases. Much like the well-known SQL Server Management Studio, it is designed to help database developers and administrators perform a wide range of tasks, from creating and editing database objects to writing complex queries and scripts. The Studio provides a user-friendly interface that makes it easy to perform database management tasks, even for users with little or no experience in SQL Server.

Some of the key features of dbForge Studio for SQL Server include:

- Object Explorer: It provides a hierarchical view of all database objects and makes it easy to navigate through the database and quickly find the object you need.
- Visual Query Builder: It helps you create complex SQL queries on visual diagrams without having to write any code.
- SQL Editor: A powerful tool for writing and executing SQL queries with advanced features like code auto-completion, formatting, and syntax validation to help you write error-free code.
- Data Editor: A handy tool that allows you to edit data directly in the database. It has an easy-to-use interface that makes it simple to update, insert, and delete data.
- Data Export and Import: Handy wizards help you export and import data between databases using 14 supported formats, which include TXT, HTML, XLS, XLSX, CSV, MDB, RTF, PDF, JSON, SQL, Google Sheets, and more.
- Database Diagrams: They allow you to create visual representations of your database schemas, making it easy to understand the relationships between tables and the structures of your databases.

You can also read [how to connect to the HubSpot database and retrieve its data using Tableau, Excel, and Power BI, as well as the ODBC driver for HubSpot](https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html).

One of the most essential features of dbForge Studio for SQL Server is the ability to export and import data from SQL Server. Here’s how to use it to export and import data from SQL Server to Power BI:

1. Install dbForge Studio for SQL Server on the computer where SQL Server is installed. To do this, download the tool from the Devart website and follow the installation instructions.
2. Open dbForge Studio for SQL Server on your computer.
3. Connect to your SQL Server database by entering the connection details, such as the server name, database name, and authentication details.
4. To export data from SQL Server, right-click the table or view you want to export and select Export Data from the context menu.
5. Specify the export settings using a feature-rich wizard and click Export.
6. To import the exported data to Power BI, click on the Get Data button in Power BI and select the file format of the exported data.
Power BI will then prompt you to select the exported file and import the data into your report or visualization.

Conclusion
In conclusion, analytical systems are essential for businesses that want to make informed decisions, improve efficiency, gain a competitive advantage, and provide a better customer experience. Integration with SQL Server can help store and manage large volumes of data and perform complex data analysis using tools like Power BI. By leveraging the power of analytical systems and SQL Server integration, businesses can gain valuable insights into their operations and drive success. You can also check the [article](https://blog.devart.com/here-comes-a-new-update-of-dbforge-tools-for-sql-server.html) about new functions and operators that were introduced in SQL Server 2022 or about Devart’s [dbForge tools for SQL Server](https://blog.devart.com/dbforge-tools-for-sql-server-get-even-more-powerful-with-a-new-update.html).

The choice of connection between Power BI and SQL Server ultimately depends on your specific needs and preferences. If real-time data is crucial, DirectQuery might be your go-to. For large datasets with less frequent updates, the Import mode could be more fitting. Live Connection, on the other hand, strikes a balance, allowing you to harness Power BI’s features without the need to import all the data. Consider your use case carefully to make an informed decision.

Connecting Power BI to SQL Server or Azure SQL Database is important for creating data-driven reports and visualizations in Power BI. This article explored different methods to connect Power BI to SQL Server, including using the Devart ODBC driver, Python scripts, and dbForge Studio for SQL Server. By following the steps outlined in this article, you can easily connect Power BI to SQL Server and start creating compelling visualizations and reports based on your SQL Server data.
How to Connect to Oracle
Database From Python With Devart Python Connector
By [Dereck Mushingairi](https://blog.devart.com/author/dereckm), January 13, 2025

Connecting Python projects to Oracle databases is an essential step for many workflows, but the process can feel unnecessarily tedious. Dependency-heavy setups, complex setup files like tnsnames.ora, and platform compatibility issues can make even straightforward projects frustratingly complex. To keep workflows running well, developers must discover a reliable and effective approach to connect Python to Oracle. In this guide, we’ll explore methods for making Python to Oracle connections that minimize these challenges. Along the way, you’ll also learn how modern tools, including [Devart Python Connector for Oracle](https://www.devart.com/python/oracle/), can simplify the process. By using such tools and techniques, you can ensure reliable, low-latency integration for your Python-powered projects. Let’s dive in!

Table of contents
- How does the Devart Python Connector compare with other connectors
- Installing Devart Python Connector for Oracle
- Setting up your Oracle Database for Python connection
- Establishing a connection in Python
- Benefits of using Devart Python Connector for Oracle
- When do you need a Python-Oracle integration?

How does the Devart Python Connector compare with other connectors
Before attempting to have Python connect to Oracle DB, developers and businesses need to know they have a reliable tool. The table below shows how the Devart Python Connector compares to other options.

| Feature | Devart Python Connector | python-oracledb (formerly cx_Oracle) | SQLAlchemy with Oracle Dialect | pyodbc with Oracle ODBC Driver |
|---|---|---|---|---|
| Installation | Single pip install, no dependencies. | Requires Oracle Instant Client, manual setup. | Moderate; needs SQLAlchemy and cx_Oracle setup. | Complex; ODBC driver and DSN setup required. |
| Client libraries | Not required; reduces version conflicts. | Required; can lead to compatibility issues. | Required; inherits cx_Oracle’s dependencies. | Required; depends on ODBC driver. |
| Performance | Optimized for bulk queries and low latency. | High performance; impacted by library versions. | Moderate; ORM overhead slows complex queries. | Variable; depends on ODBC driver quality. |
| Secure connection | Built-in SSL/SSH tunneling for enterprise use. | SSL/TLS supported; manual setup needed. | Security features depend on driver implementation. | Security depends on ODBC driver setup. |
| Cross-platform support | Supports Windows, Linux, and macOS (32/64-bit). | Supports major platforms but requires client installation per OS. | Cross-platform; mirrors cx_Oracle support. | Cross-platform; depends on ODBC driver availability. |
| Integration with ecosystem | Plug-and-play compatibility with Pandas, Matplotlib, and SQLAlchemy. | Compatible with Pandas but requires additional steps. | Native ORM workflows but lacks control for direct tasks. | Manual integration for most Python libraries. |
| Support & documentation | Comprehensive guides, dedicated support, and timely updates. | Relies on community support and official documentation. | Community-driven documentation for SQLAlchemy users. | Sparse, community-driven documentation. |
| Use case flexibility | Ideal for data scientists, ML engineers, and developers working on real-time analytics, AI workflows, and enterprise applications. | Versatile but suited for Oracle-heavy use cases. | Best for ORM-specific workflows; lacks direct control. | Generic; suited for applications requiring ODBC. |

Installing Devart Python Connector for Oracle
To get started with the Devart Python Connector for Oracle, head over to the [official Devart website](https://www.devart.com/python/oracle/). There, you’ll find everything you need: product details, documentation, and a quick link to the [download page](https://www.devart.com/python/oracle/download.html).
Devart’s Python Oracle connector installation process is fast and hassle-free.

Prerequisites
Before you let Python connect to Oracle databases, check that your system meets these requirements:

- Python version: Python 3.6 or later.
- Oracle database access: Credentials include host, port, service name, username, and password.
- System compatibility: Supported operating systems include Windows, macOS, and Linux.
- Pip package manager: Confirm pip is installed by running pip --version.

Installation steps
Once you’ve met all the requirements, installing the Devart Python Connector is simple. Just follow these instructions to get it set up on your system.

Step 1: Open your terminal or command prompt
Ensure that Python and pip are accessible from your terminal.

Step 2: Run the installation command
Use pip to install the Devart Python Connector.

```shell
pip install devart-python-connector
```

Step 3: Verify the installation
Confirm that the connector is installed by running the command shown below.

```shell
pip show devart-python-connector
```

This shows package details like the version and installation path.

Step 4: Test the installation
Import the connector in a Python script to ensure it’s working correctly.

```python
import devart
print("Devart Python Connector installed successfully!")
```

System requirements
Keeping your system compatible is essential for reliable performance. Here’s a quick overview of the supported platforms and database versions:

- Operating systems: Consider using either Windows, macOS, or Linux.
- Oracle versions supported: Use Oracle Database 11g or later versions.

With the Devart Python Connector installed and your system ready, you can start working with Oracle databases in Python. But first, ensure your Oracle database is properly configured for smooth integration. Let’s move on to setting up your database for connectivity.

Setting up your Oracle Database for Python connection
To efficiently connect Python scripts to an Oracle database, you need the right tools and setup.
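The pip show check in Step 3 can also be scripted, so an application can confirm the connector is importable before it tries to use it. Below is a minimal sketch using only the standard library; the devart module name comes from the import example above, and the "json" module is used only to show the positive case.

```python
# Sketch: programmatically check that a module is importable before use.
from importlib.util import find_spec

def is_installed(module_name: str) -> bool:
    """Return True if the module can be found on the current Python path."""
    return find_spec(module_name) is not None

print(is_installed("json"))  # stdlib module, always present
if not is_installed("devart"):
    print("Connector not found; run: pip install devart-python-connector")
```

A check like this gives a clearer error message at startup than an ImportError deep inside the application.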
Thus, before you connect to Oracle database using Python, ensure that you have ticked all the boxes on your checklist. Proper preparation eliminates common connectivity issues and ensures an easy Python Oracle database integration.

Key preparations for Oracle database
Preparing your Oracle database involves verifying accessibility, setting permissions, and gathering connection details. Now, let’s examine these processes closely.

Step 1: Check network access
As a general rule, you should check that the Oracle database is up, running, and listening on port 1521. You should also see if any security policies or firewalls are blocking the Oracle Python connection.

Step 2: Set up user roles and permissions
In order to connect to the database, the user you’re using must have the appropriate rights. This includes privileges such as SELECT, INSERT, and, for stored procedures, EXECUTE.

Step 3: Gather the required connection details
To connect Python to Oracle, you’ll need details including the host (IP address or hostname) and port (often 1521). You’ll need the service name or SID to identify the database instance. These details can be found in the database’s setup files or provided by the administrator.

Direct connection setup
To skip using the tnsnames.ora file, specify the connection details directly in your Python script.

```python
connection_string = "user/password@host:port/service_name"
```

This approach simplifies the setup process, making it easier to integrate Python with Oracle databases.

Suggestions for a smooth setup
The reliability and safety of a connection depend on its setup. To keep your setup secure and error-free, follow these steps:

- Safe login information: Protect sensitive details like database credentials by storing them in environment variables. For example, use export ORACLE_DB_PASS=your_password on Linux/macOS or set it up in Windows Environment Variables.
- Verify the details of the connection: Use tools like Oracle SQL Developer to test the database connection. This ensures your host, port, and service name/SID are accurate.
- Simplify connection strings: Keep connection strings simple to reduce the chances of setup errors.

Final checklist
Before you let Python connect to Oracle, confirm the following:

- Confirm your Oracle database is up and running.
- Verify user privileges match the required operations.
- Double-check the accuracy of host, port, and service name details.

By ensuring these steps are completed, you’ll set the foundation for a smooth [Oracle db connection in Python](https://www.devart.com/odbc/oracle/integrations/oracle-python-connection.html). In the next section, we’ll cover how to connect Python to Oracle database using the Devart Python Connector.

Establishing a connection in Python
[Connecting to Oracle database in Python](https://www.devart.com/python/oracle/) is really simple with the Devart connector. Nevertheless, to handle problems gracefully, you should follow some best practices. These methods help you guarantee a consistent connection for your operations.

Key steps to create a connection
Here are the steps to connect Python to Oracle.

Step 1: Import the required library
Before you connect to Oracle db using Python, you need to install the Devart Python Connector. Use the command pip install devart-python-connector to install it. Once installed, import the connector into your script.

Step 2: Define the Python Oracle db connection parameters
Prepare these details:

- Host: This is the Oracle database server address.
- Port: It’s usually 1521 unless set up in a different way.
- Service name: The name of the database’s service.
- User: Your Oracle username.
- Password: Your Oracle password.

Step 3: Establish the connection
Use the connect() function from the Devart library to establish a connection.

Step 4: Implement error handling
Handle database-specific errors to keep your application resilient.
Log any errors to make debugging easier.

Example code for connection
Here is the Python code to connect to Oracle database:

```python
# Import the Devart Python Connector library
import devart

# Define connection parameters
connection_params = {
    "host": "your_host",  # Replace with your Oracle database host
    "port": 1521,         # Default Oracle database port
    "service_name": "your_service_name",  # Replace with your service name
    "user": "your_username",              # Replace with your username
    "password": "your_password"           # Replace with your password
}

# Establish the connection
try:
    connection = devart.connect(**connection_params)
    print("Connection to Oracle database established successfully!")
except devart.DatabaseError as e:
    print(f"Error connecting to the database: {e}")
finally:
    # Close the connection when done
    if 'connection' in locals() and connection:
        connection.close()
        print("Connection closed.")
```

See for yourself how simple Oracle integration can be. Test the Devart Python Connector with a free trial today! [Get Started!](https://www.devart.com/python/oracle/download.html)

Best practices for error handling
Error handling is an important part of building a resilient application. Follow these practices to resolve common issues effectively:

- Catch specific exceptions: Use devart.DatabaseError to handle Oracle-specific errors smoothly and keep your application running reliably.
- Validate parameters: Double-check the host, port, and service name details before running the script to avoid connection issues.
- Log issues: Keep a log of database connection errors, especially in production, to make troubleshooting faster and easier.
- Use the finally block: Always close the connection when you’re done to free up resources and prevent leaks.

Troubleshooting common issues
When issues arise, a systematic approach can help resolve them quickly.
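One systematic remedy for transient failures (network blips, a listener that is still starting) is to retry the connection with exponential backoff. The sketch below makes some assumptions: connect_fn stands in for devart.connect or any other connect call, and in real use exc_type would be the driver's error class, such as devart.DatabaseError; the flaky_connect demo function is purely illustrative.

```python
import time

def connect_with_retry(connect_fn, retries=3, base_delay=0.01, exc_type=Exception):
    """Call connect_fn, retrying on exc_type with exponential backoff."""
    for attempt in range(retries):
        try:
            return connect_fn()
        except exc_type:
            if attempt == retries - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# Demo with a stand-in connect function that fails twice, then succeeds.
attempts = {"n": 0}
def flaky_connect():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("listener not ready")
    return "connection-object"

result = connect_with_retry(flaky_connect, exc_type=ConnectionError)
print(result, attempts["n"])  # connection-object 3
```

Keeping retries bounded and re-raising the last error preserves the diagnostics you need for the checklist that follows.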
Address these common problems to maintain a smooth connection process:

- Invalid credentials: Ensure the username and password are correct and authorized for the Oracle database.
- Network connectivity problems: Verify that the Oracle host is reachable and the port is open.
- Incorrect service name: Cross-check the service name with your database setup or consult the DBA.
- Firewall restrictions: Confirm that your machine’s firewall permits outgoing traffic on the Oracle port.

By following these steps and best practices, you can connect to an Oracle database using Python. In the next section, we’ll explore why you should connect to Oracle using Python.

Benefits of using Devart Python Connector for Oracle
The Devart Python Connector simplifies connecting Python applications to Oracle databases. It offers speed, security, and flexibility. Here are the top benefits:

- Simplified integration: Forget about Oracle client libraries and complicated setups. With direct TCP/IP connectivity, getting started is quick and hassle-free.
- Faster performance: Enjoy low-latency queries and transactions with a streamlined, error-reducing architecture.
- Robust security: Secure your data with SSL encryption and SSH tunneling, meeting Oracle’s latest standards.
- Risk-free trial: Not ready to commit? Explore the connector’s features with a free trial and test its capabilities before deciding.

When do you need a Python-Oracle integration?
Here are key scenarios where this powerful combination proves invaluable:

- Real-time data analytics: With a Python connection to Oracle database, analyzing large datasets becomes much easier. Libraries like Pandas and Matplotlib let you create dashboards and monitoring tools that help you make quicker, smarter decisions.
- Automating database workflows: Repetitive tasks like syncing data or generating reports can drain your time and energy. A Python Oracle connection automates these workflows, reducing mistakes and freeing up time for more meaningful work.
- Scalable application development: Oracle databases are built for scalability, and Python adds flexibility to backend development. A Python-Oracle database connection makes it easier to build high-performance, data-driven applications in industries like finance and e-commerce.
- Optimizing machine learning pipelines: AI projects require secure and scalable storage. Oracle integrates easily with Python ML frameworks like TensorFlow, letting you manage training data, preprocess it, and deploy models effectively.
- Streamlining ETL processes: When transferring data between systems, a Python script that connects to an Oracle database simplifies complex ETL workflows and keeps your data clean and reliable for analytics and regulatory compliance.
- Cross-system data integration: In modern tech environments, systems need to share data effortlessly. Python connects Oracle databases with CRMs, ERPs, and cloud platforms to ensure smooth operations and consistent data flow.

Over to you

Now that you understand how to connect to an Oracle database using Python, you can focus on building efficient workflows. The [Devart Python Connector for Oracle](https://www.devart.com/python/oracle/) makes database integration simple, powerful, and stress-free. It's easy to use, delivers great performance, and comes with useful features, so you can focus on creating solutions instead of wrestling with complicated setups. Whether you're building scalable apps, crunching data, or automating workflows, it gets the job done with reliable performance, solid security, and compatibility across platforms. It's a must-have for developers and data pros looking to make the most of [Oracle databases](https://blog.devart.com/oracle-tutorial) with Python. Don't let integration challenges slow you down.
Unlock the full potential of Python-Oracle integration with the Devart Python Connector, built to meet the demands of today's professionals and tomorrow's enterprises. Connect Python to Oracle in a few simple steps: [download the connector](https://www.devart.com/python/oracle/download.html), [explore licensing options](https://www.devart.com/python/oracle/ordering.html), and get started today!

Tags [connect Python to Oracle database](https://blog.devart.com/tag/connect-python-to-oracle-database) [python connectors](https://blog.devart.com/tag/python-connectors)

By [Dereck Mushingairi](https://blog.devart.com/author/dereckm): I'm a technical content writer who loves turning complex topics—think SQL, connectors, and backend chaos—into content that actually makes sense (and maybe even makes you smile). I write for devs, data folks, and curious minds who want less fluff and more clarity. When I'm not wrangling words, you'll find me dancing salsa, or hopping between cities.

"} {"url": "https://blog.devart.com/connect-salesforce-python.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Python Connectors](https://blog.devart.com/category/products/python-connectors) How to Connect to a Salesforce Database From Python With Devart Python Connector By [Dereck Mushingairi](https://blog.devart.com/author/dereckm) December 30, 2024

At first glance, integrating Python with Salesforce looks simple—install a library, connect to an API, and start querying data. But the reality is far more complex. Salesforce's API structure presents challenges that aren't immediately obvious. Developers must navigate polymorphic fields, hierarchical relationships, and varying data formats—such as numeric fields, date/time values, nested JSON, and binary objects—which often require careful transformations for compatibility. On top of that, you'll need to manage tokens, batch API calls, and optimize workflows for both real-time queries and bulk data migrations. Add in strict rate limits, and what seemed straightforward can quickly become a time sink.

The [Devart Python Connector for Salesforce](https://www.devart.com/python/salesforce/) makes your integration process easier. It simplifies setting up secure, high-performance connections using direct TCP/IP connectivity.
With full support for Python DB API 2.0, you can stick to workflows you already know, avoiding unnecessary compatibility issues. Plus, it handles diverse data types seamlessly, so you spend less time fixing errors and more time building something meaningful. Whether you're syncing millions of records with the Bulk API or fetching real-time updates via the REST API, this tool helps you handle even the most complex workflows. Instead of struggling with API quirks, you'll have the confidence to focus on creating smarter, more scalable solutions.

Table of contents

- Benefits of using Devart Python Connector for Salesforce
- What sets Devart Python Connector apart
- Prerequisites for connecting Python to Salesforce
- How to authenticate Python with Salesforce API
- Querying Salesforce data with Devart Connector

Benefits of using Devart Python Connector for Salesforce

Traditional Python Salesforce integration tools often complicate things with clunky client libraries, tricky API setups, and endless configuration steps that slow everything down. The Devart Python Connector changes the equation, giving developers a straightforward and reliable way to [connect Python and Salesforce](https://docs.devart.com/odbc/salesforce/python.htm). Here's how the Devart Python Connector makes integration faster, easier, and more effective.

Establish direct TCP/IP connectivity for streamlined performance

Traditional integrations rely on client libraries that add layers of complexity, increase configuration time, and risk version conflicts. The Devart Python Connector eliminates this bottleneck by establishing a direct Python-to-Salesforce connection.

- Faster performance: With no unnecessary dependencies, queries and transactions are processed with minimal latency.
- Enhanced reliability: Fewer components mean fewer points of failure, reducing errors and downtime.
- Quick setup: Skip installing additional software and get started in minutes.
For developers, this means less troubleshooting and more time to focus on solutions. For businesses, it means faster deployments and smoother operations without the usual roadblocks.

Use Python DB API 2.0 for familiar workflows

For a developer, learning a new tool shouldn't feel like starting from scratch. You need something that fits into your workflow, not something that forces you to relearn everything. The Devart Python Salesforce connector keeps things simple by fully adhering to the Python DB API 2.0 standard, so you can hit the ground running. Here's how it makes your life easier:

- Familiar syntax: Use the Python commands you already know to interact with Salesforce, like creating cursors, running queries, and fetching results.

```python
cursor.execute("SELECT Name, Email FROM Contact")
results = cursor.fetchall()
```

In this example, cursor.execute() runs a SQL-like query to retrieve Name and Email from Salesforce contacts, and cursor.fetchall() returns the resulting records.

- Easy integration: Work seamlessly with tools like Pandas and SQLAlchemy to manipulate and analyze your data without extra steps.

With the Devart Python Connector, you skip the frustration of unfamiliar syntax and jump straight into building solutions. It's built to keep you productive, so you can spend less time figuring things out and more time getting things done.

Simplify integration with REST and Bulk APIs for any use case

Integrating with Salesforce isn't one-size-fits-all. Sometimes you need real-time data for live updates; other times you're dealing with massive datasets that require efficiency at scale. The Devart Python Connector offers seamless support for both the REST and Bulk APIs, so you can choose the best approach for your needs.

- Bulk API: Perfect for heavy lifting, like migrating millions of records or syncing large datasets without overloading the system.
Developers can use the Bulk API to reduce the number of API calls, boosting performance for high-volume operations.

- REST API: Ideal for quick, dynamic tasks like updating dashboards, retrieving customer details during a live chat, or running on-the-fly data queries.

This dual compatibility ensures that the Devart Salesforce Python connector delivers the performance and scalability required, regardless of the complexity or size of your project.

Secure your data with enterprise-grade protection

When working with sensitive customer data, security is non-negotiable. A single breach can result in regulatory fines, reputational damage, and lost trust. The Devart Python Connector is designed with enterprise-grade security features to keep your Salesforce integrations safe and compliant with data protection standards and Salesforce's security protocols.

- SSL encryption: Protects data during transmission, preventing unauthorized access or interception.
- SSH tunneling: Adds an extra layer of protection, ideal for environments with higher network vulnerabilities.
- Regular updates: Ensures compliance with Salesforce's latest security protocols so your integration stays robust over time.

With these safeguards in place, you can focus on innovation without worrying about vulnerabilities. The Devart Python Connector gives you the confidence to handle sensitive data securely, whether you're scaling operations or running everyday workflows.

Scale efficiently with growth-focused features

High-demand scenarios like Black Friday or massive data migrations can push your systems to the limit. To keep up with growth and rising workloads, you need a tool that not only handles the load but supports seamless scaling. The Devart Python Connector is designed for growing enterprises, delivering consistent performance and reliability when it matters most.
- Bulk data handling: Insert, update, or delete thousands of records in a single operation, significantly reducing processing time and minimizing errors.
- Smart error management: Advanced error detection ensures smooth operations, even during complex workflows.
- Continuous optimization: Nightly builds and updates keep the connector ready for evolving enterprise needs.

Whether you're managing millions of transactions or syncing critical datasets, the Devart Python Connector enables growth with confidence. It's built to handle demanding environments, helping you scale seamlessly and avoid bottlenecks.

What sets Devart Python Connector apart

While there are other tools for Salesforce Python integration, the Devart Python Connector distinguishes itself with a unique combination of features:

- Direct connectivity: Faster and more reliable than traditional client libraries.
- Developer-centric design: Familiar Python syntax means less time spent learning and more time building.
- Enterprise-grade features: Built-in security, bulk data handling, and error management meet the demands of high-volume workflows.
- Flexibility: Full support for both the REST and Bulk APIs makes it suitable for projects of any scale.

By solving common integration pain points, the Devart Python Connector turns a traditionally challenging process into a smooth, efficient workflow.

Prerequisites for connecting Python to Salesforce

Before integrating Python with Salesforce using the Devart Python Connector, you must ensure that both your Salesforce account and Python environment are appropriately configured. This section provides step-by-step instructions for setting up a Salesforce account for API access and installing the Devart Python Connector in your Python environment.

Setting up a Salesforce account

To access Salesforce data programmatically, you need to configure your Salesforce account for API access.
This involves creating a Salesforce developer account, enabling API access, and setting up a connected app. Follow these steps to get started.

Step 1: Create a Salesforce developer account

If you don't have one, sign up at [Salesforce Developer](https://developer.salesforce.com/) to access the tools needed for integration.

Step 2: Enable API access

1. Log in to Salesforce and go to Setup > Users > Profiles.
2. Select your user profile (e.g., System Administrator).
3. Ensure the API Enabled checkbox is selected under Administrative Permissions and save the changes.

Step 3: Set up a connected app

This step allows your Python application to securely authenticate with Salesforce:

1. Navigate to Setup > App Manager, and click New Connected App.
2. Enter the App Name, for example: PythonIntegrationApp.
3. Configure OAuth settings: enable OAuth, set the callback URL to http://localhost:8000/callback, and add the relevant scopes (e.g., Access and manage your data (api)). This configuration is essential for secure and efficient integration.
4. Save the app. Salesforce may take a few minutes to process your configuration.

Step 4: Obtain API credentials

After saving, note the Consumer Key and Consumer Secret from your connected app's settings. If your account uses username-password authentication, generate a security token by going to Setup > My Personal Information > Reset My Security Token. Salesforce will email the token to you.

With these steps completed, your Salesforce account is ready for integration.

Installing Devart Python Connector

Installing the connector is a straightforward process. Follow the steps described below to proceed with the installation.
Step 1: Verify your environment

Before installing the connector, ensure your Python environment is ready.

Check the Python version: Make sure Python 3.x is installed by running:

```shell
python --version
```

Ensure pip is installed: Verify that pip (Python's package manager) is available:

```shell
python -m ensurepip --upgrade
```

If Python or pip isn't installed, refer to the official [Python installation guide](https://www.python.org/downloads/).

Step 2: Download and install the Devart Python Connector

Use pip to install the connector or download it directly from the [Devart website](https://www.devart.com/python/salesforce/download.html).

Step 3: Test the installation

A simple test ensures the connector is properly installed and functional. Use the following script to validate the setup by connecting to Salesforce and running a query.

```python
import devart

# Replace with your Salesforce credentials
conn = devart.connect(
    consumer_key="your_consumer_key",
    consumer_secret="your_consumer_secret",
    username="your_username",
    password="your_password" + "your_security_token"
)

cursor = conn.cursor()
cursor.execute("SELECT Name FROM Account LIMIT 5")
for row in cursor.fetchall():
    print(row)
```

How to authenticate Python with Salesforce API

Authentication is a critical step in [integrating Python with Salesforce](https://www.devart.com/python/salesforce/). There are two main Python Salesforce API authentication methods: OAuth for secure, scalable integrations and security tokens for simpler setups.

Authentication using OAuth

OAuth is the recommended method for production environments. It ensures security and scalability, making it ideal for enterprise-grade integrations where exposing credentials isn't an option.
Steps for OAuth authentication

Step 1: Set up a connected app in Salesforce

Ensure that you've already created a connected app as outlined in the [Setting up a Salesforce account](https://developer.salesforce.com/signup) section. The connected app provides the Consumer Key, Consumer Secret, and Callback URL required for OAuth authentication.

Step 2: Generate an access token

Use Python to generate an access token by sending a POST request to Salesforce's OAuth token endpoint.

```python
import requests

# Salesforce OAuth token endpoint
oauth_url = "https://login.salesforce.com/services/oauth2/token"

# Define your credentials
payload = {
    "grant_type": "password",
    "client_id": "your_consumer_key",         # Consumer Key from the connected app
    "client_secret": "your_consumer_secret",  # Consumer Secret from the connected app
    "username": "your_salesforce_username",   # Your Salesforce username
    "password": "your_password_and_token"     # Password + security token
}

# Send the request
response = requests.post(oauth_url, data=payload)

# Parse the response
if response.status_code == 200:
    access_token = response.json().get("access_token")
    instance_url = response.json().get("instance_url")
    print("Access Token:", access_token)
    print("Instance URL:", instance_url)
else:
    print("Error:", response.json())
```

Replace the placeholders with your actual credentials. The access_token and instance_url will be used for subsequent Salesforce API calls.

Step 3: Use the access token

Include the access token in the Authorization header for every API request. The following example demonstrates how to use it with the Devart Python Connector.
```python
connection_params = {
    "host": "your_instance_url",   # Instance URL from the OAuth response
    "access_token": access_token,  # Access token from the OAuth response
    "database": "salesforce"
}

connection = devart.connect(**connection_params)
print("Authenticated via OAuth successfully!")
```

Considerations for OAuth

- Security: Store your Consumer Key, Consumer Secret, and access tokens securely (e.g., in environment variables or a secrets manager).
- Token expiry: Access tokens have a limited lifespan. Use Salesforce's refresh token mechanism to obtain new tokens without re-authenticating.

Using security tokens for authentication

For simpler setups or development environments, you can authenticate using Salesforce's security tokens.

Steps for security token authentication

Step 1: Obtain a security token

1. Log in to Salesforce.
2. Navigate to Setup > My Personal Information > Reset My Security Token.
3. Click Reset Security Token, and Salesforce will email a new token to your registered email address.

Step 2: Authenticate using username and security token

When using the Devart Python Connector, append the security token to your Salesforce password.

```python
connection_params = {
    "host": "your_salesforce_instance.salesforce.com",
    "user": "your_username",
    "password": "your_password" + "your_security_token",
    "database": "salesforce"
}

connection = devart.connect(**connection_params)
print("Authenticated using security token successfully!")
```

Step 3: Test the connection

Verify the connection by running a simple query.

```python
cursor = connection.cursor()
cursor.execute("SELECT Id, Name FROM Account")
for row in cursor.fetchall():
    print(row)
```

Considerations for security tokens

- Development-only use: Security token authentication is fine for testing or small-scale projects but should not be used in production environments due to its lower security level.
- Token reset: Resetting your security token invalidates the previous one, so update your scripts or configuration when this happens. For better security, store security tokens in an encrypted format and update configurations immediately after token resets.

Summary of authentication methods

| Feature | OAuth | Security token |
| --- | --- | --- |
| Security level | High (recommended for production) | Moderate (suitable for development) |
| Ease of use | Requires connected app setup | Quick setup, no app required |
| Best use case | Large-scale, secure integrations | Quick testing and small-scale projects |
| Credential expiry | Tokens expire but can be refreshed | Token resets need manual updates |

With authentication configured, you're now ready to query Salesforce data using the Devart Python Connector.

Querying Salesforce data with Devart Connector

Once authenticated, the Devart Python Connector makes querying Salesforce data as simple and efficient as querying any relational database. By combining the flexibility of Python with the structured capabilities of [Salesforce Object Query Language (SOQL)](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm), developers can extract valuable insights and manipulate data directly within their Python applications.

Basic queries to fetch data

With its SQL-like syntax, the Devart Python Connector allows you to query Salesforce objects easily. Whether you need real-time data for dashboards or filtered results for analytics, SOQL makes it straightforward to extract and manipulate Salesforce records.
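Because the connector follows DB API 2.0, query results come back as plain tuples, and the cursor's description attribute gives you the column names, which makes it easy to turn rows into dictionaries. A generic sketch of that idiom, using sqlite3 as a stand-in driver (any DB API 2.0 cursor, including the Devart connector's, exposes the same interface; the table and data here are illustrative):

```python
import sqlite3

# Stand-in driver: any DB API 2.0 cursor works the same way
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE Contact (FirstName TEXT, LastName TEXT, Email TEXT)")
cursor.execute("INSERT INTO Contact VALUES ('Ada', 'Lovelace', 'ada@example.com')")

cursor.execute("SELECT FirstName, Email FROM Contact")
columns = [col[0] for col in cursor.description]  # column names from the cursor
records = [dict(zip(columns, row)) for row in cursor.fetchall()]
print(records)  # [{'FirstName': 'Ada', 'Email': 'ada@example.com'}]
connection.close()
```

A list of dictionaries like this drops straight into tools such as Pandas (pd.DataFrame(records)) for further analysis.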
Example: Fetching data from the Contact object

```python
import devart

# Define connection parameters
connection_params = {
    "host": "your_salesforce_instance.salesforce.com",
    "user": "your_username",
    "password": "your_password_with_security_token",
    "database": "salesforce"
}

# Establish connection
connection = devart.connect(**connection_params)

# Create a cursor for executing queries
cursor = connection.cursor()

# Simple SELECT query
cursor.execute("SELECT Id, FirstName, LastName, Email FROM Contact")

# Fetch and display results
for record in cursor.fetchall():
    print(record)
```

Key points:

- SQL-like syntax: The connector lets you query Salesforce objects using familiar SQL commands like SELECT, WHERE, and ORDER BY.
- Data fetching: The fetchall() method retrieves all results as a list of tuples for easy manipulation in Python.

Example: Filtering results with a WHERE clause

You can filter data to retrieve specific records.

```python
cursor.execute("""
    SELECT Id, Name, Phone
    FROM Account
    WHERE Industry = 'Technology'
""")

# Print filtered accounts
for account in cursor.fetchall():
    print(account)
```

Here, the query fetches accounts in the "Technology" industry, helping you drill down into specific data subsets.

Example: Sorting and limiting results

For better performance or specific use cases, you can limit the number of results and sort data.

```python
cursor.execute("""
    SELECT Name, AnnualRevenue
    FROM Account
    ORDER BY AnnualRevenue DESC
    LIMIT 10
""")

# Top 10 accounts by revenue
for account in cursor.fetchall():
    print(account)
```

This query retrieves the 10 accounts with the highest annual revenue.

Advanced querying techniques

For more complex queries, SOQL offers powerful features such as relationship queries, grouping, and aggregate functions.
SOQL queries are executed directly on Salesforce servers, minimizing latency and optimizing performance for complex data operations. The Devart Python Connector fully supports SOQL, enabling advanced data interactions.

Example: Querying related objects

SOQL allows you to query related objects in a single statement, for instance, retrieving accounts and their associated contacts.

```python
cursor.execute("""
    SELECT Id, Name,
          (SELECT FirstName, LastName
            FROM Contacts)
    FROM Account
    WHERE Industry = 'Finance'
""")

# Print account names with their contacts
for account in cursor.fetchall():
    print(account)
```

This query fetches accounts in the "Finance" industry along with their associated contacts, demonstrating SOQL's ability to handle hierarchical relationships.

Example: Aggregating data with SOQL

Aggregate functions like COUNT, SUM, and AVG let you summarize data directly in Salesforce.

```python
cursor.execute("""
    SELECT Industry, COUNT(Id)
    FROM Account
    GROUP BY Industry
""")

# Print the account count per industry
for industry in cursor.fetchall():
    print(industry)
```

This query groups accounts by industry and counts the number of accounts in each category.

Example: Combining filters and aggregates

For more refined queries, combine filters and aggregate functions.

```python
cursor.execute("""
    SELECT Owner.Name, SUM(Amount)
    FROM Opportunity
    WHERE StageName = 'Closed Won'
    GROUP BY Owner.Name
    ORDER BY SUM(Amount) DESC
""")

# Print total sales by owner
for owner in cursor.fetchall():
    print(owner)
```

This query retrieves the total sales amount for each opportunity owner, sorted in descending order.

Example: Querying data with date fields

SOQL also allows querying using date functions and fields.
cursor.execute(\"\"\" \n    SELECT Name, CloseDate, Amount  \n    FROM Opportunity  \n    WHERE CloseDate >= LAST_N_DAYS:30 \n\"\"\") \n \n# Print opportunities closed in the last 30 days \nfor opportunity in cursor.fetchall(): \n    print(opportunity) This query fetches opportunities that closed within the last 30 days, making it useful for time-sensitive reports. Key advantages of querying with Devart Connector Efficiency : SOQL queries are executed directly on Salesforce servers, reducing the overhead on your local application. Flexibility : The connector supports a wide range of SOQL features, enabling both simple and complex queries. Integration with Python : Results are returned in Python-friendly formats, making it easy to process and analyze data using libraries like Pandas. The Takeaway Salesforce Python integration has never been easier or more efficient than with the Devart Python Connector . By providing a direct, secure, and seamless connection between Python applications and Salesforce databases, this tool eliminates the complexities traditionally associated with data integration. The Devart Python Connector bridges the gap between Salesforce’s robust capabilities and Python’s versatility, empowering you to streamline workflows, automate tasks, and unlock new possibilities in data analysis, AI, and seamless integration with other platforms. Whether you’re a data scientist, developer, or enterprise architect, our Python connector simplifies your complex processes, boosts efficiency, and drives innovation, helping you focus on achieving your goals and growing your business. Ready to simplify your Salesforce-Python workflows? [Order your Devart Python Connector today](https://www.devart.com/python/salesforce/ordering.html) and unlock the power of streamlined data integration. 
Tags [how to](https://blog.devart.com/tag/how-to) [Products](https://blog.devart.com/tag/products) [python connectors](https://blog.devart.com/tag/python-connectors)

"} {"url": "https://blog.devart.com/connect-to-aiven-database-using-gui-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) How to Connect to Your Aiven Database Using dbForge GUI Tools: A Step-by-Step Guide By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) May 29, 2023

The well-known as-a-service business model has long encompassed all kinds of digital solutions, and databases are no exception. In this article, we'll give an overview of Aiven, a good example of a cloud service that offers, among other things, fully managed MySQL and PostgreSQL databases. Additionally, we'll show you how to get started with Aiven and suggest a [few GUI tools](https://blog.devart.com/top-10-mysql-gui-tools-for-database-management-on-windows.html) that will make your work with it a breeze.

Contents

- Cloud databases vs traditional databases
- What is Aiven?
- The advantages of Aiven
- Aiven's integrations, connectors, and extensions
- How to create a cloud database service on Aiven
- How to connect to a MySQL database on Aiven
- How to connect to a PostgreSQL database on Aiven
- Your universal solution for managing multiple database systems

Cloud databases vs traditional databases

Let us start by briefly outlining the differences between traditional and cloud databases.
The key difference is rather simple: while traditional databases are typically hosted and maintained by a company directly on its premises, cloud databases are hosted (unsurprisingly) in the cloud. The deployment, configuration, maintenance, security, and scaling of cloud databases are the typical responsibilities of their vendor; you only need to pay for all that—yes, you got that right—as a service. On the other hand, you don't need to allocate extra resources to purchase, install, and maintain hardware, you don't need to worry about storage, and you don't need to hire extra people; you simply rent a database and get yourself a monthly operational expense. And whenever your databases require scaling, you can get it done quickly, and your users won't even notice. That makes for a highly flexible, reliable, and accessible database system; the rest depends on your budget and the specifics of your company.

What is Aiven?

Aiven is one such vendor of cloud-based open-source databases, offering an all-in-one cloud data platform alongside all the tools one might need to keep full control over databases and auxiliary data services. The available cloud providers include Amazon Web Services, Google Cloud, Microsoft Azure, DigitalOcean, and UpCloud. As for the supported databases and data services, the choice is rich as well, comprising Apache Kafka, Apache Flink, Apache Cassandra, ClickHouse, M3, OpenSearch, Redis, InfluxDB, Grafana—and, most importantly, MySQL and PostgreSQL databases.
The advantages of Aiven The advantages of Aiven are rather typical of your average cloud database vendor: Managed open-source databases Multi-cloud compatibility Easy integration with multiple connectors Unlimited scalability Enhanced security High availability World-class support Simplified database management and maintenance Aiven’s integrations, connectors, and extensions And just in case, here’s a concise list of Aiven’s available integrations for monitoring, orchestration, and data integration: Metrics: Grafana, Datadog, Prometheus, and Jolokia Logs: Aiven for OpenSearch, Aiven for Apache Kafka, Rsyslog custom endpoint, AWS CloudWatch, GCP Cloud Logging, Datadog logs, as well as external Elasticsearch and OpenSearch integrations 30+ Apache Kafka connectors and auxiliary tools, including Karapace Clickhouse extensions for Apache Kafka, PostgreSQL, and Grafana Internal orchestration tools, which include Aiven Console, Aiven CLI, Aiven REST API, Aiven Provider for Terraform, and Aiven Operator for Kubernetes 70+ PostgreSQL extensions How to create a cloud database service on Aiven Without further ado, let’s create a database service on Aiven. Since you can get started for free, it won’t be a problem. First off, [proceed to the signup page](https://console.aiven.io/signup) and create an Aiven account. Once you do it, you’ll be able to create a service. Create a service Once you click Create service , you will need to select one. In our case, we select MySQL. Choose a plan Here, you can either opt for a free plan with one free managed MySQL service, or go for a full 30-day trial. If you opt for the latter, you can select a preferred cloud provider. Select a service region Then scroll down to select a preferred service region. Select a service plan Scroll further down to select a preferred service plan for your trial. Note that if you go with a free plan, you’ll have it selected by default. 
Provide a service name Finally, you can add extra disk storage and enter a name for your service. Afterwards, you can review the service summary on the right and click Create service and start trial (or, if you’ve opted for a free plan, click Create free service ). Connection details Your next page is Connection details , where you can configure your credentials. Note that SSL mode will be required later on to connect to your Aiven service. After you finish setting it all up, click Next: Secure connection to proceed. Secure connection This is where you can restrict access to allow only trusted IP addresses to connect to your service. Once you’re ready, click Next: Add data . Add data On the final page, you can migrate an external database to your newly created service. You can either click Set up migration to do it or click Finish setup to proceed to your Aiven console. Extensions Note that if you start creating a PostgreSQL service, you’ll have an additional page called Extensions , where you’ll be able to add a number of functional extensions supported by Aiven. Now you’ve got your Aiven service up and running! How to connect to a MySQL database on Aiven Now it’s time to establish a connection to the service. There’s a nice [MySQL GUI tool](https://www.devart.com/dbforge/mysql/studio/) that will help you with it, and it’s called dbForge Studio for MySQL. We’ll expand a bit on its capabilities later on; and now, you only need to launch it. Once you do that, you’ll be greeted with the Database Connection Properties dialog that will help you connect to the newly created service using the credentials previously indicated in Connection details . As we mentioned, you’ll need to use SSL to connect to your Aiven service. To do that, go to the Security tab of the Database Connection Properties dialog, select the Use security protocol checkbox, and make sure SSL is selected as well. After that, simply click Connect . And so you are connected to your new service! 
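If you ever need to script the same connection outside the Studio, the values from the Connection details page map onto the standard mysql client flags. A minimal sketch with hypothetical placeholder values (your actual host, port, user, and database come from your own Aiven console):

```shell
# Hypothetical Aiven connection details -- replace with the values
# shown on your service's Connection details page.
HOST="mysql-demo.a.aivencloud.com"
PORT=12345
USER_NAME="avnadmin"
DB="defaultdb"

# Aiven requires SSL, so the client must be told to use it.
CMD="mysql --host=$HOST --port=$PORT --user=$USER_NAME --password --ssl-mode=REQUIRED $DB"
echo "$CMD"
```

Running the printed command prompts for the password and opens an encrypted session, the CLI counterpart of ticking Use security protocol in the Studio.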
The advantages of dbForge Studio for MySQL [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a universal IDE for the development, management, and administration of MySQL and MariaDB databases, and it delivers a rich set of [features](https://www.devart.com/dbforge/mysql/studio/features.html) and versatile [compatibility](https://www.devart.com/dbforge/mysql/studio/database-connections.html) options that [make it one of the top solutions on the market](https://blog.devart.com/top-10-reasons-why-users-choose-dbforge-studio-for-mysql.html) . To give you a clear picture of the Studio, here’s a list of what it’s capable of: Visualization of database structures on ER diagrams IntelliSense-like code completion, formatting, and refactoring Debugging of T-SQL scripts, stored procedures, triggers, and functions Visual query building that involves no coding Versatile data management Comparison and synchronization of database schemas and table data Data aggregation in visual pivot tables Generation of customizable data reports Query performance optimization Generation of realistic test data Generation of database documentation Flexible database administration and maintenance Availability on Windows, Linux, and macOS (the latter two options are enabled via a compatibility solution called CodeWeavers CrossOver) That said, we gladly invite you to [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) , which is a nice opportunity to get some firsthand experience with all of the abovementioned features. How to connect to a PostgreSQL database on Aiven Now let us show you how to connect to a PostgreSQL database using a similar [PostgreSQL GUI client](https://www.devart.com/dbforge/postgresql/studio/) —yet another Studio that will ask you to connect to a database right on launch. Similarly, you’ll have to enable Use SSL protocol on the Security tab. 
Additionally, you will need to download the CA certificate from your Aiven console and specify the path to it in the Authority certificate field. After it’s done, click Connect . That’s it! The connection has been successfully established. The advantages of dbForge Studio for PostgreSQL [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is an IDE that effectively covers all the essentials of database development and management and delivers [compatibility](https://www.devart.com/dbforge/postgresql/studio/database-connections.html) that is just as diverse as the one offered by its MySQL counterpart. The feature sets of both Studios also have quite a lot in common: IntelliSense-like code completion, formatting, and instant syntax validation Versatile data editing, import, and export Comparison and synchronization of database schemas and table data Data aggregation in visual pivot tables Generation of customizable data reports Query performance optimization Generation of realistic test data Availability on Windows, Linux, and macOS (the latter two options are enabled via a compatibility solution called CodeWeavers CrossOver) And similarly, if you are a PostgreSQL user, we invite you to [download it for a free 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) and see it in action. Your universal solution for managing multiple database systems Both of the abovementioned Studios come separately and as part of [dbForge Edge](https://www.devart.com/dbforge/edge/) , a solution comprising four universal database IDEs that help you handle a diversity of development, management, and administration tasks across MySQL, MariaDB, PostgreSQL, Microsoft SQL Server, Oracle, and a number of other databases and cloud services. Download dbForge Edge for a free 30-day trial today! 
You can get started with dbForge Edge right away—simply [download it for a free month-long trial](https://www.devart.com/dbforge/edge/download.html) and get acquainted with the full power of its capabilities. We bet it won’t leave you indifferent.

Tags: [Aiven](https://blog.devart.com/tag/aiven) [connect to database](https://blog.devart.com/tag/connect-to-database) [connect to mysql](https://blog.devart.com/tag/connect-to-mysql) [connect to postgresql](https://blog.devart.com/tag/connect-to-postgresql) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [MySQL](https://blog.devart.com/tag/mysql) [PostgreSQL](https://blog.devart.com/tag/postgresql)

[Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters): writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words.

Connect to PostgreSQL Database Using psql, pgAdmin, and PostgreSQL Client Tool
By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) | December 6, 2024

This article will be useful for those who have already installed PostgreSQL on personal computers. In case you still haven’t, [this blog post](https://blog.devart.com/download-install-postgresql-on-windows.html) will come in handy. Now, we will focus on the most popular ways of accessing the PostgreSQL database on Windows. We will describe the PostgreSQL connection options using psql, pgAdmin, and the fast and convenient [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) by Devart. Moreover, here you will find instructions on connecting to local and remote PostgreSQL servers.
Contents
- Connecting to PostgreSQL databases using psql
- How to connect to a local database
- How to connect to a remote database
- How to exit psql
- Common commands in psql
- Connecting to PostgreSQL databases with pgAdmin
- How to use pgAdmin to set up local databases
- How to connect to remote databases
- Connecting to PostgreSQL from dbForge Studio for PostgreSQL
- Most frequent connection errors in PostgreSQL
- Conclusion

Connecting to PostgreSQL databases using psql

psql is a command-line utility that comes by default with any PostgreSQL installation and allows you to connect to PostgreSQL and execute various commands to create and manage Postgres databases. After installing PostgreSQL, you can access this database terminal by searching for it in the Windows search bar. Just type psql and switch to the Apps section. To connect to the Postgres database from this SQL Shell app, you need the following credentials: Server, Database, Port, Username, and Password.

How to connect to a local database

If the database is hosted on your personal computer, you can connect to localhost. As we have mentioned earlier, it requires Server, Database, Port, Username, and Password. The application suggests the default values right away:

Server [localhost]:
Database [postgres]:
Port [5432]:
Username [postgres]:
Password for user postgres:

If you simply press Enter without entering any values, they will be filled in automatically with the defaults. You will still have to enter the password, though.

Note: You will not see any characters while entering the password; this is how psql protects your security. Just keep typing and press Enter. In a couple of seconds, you will get connected to Postgres locally.

How to connect to a remote database

In case you are using several servers, and some of them are remote, you can use psql to connect to remote servers and work with databases on them from your local machine.
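The interactive prompts can also be bypassed entirely: the same values map onto psql's command-line flags, or onto a single libpq connection URI. A minimal sketch using the default local values (the password is still prompted for):

```shell
# Default local values, matching what SQL Shell suggests in brackets.
HOST="localhost"
PORT=5432
USER_NAME="postgres"
DB="postgres"

# One libpq URI carries everything except the password:
CONN_URI="postgresql://${USER_NAME}@${HOST}:${PORT}/${DB}"
echo "$CONN_URI"

# Equivalent invocations:
#   psql -h "$HOST" -p "$PORT" -U "$USER_NAME" -d "$DB"
#   psql "$CONN_URI"
```

For a remote server, only HOST (a hostname or IP address) and the credentials change; the invocation stays the same.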
Actually, the only difference in the connection process is the necessity to enter the credentials for that remote server and the database.

Note: You can enter either a hostname or an IP address for Server. Before connecting to a remote PostgreSQL server, it’s important to ensure that the server is [configured to allow external connections](https://blog.devart.com/configure-postgresql-to-allow-remote-connection.html).

How to exit psql

The psql utility is a very lightweight and quick tool, and its commands are short. The simplest way to exit the psql utility is the following command:

\q

Note: After exiting the psql tool, you need to perform all the steps to connect to PostgreSQL anew. Therefore, make sure to complete the tasks you worked on and exit the database before quitting the psql tool.

Common commands in psql

psql offers a broad range of commands to manipulate PostgreSQL databases. To list all databases hosted on the server, execute the following command:

\l

To connect to any particular database, use this command:

\c

To [list tables in a PostgreSQL database](https://www.devart.com/dbforge/postgresql/studio/postgres-list-all-tables.html), you can use the \d command to get the basic information or the \dt+ command for detailed information about the tables (adding + to a command gives you additional information in the output). To describe any specific table, execute:

\d tablename

You can use psql to [list schemas in PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/postgres-list-schemas.html), [review Postgres functions](https://www.devart.com/dbforge/postgresql/studio/postgresql-list-functions.html), and in many other tasks. To see more commands, [click here](https://www.postgresql.org/docs/13/app-psql.html).

Connecting to PostgreSQL databases with pgAdmin

pgAdmin is the free community client for PostgreSQL that is usually installed along with PostgreSQL.
While psql is a plain command-line tool, pgAdmin is a graphical user interface (GUI) that provides pretty much the same functionality in a visual mode. To find the pgAdmin application on your computer, use the Windows search bar under the Apps section: How to use pgAdmin to set up local databases Click Add New Server . In the dialog box that opens, give that server a name on the General tab and proceed to the Connection tab. Enter your hostname and password and click Save . After that, you will see the newly-added server with the databases hosted there in the Object Explorer pane: How to connect to remote databases Connecting to a remote database in pgAdmin is similar to connecting to a local one. You need to add a new server, give it a name, and enter the connection details into the corresponding fields. Click Save , and the remote server will be added. Connecting to PostgreSQL from dbForge Studio for PostgreSQL [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is a multi-featured integrated development environment (IDE) designed for PostgreSQL and Amazon Redshift specialists and a powerful alternative to pgAdmin. A robust toolset covers all standard tasks of database development and management along with data analysis and reporting, delivering all the functionality in a neat visual interface for the user’s convenience. The Studio offers a comprehensive set of features for Postgres specialists of any skill level, all within a single IDE. 
An intuitive [SQL editor and formatter](https://www.devart.com/dbforge/postgresql/studio/postgresql-formatter.html), advanced tools for [database comparison and synchronization](https://www.devart.com/dbforge/postgresql/studio/database-synchronization.html) (covering both database schemas and table data), and professional [data migration](https://www.devart.com/dbforge/postgresql/studio/data-export-import.html), as well as [data analysis and reporting](https://www.devart.com/dbforge/postgresql/studio/key-features.html#analysis) capabilities, improve every aspect of your workflow. Additionally, advanced customization and automation options allow users to get rid of manual routine tasks, enhance overall productivity, and save costs significantly. With dbForge Studio for PostgreSQL, connecting to the server takes a couple of clicks. Open the Studio and click New Connection. Enter the connection properties: Host, Port, User, and Password. Then, choose the desired database from the drop-down menu, and click Connect.

Most frequent connection errors in PostgreSQL

Connecting to PostgreSQL is typically straightforward when all necessary details are provided. However, errors can occasionally occur due to common issues, which are often easy to resolve. Below, we review the most frequent connection errors and how to fix them.

1. No such file or directory
This error occurs when the local server is not running, or the hostname or port is specified incorrectly.
Fix: Verify that the PostgreSQL server is running and that the hostname and port (default is 5432) are correct. The port must be open and accessible.

2. Connection refused
This error indicates the PostgreSQL server is not responding to a network connection attempt. It often occurs if the server is down, listening on a different port, or blocked by a firewall.
Fix: Check the server status, confirm the port settings, and ensure the specified port matches the configuration. You may need to review the server logs and address any network-related issues, such as firewall blocks.

3. Authentication failed
This error, commonly shown as “authentication failed for user…,” arises when the provided username or password is incorrect.
Fix: Double-check the connection credentials and ensure the username and password match the PostgreSQL user configuration.

4. Database does not exist
This error indicates that the specified database does not exist on the server.
Fix: Verify that the database name is correct. You may need to create the database if it does not already exist. Alternatively, omit the default database during the initial connection process: connect to the server and use the \l command to list all available databases, then use the \c database_name command to connect to a specific database.

5. Server terminated the connection
The server unexpectedly closes the connection, which signals a critical error.
Fix: Check the PostgreSQL logs to identify the root cause. If necessary, contact the system administrator for further assistance.

6. Too many clients already
This error occurs when the maximum allowed number of connections to the database is exceeded. The limit is controlled by the max_connections parameter.
Fix: Review the current connection count and terminate unnecessary connections if possible. If needed, adjust the max_connections parameter value in the PostgreSQL configuration file to allow more connections.

Common errors can be avoided with proactive checks and proper server configuration. By understanding the root cause of each issue, you can ensure a smooth and reliable connection to your PostgreSQL database.

Conclusion

We have described different ways of connecting to local and remote PostgreSQL databases. SQL Shell (psql) is an easy way to connect and manipulate a database by means of command-line queries.
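For the "too many clients" error, the check can be scripted. A sketch of the two diagnostic queries, shown as non-interactive psql one-liners (they assume a reachable server and sufficient privileges; the SQL itself is standard PostgreSQL):

```shell
# The configured connection limit vs. the current number of backends.
CHECK_LIMIT="SHOW max_connections;"
CHECK_COUNT="SELECT count(*) FROM pg_stat_activity;"

# Run each through psql non-interactively; -At strips headers and
# alignment, which is handy for scripting.
echo "psql -U postgres -At -c \"$CHECK_LIMIT\""
echo "psql -U postgres -At -c \"$CHECK_COUNT\""
```

If the second number approaches the first, terminate idle sessions or raise max_connections as described above.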
As for pgAdmin, it might be more convenient for those who don’t wish to memorize tons of commands but would like to work in a more user-friendly environment. Finally, [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is a perfect combination of an intuitive graphical interface and an augmented command line. You can try dbForge Studio for PostgreSQL under full workload using the [fully functional free trial](https://www.devart.com/dbforge/postgresql/studio/download.html) that is provided for 30 days. Download the software, install it, and explore all its capacities in your daily workflows! If your daily tasks involve working with multiple database management systems—a common scenario for most organizations, as relying on a single DBMS is often impractical—Devart provides a [powerful multi-database solution](https://www.devart.com/dbforge/edge/) : dbForge Edge . It handles database tasks in Microsoft SQL Server, MySQL, MariaDB, Oracle, and PostgreSQL, offering robust functionality for all database routines, from generating high-quality code to managing version control. Tags [command line](https://blog.devart.com/tag/command-line) [connect to database](https://blog.devart.com/tag/connect-to-database) [connect to postgresql](https://blog.devart.com/tag/connect-to-postgresql) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [postgresql tools](https://blog.devart.com/tag/postgresql-tools) [PostgreSQL Tutorial](https://blog.devart.com/tag/postgresql-tutorial) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms. 
How to Connect to Azure SQL Database
By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) | November 17, 2023

The shift from on-premises to cloud services is a growing trend due to their cost-efficiency, scalability, and speed. Also, cloud providers handle platform maintenance.
This means users can focus on their applications without concerns about availability, security, updates, patches, or backups. [Azure SQL](https://blog.devart.com/what-is-azure-sql.html) is one of the most popular cloud platforms: a relational database-as-a-service built on the SQL Server engine and hosted in the Azure cloud. It allows you to easily migrate any applications developed with SQL Server to the cloud and keep working on them with familiar tools and resources. The Azure portal has its own query editor, but if you want to apply other tools, like SQL Server Management Studio (SSMS), you can easily connect to Azure SQL Database using this application or other compatible apps. This article will explore connecting to Azure SQL Database with on-premises applications.

Contents
- Before we start: the Azure connection credentials and the firewall
- How to retrieve the Azure connection credentials
- How to configure the server firewall
- Connect to Azure using SQL Server Management Studio
- Connect to Azure using dbForge Studio for SQL Server
- Connecting with Azure Active Directory authentication
- Connect to Azure using Visual Studio
- Connect to Azure using Power BI
- Connect to Azure using PowerShell
- Conclusion

Before we start: the Azure connection credentials and the firewall

To connect to Azure from on-premises applications, you need an active Azure subscription and a database in the cloud. Then you can obtain the login credentials to use when connecting to Azure from the on-premises applications.

How to retrieve the Azure connection credentials

If you have never used Azure SQL, start by [creating a free account](https://azure.microsoft.com/free/). Azure Database provides a test database, AdventureWorksLT, that you can use as a sample.
However, first, you need to [create a single database](https://learn.microsoft.com/en-us/azure/azure-sql/database/single-database-create-quickstart?view=azuresql&tabs=azure-portal#create-a-single-database) to deploy that sample database and access its schema and data. Microsoft provides detailed instructions, so the task takes a couple of minutes. Azure Database is a paid service, but it provides a free trial for 30 days, so you can try the functionality and evaluate it appropriately with that test database and other databases you’d like to work with. When the empty database is created and the test AdventureWorksLT database is deployed, you get it as mySampleDatabase in your Azure account. The username and password you set during the database creation process are the login details you require to connect to Azure SQL Database from other applications. In the Overview section, you will see the fully qualified server name next to Server name on the top: These details allow you to connect to the Azure SQL Database from other applications. However, there is one more mandatory step left – you need to configure the firewall. How to configure the server firewall Connections to Azure SQL Databases from outside the Azure environment are blocked by default. Azure establishes a server-level firewall for these databases. Therefore, to allow external access, you must set up a firewall rule specifying the permitted IP addresses or IP address ranges. In your Azure account, navigate to the SQL databases section and then select mySampleDatabase . Click Set server firewall . By default, the platform shows the networks and existing firewall rules. Click Add your client IP to configure a new firewall rule, and save the changes. This rule ensures that Port 1433 (the server listens on this port) is open for the specified IP addresses. Also, you can configure any firewall rules for your environment by clicking Add a firewall rule. 
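Once the firewall rule is in place, the fully qualified server name from the Overview page is what every client tool will ask for. As a reference, here is a sketch of an ADO.NET-style connection string assembled from hypothetical placeholder values (your server, database, and user come from the Azure portal):

```shell
# Hypothetical Azure SQL details -- substitute your own from the portal.
SERVER="myserver.database.windows.net"   # fully qualified server name
DB="mySampleDatabase"
USER_NAME="azureuser"

# Port 1433 is the port the firewall rule opens; Azure SQL requires
# encrypted connections, hence Encrypt=True.
CONN="Server=tcp:$SERVER,1433;Database=$DB;User ID=$USER_NAME;Encrypt=True;"
echo "$CONN"
```

The same four values are what you will type into the SSMS and dbForge Studio connection dialogs in the sections that follow.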
Now, let us review how to connect to Azure SQL Database from on-premises tools.

Connect to Azure using SQL Server Management Studio

SQL Server Management Studio is the default integrated development environment for SQL Server databases, and if you are an Azure SQL user, there is no reason to give up SSMS. You can use it to connect to SQL databases in Azure, query the databases, retrieve data, and perform other necessary operations. Let’s examine how to connect to Azure SQL Database from SQL Server Management Studio. The process is generally similar to connecting to other on-premises databases, with a few specifics worth noting. Open your SSMS and launch a new connection. Enter the following details:

- Server type: Database engine
- Server name: the necessary fully qualified server name
- Authentication: SQL Server Authentication
- Login: the username you set during the database creation
- Password: your password

Then click Options > Connection Properties. You need to define the database to connect to: select Browse from the menu, confirm that you want to continue by clicking Yes, select mySampleDatabase from the drop-down menu, and click OK. Click Connect. After that, SSMS connects to Azure SQL and shows the database in the Object Explorer pane. You can query that database in the same way as other databases in SSMS.

Important: Azure SQL Database does not support the USE statement. In our scenario, we work with only one SQL database in the cloud, but if you have more of them, you need to establish a new connection for each database you want to switch to in SSMS.

Connect to Azure using dbForge Studio for SQL Server

Many SQL Server professionals employ [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) as an SSMS alternative. The Studio is a multi-functional IDE that includes features and options that are not available in the default SSMS.
Also, the Studio is [fully compatible with Azure](https://www.devart.com/dbforge/sql/studio/dbforge-tools-for-sql-azure.html), so you can apply it to work with Azure SQL databases efficiently. To connect to Azure SQL Database, establish a new connection in the Studio and enter the server name and your credentials. Click Connect. When the connection is established, you will see mySampleDatabase in the Object Explorer menu and will be able to query that database directly.

Connecting with Azure Active Directory authentication

dbForge Studio for SQL Server also supports connecting to the Azure Database with Azure Active Directory authentication, which grants more secure access. Users can choose from the following authentication types:

- Azure Active Directory – Universal with MFA support
- Azure Active Directory – Password
- Azure Active Directory – Integrated

The default option is Universal with MFA support. To use it, you first need to [register dbForge Studio for SQL Server as an Azure Active Directory application](https://docs.devart.com/studio-for-sql-server/connecting-to-databases/registering-tool-as-azure-active-directory-app.html) and get the application ID that is generated during the registration process. After that, you will be able to connect to the Azure Database with Azure Active Directory Universal with MFA support authentication. Launch a new connection in the Studio: Database > New Connection > select Azure Active Directory – Universal with MFA support as the authentication type. Enter the credential details:

- Server: URL of the Azure SQL Server instance.
- User name: Azure Active Directory user with Azure SQL database permissions.

Select Use common MFA options and click Change Common Options.
In the new window, enter the Application ID and Redirect URL generated during the registration of dbForge Studio for SQL Server.

Note: If you want to use an alternative application ID, choose Override MFA options for this connection at the previous step and enter the Application ID and Redirect URL.

Click OK to save the details, then click Connect to establish the connection. You will be prompted to sign in to your account. In the Sign in window that appears, enter the credentials for your Azure Active Directory account. After a successful login, you will get access to your databases in dbForge Studio for SQL Server. The Studio also supports the Azure Active Directory – Password and Azure Active Directory – Integrated authentication types; you can select one of them and follow the [detailed illustrated instructions](https://docs.devart.com/studio-for-sql-server/connecting-to-databases/connecting-with-azure-active-directory-authentication.html) to establish the connection.

Connect to Azure using Visual Studio

The Azure SQL cloud platform is a popular resource for application developers, many of whom use Visual Studio. Visual Studio lets you perform all development tasks in one solution: it provides the necessary tools to write, modify, and debug code, and then deploy the application. You can connect to Azure SQL Server from Visual Studio and employ your favored tools to manage the databases and build applications based on those databases.

- Open the project and navigate to Connected Services > Service Dependencies.
- Choose Azure SQL Database.
- Select the database.
- Provide the username and password.
- Click Finish.

This way, Visual Studio connects to Azure SQL Database, so developers can use the databases in the cloud.
Connect to Azure using Power BI Microsoft created Power BI as a business intelligence platform that allows users to connect to various data sources, visualize data and trends, and embed those visuals in other popular apps. Azure databases are among the data sources supported by default, so let us review how to connect an Azure SQL database to Power BI and use its data for analysis. The Power BI service connects to Azure SQL Database via the desktop application. So, if you want to retrieve the data for analysis from your database in the cloud and apply Power BI’s capabilities to that data, you need to [download Power BI Desktop](https://powerbi.microsoft.com/en-us/downloads/) first. After installing this application, you can retrieve data from the databases. Open Power BI Desktop and click Get data > More . Choose Azure > Azure SQL Database . Click Connect . In the next window, enter the fully qualified server name, specify Direct Query as the data connectivity mode, and click OK . In the SQL Server database window, in the Database section, provide your Azure login credentials. Click Connect . Power BI Desktop connects to the Azure database and presents the data. Connect to Azure using PowerShell If you are a PowerShell devotee, you can connect to Azure SQL Database using PowerShell and manage and administer your databases and other Azure resources from the command line. First of all, you need to [install the Azure PowerShell module](https://learn.microsoft.com/en-us/powershell/azure/install-azure-powershell?view=azps-10.4.1) if you don’t have it installed already. With the module installed, open PowerShell and run the following command to sign in to your Azure account: Connect-AzAccount You will be prompted to log in. Once connected, you can work with Azure SQL Database from the command-line interface.
For example, you can fetch the list of all databases on the server with the following command: Get-AzSqlDatabase Further, you can manage your Azure SQL Database resources using the standard [Azure-specific commands](https://learn.microsoft.com/en-us/powershell/azure/get-started-azureps?view=azps-10.4.1#find-commands) . Learn how to [export Azure SQL database](https://blog.devart.com/export-azure-sql-database.html) with step-by-step instructions in this article. Conclusion Azure SQL Database is a preferred choice for many SQL Server users, particularly for database development. While the cloud platform offers a comprehensive set of online tools, it also accommodates users who prefer their trusted on-premises applications, such as SSMS, Visual Studio, or dbForge Studio for SQL Server. Among these options, dbForge Studio for SQL Server stands out as one of the most robust SQL Server IDEs, with full support for Azure. This means you can seamlessly develop, manage, and administer databases in the cloud, harnessing all the capabilities of the Studio. To experience the complete functionality of dbForge Studio for SQL Server in action, you can take advantage of our [fully functional free trial](https://www.devart.com/dbforge/sql/studio/download.html) , which is available for a 30-day period. This trial allows you to explore all features and options with unlimited databases, both in the cloud and on-premises. Tags [Azure Cloud](https://blog.devart.com/tag/azure-cloud) [Azure SQL](https://blog.devart.com/tag/azure-sql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms. 
"} {"url": "https://blog.devart.com/connect-to-sql-server-in-java.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Connect Java to Microsoft SQL Server By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) February 1, 2025 In today’s data-centric era, the ability to establish effective connections to databases is crucial for any Java developer.
SQL Server stands as one of the most popular database management systems, powering numerous enterprise applications worldwide. For Java developers, mastering the skill of seamlessly connecting to SQL Server isn’t just an advantage—it’s absolutely essential. Traditional relational database management systems, such as Microsoft SQL Server, offer robust encryption and security measures to safeguard the sensitive data of companies and their applications. For organizations seeking secure automated data access and transaction processing, integrating Microsoft SQL Server with Java proves to be a reliable choice. Java, a widely-used programming language known for its versatility, complements SQL Server’s security features. Thus, the integration of Java with Microsoft SQL Server databases is widely adopted for developing secure applications. In this article, we will explore how to use the JDBC driver to connect Java applications to Microsoft SQL Server databases. Contents Prerequisites Connect to Microsoft SQL Server in Java Step 1: Download and integrate Microsoft JDBC Driver Step 2: Build the connection URL Step 3: Register the driver and specify the connection details Step 4: Establish a connection to the SQL Server database Step 5: Execute basic SQL commands Authentication methods Best practices and security considerations Limitations of connecting Java to Microsoft SQL Server FAQ Conclusion Prerequisites Java Development Kit (JDK) Ensure Java is installed on your system. You can download it from the official Oracle website or use an open-source alternative like OpenJDK. SQL Server Make sure you have SQL Server installed and running. You can download the free SQL Server Express edition from the Microsoft SQL Server website for development and small-scale projects or use a licensed version for enterprise-level applications. 
Integrated Development Environment (IDE) Choose and install an Integrated Development Environment (IDE) such as Eclipse, IntelliJ IDEA, or NetBeans to facilitate the writing and management of your Java code. Connect to Microsoft SQL Server in Java Java is a flexible programming language, ideal for creating sophisticated interactive applications and games. Due to its strong security features, many businesses rely on Java SQL Server connections, often through JDBC (Java Database Connectivity), to link their databases and servers with web applications. JDBC (Java Database Connectivity) is a Java API (Application Programming Interface) that enables Java applications to interact with databases. It provides a standard interface for Java applications to perform database operations such as querying data, inserting records, updating data, and deleting records. Step 1: Download and integrate Microsoft JDBC Driver To begin connecting Java applications with Microsoft SQL Server, the first step is to get the Microsoft JDBC Driver. 1.1 Navigate to the [Microsoft JDBC Driver for SQL Server](https://learn.microsoft.com/en-us/sql/connect/jdbc/download-microsoft-jdbc-driver-for-sql-server?view=sql-server-ver16) webpage. Download and extract the archive files. Add the mssql-jdbc JAR file (for example, mssql-jdbc-8.2.1.jre11.jar) to the classpath of your project. If you’re using Maven, declare the following dependency:

```xml
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>mssql-jdbc</artifactId>
    <version>8.2.1.jre11</version>
</dependency>
```

Step 2: Build the connection URL The syntax for the database URL to connect Java to Microsoft SQL Server is as follows:

```
jdbc:sqlserver://[serverName[\instanceName][:portNumber]][;property=value[;property=value]]
```

Where: serverName : Refers to the name of the host or IP address of the device where SQL Server is currently running. instanceName : Specifies the instance name to connect to on the server. portNumber : Indicates the port number of the server. The default value is typically 1433. property=value : Defines the connection settings.
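To make the URL syntax concrete, here is a minimal sketch that assembles a connection URL from its parts. The server, instance, and database names are hypothetical placeholders, and the `buildUrl` helper is ours for illustration, not part of the JDBC API:

```java
public class ConnectionUrlExample {
    // Builds a SQL Server JDBC URL from its parts; all values passed in are hypothetical.
    static String buildUrl(String serverName, String instanceName, int portNumber, String databaseName) {
        return "jdbc:sqlserver://" + serverName
                + (instanceName != null ? "\\" + instanceName : "")  // "\\" is one backslash in the URL
                + ":" + portNumber
                + ";databaseName=" + databaseName
                + ";encrypt=true";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("demo-mssql", "SQLEXPRESS", 1433, "DemoDatabase"));
        // jdbc:sqlserver://demo-mssql\SQLEXPRESS:1433;databaseName=DemoDatabase;encrypt=true
    }
}
```

Note that in Java source code the backslash separating the server and instance names must be escaped as `\\`.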
Step 3: Register the driver and specify the connection details In Java, before establishing a connection to the Microsoft SQL Server database, you need to register the JDBC driver and specify the connection details. Here’s how you can do it:

```java
// Register the JDBC driver
Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");

// Specify the connection URL, username, and password
String url = "jdbc:sqlserver://serverName\\instanceName:portNumber;databaseName=yourDatabase";
String username = "yourUsername";
String password = "yourPassword";

// Create the connection
Connection connection = DriverManager.getConnection(url, username, password);
```

Replace serverName , instanceName , portNumber , yourDatabase , yourUsername , and yourPassword with your actual server, instance, port, database name, username, and password, respectively. Once you’ve registered the driver and specified the connection details, you can establish a connection to the SQL Server database using the DriverManager.getConnection() method. Step 4: Establish a connection to the SQL Server database To establish a Java SQL Server connection to the required database, follow these steps: Call the getConnection() method from the DriverManager class. Pass the username and password as parameters to the method. Use the java.util.Properties object to store connection properties. For Windows authentication, set integratedSecurity to true and include sqljdbc_auth.dll in the classpath.
Here’s how you can achieve this:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ConnectToServer {

    public static void main(String[] args) {

        Connection conn = null;

        try {
            // Register the JDBC driver
            DriverManager.registerDriver(new com.microsoft.sqlserver.jdbc.SQLServerDriver());

            // Note the doubled backslash: a single backslash is an illegal escape in a Java string literal
            String dbURL = "jdbc:sqlserver://demo-mssql\\SQLEXPRESS02;encrypt=true;trustServerCertificate=true;databaseName=DemoDatabase";
            String user = "sa";
            String pass = "12345";
            conn = DriverManager.getConnection(dbURL, user, pass);

            if (conn != null) {
                System.out.println("The connection has been successfully established.");

                DatabaseMetaData dm = conn.getMetaData();
                System.out.println("Driver name: " + dm.getDriverName());
                System.out.println("Driver version: " + dm.getDriverVersion());
                System.out.println("Product name: " + dm.getDatabaseProductName());
                System.out.println("Product version: " + dm.getDatabaseProductVersion());
            }

        } catch (SQLException ex) {
            System.out.println("An error occurred while establishing the connection:");
            ex.printStackTrace();
        } finally {
            try {
                if (conn != null && !conn.isClosed()) {
                    conn.close();
                }
            } catch (SQLException ex) {
                ex.printStackTrace();
            }
        }
    }
}
```

Replace SQL Server’s server and instance name, database name, username, and password values in the script with your actual credentials. Step 5: Execute basic SQL commands Once you’ve successfully established a connection to SQL Server within your Java application, you can proceed to execute SQL queries for a variety of database operations.
Below is an example illustrating how to insert data into a table:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class InsertMultipleData {
    public static void main(String[] args) {
        // Note the doubled backslash before the instance name
        String dbURL = "jdbc:sqlserver://demo-mssql\\SQLEXPRESS02;encrypt=true;trustServerCertificate=true;databaseName=DemoDatabase";
        String user = "sa";
        String pass = "123";

        int totalRowsAffected = 0;

        try {
            Connection connection = DriverManager.getConnection(dbURL, user, pass);

            // Data set to be inserted
            Object[][] data = {
                {10, "Smith", "James", "Riddle Hill", "Chicago"},
                {20, "Johnson", "Charlie", "North Church", "Chicago"},
                {30, "Williams", "Oscar", "Red Lake", "Chicago"}
            };

            String insertQuery = "INSERT INTO Persons (PersonID, LastName, FirstName, Address, City) VALUES (?, ?, ?, ?, ?)";
            PreparedStatement preparedStatement = connection.prepareStatement(insertQuery);

            for (Object[] row : data) {
                preparedStatement.setInt(1, (int) row[0]);
                preparedStatement.setString(2, (String) row[1]);
                preparedStatement.setString(3, (String) row[2]);
                preparedStatement.setString(4, (String) row[3]);
                preparedStatement.setString(5, (String) row[4]);

                int rowsAffected = preparedStatement.executeUpdate();

                if (rowsAffected > 0) {
                    totalRowsAffected += rowsAffected;
                } else {
                    System.out.println("Failed to insert data.");
                }
            }

            // The output message regarding the total number of rows added
            System.out.println("Data has been successfully added. Total number of rows added: " + totalRowsAffected);

            // Displaying the outcome
            String selectQuery = "SELECT * FROM Persons";
            PreparedStatement selectStatement = connection.prepareStatement(selectQuery);
            ResultSet resultSet = selectStatement.executeQuery();

            System.out.println("Outcome:");
            System.out.printf("%-10s %-20s %-20s %-30s %-20s%n", "PersonID", "LastName", "FirstName", "Address", "City");

            while (resultSet.next()) {
                System.out.printf("%-10s %-20s %-20s %-30s %-20s%n",
                        resultSet.getInt("PersonID"),
                        resultSet.getString("LastName"),
                        resultSet.getString("FirstName"),
                        resultSet.getString("Address"),
                        resultSet.getString("City"));
            }

            selectStatement.close();
            preparedStatement.close();
            connection.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
```

This code inserts new records into the Persons table. Replace SQL Server’s server and instance name, database name, username, and password values in the script with your actual credentials. Creating an application, yet still need to populate its database with the test data? Discover [SQL Server Data Generator](https://www.devart.com/dbforge/sql/studio/sql-server-data-generator.html), a powerful tool to empower your development workflow. Let’s check whether the data has been successfully inserted. For this task, we’ll use a GUI tool tailored for SQL Server – [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). This comprehensive integrated environment is specifically crafted to address all kinds of database development, management, and administration tasks efficiently. You can see that our data insertion efforts have been successful. The table below summarizes the connection steps:

| Step | Description | Key Consideration |
|------|-------------|-------------------|
| Step 1: Download JDBC Driver | Download the Microsoft JDBC Driver from the official website and add it to your project’s classpath. | Ensure the driver version matches your Java version. |
| Step 2: Build Connection URL | Create a connection URL using the syntax: jdbc:sqlserver://[serverName\instanceName][:portNumber][;property=value]. | Default port is 1433; ensure correct server and instance details. |
| Step 3: Register Driver | Use Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver") to register the driver. | Handle ClassNotFoundException if the driver is not found. |
| Step 4: Establish Connection | Use DriverManager.getConnection() with the URL, username, and password to connect to the database. | Secure credentials and use encrypted connections for security. |
| Step 5: Execute SQL Commands | Use PreparedStatement or Statement to execute SQL queries like SELECT, INSERT, UPDATE, or DELETE. | Always use parameterized queries to prevent SQL injection. |
| Step 6: Close Connection | Close the database connection using conn.close() to release resources. | Ensure connections are closed in a finally block to avoid leaks. |

Authentication methods Authentication plays a crucial role in establishing secure connections to SQL Server from Java applications. When configuring the authentication method, developers have two primary options: Windows authentication and SQL Server authentication. Windows Authentication (integratedSecurity=true) : With Windows authentication, the credentials of the currently logged-in Windows user are used to authenticate against the SQL Server instance. This method leverages the security mechanisms of the Windows operating system, such as Active Directory, to verify the user’s identity. It offers seamless authentication for users who are already logged into the Windows domain, eliminating the need to provide additional credentials. Windows authentication is often preferred in environments where centralized user management and authentication are enforced through Active Directory. SQL Server Authentication : SQL Server authentication requires providing a username and password explicitly when establishing the connection.
Users authenticate directly against SQL Server with a username and password stored within the SQL Server instance. This method allows for greater flexibility, as credentials can be managed independently of the Windows domain, making it suitable for scenarios where centralized user management is not feasible or desired. SQL Server authentication is commonly used in scenarios such as web applications or third-party integrations, where users may not have Windows domain accounts or where cross-platform compatibility is required. Best practices and security considerations Use parameterized queries or prepared statements : Instead of directly concatenating user inputs into SQL queries, use parameterized queries or prepared statements to prevent SQL injection attacks. This approach helps sanitize user input and avoids potential vulnerabilities. Implement connection pooling : Use connection pooling to manage database connections efficiently. This helps improve performance and scalability by reusing existing connections instead of creating new ones for each request. Encrypt database connections : Use SSL/TLS encryption to secure database connections and protect data transmitted between the Java application and SQL Server. This helps prevent eavesdropping and tampering with sensitive information. Apply the principle of least privilege : Follow the principle of least privilege when configuring database user accounts. Grant only the necessary permissions required for the Java application to perform its intended operations. Avoid using privileged accounts for regular application tasks. Store database credentials securely : Avoid hardcoding database credentials in the source code. Instead, use secure methods such as environment variables, encrypted configuration files, or credential vaults to store and retrieve database credentials. 
Enable firewall and IP whitelisting : Configure firewall rules and IP whitelisting to restrict access to the SQL Server from specific IP addresses or network ranges. This helps prevent unauthorized access and protects against external threats. Update dependencies regularly : Keep your Java application and database drivers up to date with the latest security patches and updates. This helps address known vulnerabilities and ensures that your application remains secure against emerging threats. Monitor and audit database activity : Implement logging and monitoring mechanisms to track database activity and detect suspicious behavior or unauthorized access attempts. Regularly review logs and audit trails to identify potential security incidents and take appropriate action. Limit exposed database interfaces : Minimize the exposure of database interfaces such as JDBC endpoints to reduce the attack surface. Consider implementing network segmentation or using a VPN to restrict access to internal database resources. Perform input validation and sanitization : Validate and sanitize user input before processing it in SQL queries to prevent malicious input from causing unintended behavior or compromising data integrity. By adhering to these best practices and security considerations, you can help mitigate risks and ensure the secure integration of your Java application with SQL Server. Limitations of connecting Java to Microsoft SQL Server One potential limitation when connecting Java to Microsoft SQL Server is the possibility of encountering version incompatibility issues. For example, you might be using Java 8 to run a program with a Microsoft JDBC driver designed for Java 11 or higher. In that case, attempting to load the driver will result in an error. To resolve this issue, either download the Java 8 version of the driver or upgrade your Java installation to version 11 or later.
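Since such a mismatch usually surfaces as a confusing class-loading error at startup, one way to fail fast with a clearer message is to check the runtime's major version before loading the driver. This is an illustrative sketch of ours, not part of the Microsoft driver, and the version threshold shown is an assumption for a jre11 driver build:

```java
public class JavaVersionCheck {
    // Parses the major version from a java.version string such as "1.8.0_281" or "11.0.2".
    static int majorVersion(String version) {
        if (version.startsWith("1.")) {
            // Pre-Java 9 scheme: "1.8.0_281" -> 8
            return Integer.parseInt(version.substring(2, 3));
        }
        // Java 9+ scheme: "11.0.2" -> 11, "17" -> 17
        int dot = version.indexOf('.');
        return Integer.parseInt(dot == -1 ? version : version.substring(0, dot));
    }

    public static void main(String[] args) {
        int major = majorVersion(System.getProperty("java.version"));
        // Hypothetical requirement: a jre11 build of the driver needs Java 11 or later.
        if (major < 11) {
            System.err.println("This driver build requires Java 11 or later; found Java " + major);
        }
    }
}
```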
Considering other data export and import methods applicable to SQL Server? Check the [data import and export tools](https://www.devart.com/dbforge/sql/studio/data-export-import.html) available in dbForge Studio for SQL Server. FAQ Can I use JDBC to connect to SQL Server? Yes, you can connect Java applications to SQL Server using JDBC. The process requires a compatible JDBC driver, such as the Microsoft JDBC Driver for SQL Server, along with the correct connection string. Can you use SQL and Java together? Yes, Java and SQL work together using JDBC, allowing Java applications to connect to SQL Server and send it SQL queries. This enables database operations such as inserting, updating, deleting, and retrieving data. How do you retrieve data from SQL Server in Java? First, set up a working connection through JDBC. Then create a Statement or PreparedStatement, execute a SELECT query, and process the returned ResultSet. The Java application can then use the retrieved data. What is required to connect to SQL Server in Java? You need the following: A Java Development Kit (JDK) installed. A JDBC driver (Microsoft JDBC Driver). SQL Server must be running and allow TCP/IP connections. A valid connection string with authentication credentials. Which JDBC driver should be used to connect Java applications to SQL Server? The Microsoft JDBC Driver for SQL Server is the official option. It’s a Type 4 driver that provides database connectivity through standard JDBC APIs. How can I connect to SQL Server using Windows Authentication in Java? To use Windows Authentication, configure the Microsoft JDBC Driver with integrated security by adding integratedSecurity=true to the connection string. Ensure the JDBC driver’s .dll files are correctly added to the system path.
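The difference between the two authentication methods can be sketched at the connection-string level. This is an illustrative example with hypothetical server and database names and helper methods of ours: the Windows-authentication URL carries integratedSecurity=true and no explicit credentials, while SQL Server authentication supplies a username and password separately, e.g. to DriverManager.getConnection(url, user, password):

```java
public class AuthUrlExamples {
    // Windows authentication: identity comes from the logged-in Windows user,
    // so the URL sets integratedSecurity=true and carries no username or password.
    static String windowsAuthUrl(String server, String database) {
        return "jdbc:sqlserver://" + server + ";databaseName=" + database
                + ";integratedSecurity=true;encrypt=true";
    }

    // SQL Server authentication: credentials are supplied separately when connecting,
    // so the URL itself only names the server and database.
    static String sqlAuthUrl(String server, String database) {
        return "jdbc:sqlserver://" + server + ";databaseName=" + database + ";encrypt=true";
    }

    public static void main(String[] args) {
        System.out.println(windowsAuthUrl("demo-mssql", "DemoDatabase"));
        System.out.println(sqlAuthUrl("demo-mssql", "DemoDatabase"));
    }
}
```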
Conclusion In this article, we’ve explored how to use the JDBC driver to connect Java applications to Microsoft SQL Server databases. By following the outlined steps and best practices, developers can ensure secure integration of their Java applications with SQL Server, mitigating risks and safeguarding sensitive data effectively. With Devart’s dbForge product line, which includes powerful database tools for all major database management systems, such as SQL Server, MySQL, MariaDB, Oracle, and PostgreSQL, developers gain access to robust tools designed to streamline database development, management, and administration tasks. We invite you to [download dbForge Studio for SQL Server for a 30-day free trial](https://www.devart.com/dbforge/sql/studio/download.html) to test its extensive functionality. Discover how it can improve your workflow and broaden your database management experience. Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company’s blog and website.
"} {"url": "https://blog.devart.com/connecting-to-mysql-with-putty-and-ssh-tunnels.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Connect to MySQL Remotely With SSH PuTTY Tunnel and SSL: A Step-by-Step Guide By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) April 3, 2023 MySQL is a popular relational database management system to organize and store data.
Depending on your specific use cases and preferences, you can connect to a MySQL server through a command-line interface, using [GUI tools](https://blog.devart.com/top-10-mysql-gui-tools-for-database-management-on-windows.html) such as [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), programming languages, or web-based interfaces such as phpMyAdmin. In addition, you can connect to a MySQL server remotely using SSH or VPN tunnels to ensure a secure connection. In this article, we’ll give a brief overview of the SSH protocol and SSH tunneling, as well as explore how to connect to a MySQL server remotely using an SSH tunnel and SSL created with the help of dbForge Studio for MySQL. Contents Introduction to SSH tunnel What is PuTTY? PuTTY: Pros and Cons Advantages of MySQL secure connection How to access a MySQL server remotely by creating an SSH tunnel with PuTTY How to connect to the MySQL server through PuTTY using dbForge Studio for MySQL Benefits of using a GUI tool to connect to MySQL Introduction to SSH tunnel SSH, also known as Secure Shell, is a network communication protocol used to connect to a remote host over the internet in an encrypted manner. The SSH protocol operates on the client-server model, which means that the connection to the SSH server is created directly by the SSH client without being intercepted by third parties. The protocol uses authentication to enable communication between two remote systems – host and client – through a secure encrypted channel. The host is the remote server to access, while the client is the computer used to access the host. SSH tunnelling, also known as SSH port forwarding, is a technique that uses SSH to create an [encrypted connection](https://www.devart.com/dbforge/mysql/studio/mysql-encryption.html) between a client and a server.
Users specify a source port on their local machine and a target port on the remote machine and configure the SSH client to forward traffic from the source port to the target port via the encrypted tunnel. Some SSH clients include OpenSSH, a free and open-source implementation of the SSH protocol available on most Linux-based systems, and PuTTY, a popular SSH client for Windows. Let’s dive into a better understanding of PuTTY, its advantages and disadvantages, and how to connect to a MySQL server remotely using PuTTY and dbForge Studio for MySQL. What is PuTTY? PuTTY is a free and open-source client application that supports multiple network protocols, including SSH, Telnet, rlogin, and SCP. It is mostly used on Windows computers to connect users to other network devices such as switches, routers, or remote servers. In other words, PuTTY can be used to send commands to the server. That is, you connect to the server using the configured PuTTY, enter a command, and the server executes it. Due to its simplicity, reliability, and ease of use, it is a popular choice among system administrators whose work often includes the remote management of servers and network devices. Prior to connecting to the remote server, an SSH tunnel must be set up. PuTTY: Pros and Cons Before we proceed, let’s consider the advantages and disadvantages of PuTTY. Advantages Secure and reliable connection to the remote server. Free and open-source application that enables developers to easily modify and customize the software. Flexible configuration of the remote host. Lightweight and fast tool that does not require many system resources. User-friendly GUI. Cross-platform application that allows developers to run it on various operating systems. Support for multiple protocols such as SSH, Telnet, rlogin, and SCP. Quick and easy transfer of large files between two systems. Log file tracking. Disadvantages Only the username is saved during the session.
The copy/paste operation cannot be used. The application supports only basic functionality. Non-text files cannot be transferred. Advantages of MySQL secure connection When you work with a MySQL database located on a remote server and want to transmit confidential data, such as personal or financial data, it is critical to establish an encrypted, secure TCP/IP connection between the MySQL server and the MySQL client. To achieve this, an SSH tunnel can be created with the help of PuTTY, which encrypts all traffic between the PuTTY client and the server. Also, you can access your database host through an SSH tunnel if you want to connect to the database in an encrypted way from any third-party database tool, such as dbForge Studio for MySQL or MySQL Workbench. In all these cases, you can run dbForge Studio for MySQL from your local computer and access the remote database instance through an encrypted connection. Additionally, you can use SSH tunnels to set up offsite replication for your database. In short, if you want to establish a secure and encrypted connection to a MySQL server that is not directly accessible, use an SSH connection over TCP/IP, created with PuTTY. How to access a MySQL server remotely by creating an SSH tunnel with PuTTY Let’s put aside the boring but necessary theory and get down to practice! In this block, we’ll provide a step-by-step procedure on how to create an SSH tunnel on Windows to access a remote MySQL server using PuTTY. Before we start, make sure the following prerequisites are met: PuTTY is installed on your computer. If you don’t have it installed, download it from the [official website](https://www.putty.org/). Then, run the PuTTY installer and follow the instructions. Once done, click Finish to close the installer. Ensure that SSH is properly installed on your server based on your operating system.
For detailed instructions, refer to the [How to Install SSH](https://www.devart.com/dbforge/edge/install-ssh-server.html) guide and follow the instructions specific to your OS. Now we can set up an SSH tunnel.

1. Go to the directory that stores the PuTTY executable – putty.exe – and double-click it. The PuTTY Configuration window opens.
2. In the Session category, do the following:
   - Select SSH as the connection type.
   - In the Host Name (or IP address) field, enter the hostname or IP address of the remote server on which SSH is configured.
   - In the Port field, set the [port of the remote server](https://www.devart.com/dbforge/mysql/studio/mysql-port.html).
   Note: If you want to establish an SSH connection that uses public-key authentication, you need to perform some additional steps and specify the private key in the PuTTY configuration:
   2.1. Navigate to the Connection > SSH > Auth > Credentials subcategory and click Browse to search for your PuTTY private key.
   2.2. In the Select private key file window that opens, select the private key and click Open.
3. Go to the Connection > SSH > Tunnels category and, under Add new forwarded port, specify the forwarding details as follows:
   - In the Source port field, specify any free port on your local computer, for example, 4567.
   - In the Destination field, enter the host on which your remote MySQL server is running and its port, separated by a colon, for example, localhost:3306.
   Note: We use localhost because the MySQL and SSH servers are hosted on the same machine. If the MySQL server is installed on a different server than SSH, specify its IP address instead.
   - Click Add to add the forwarded port.
4. Go back to the Session category, enter a name for your session in the Saved Sessions field, and click Save to keep the changes. To connect to the remote MySQL server, click Open.
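For reference, the same tunnel can also be opened from the command line with plink, PuTTY's command-line companion. The sketch below only assembles the command rather than running it; the `user@remote-server` login is a placeholder, not a value from this article.

```python
# Assemble a plink command equivalent to the PuTTY GUI settings above.
# "user@remote-server" is a placeholder for your real SSH login.
source_port = 4567               # free local port ("Source port" in PuTTY)
destination = "localhost:3306"   # MySQL host:port as seen from the SSH server
ssh_port = 22                    # SSH port of the remote server

cmd = [
    "plink",
    "-ssh",                                # use the SSH protocol
    "-L", f"{source_port}:{destination}",  # local port forwarding
    "-P", str(ssh_port),
    "user@remote-server",
]
print(" ".join(cmd))
# → plink -ssh -L 4567:localhost:3306 -P 22 user@remote-server
```

Running this command with real credentials opens the same tunnel as clicking Open in the PuTTY GUI.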
Note: If you haven't logged in to this system with PuTTY before, you will receive an alert saying that the host key is not cached for the given server. Verify that this is the server you want to connect to and click Yes. In the PuTTY command-line interface, you'll be prompted to enter the SSH username and password. Once done, the SSH tunnel is established.

How to connect to the MySQL server through PuTTY using dbForge Studio for MySQL

Now that we have created the SSH tunnel, let's see how to connect to the MySQL server over SSL with the help of dbForge Studio for MySQL. It is a powerful integrated development environment (IDE) with feature-rich tools that streamline MySQL database development, management, and administration. With dbForge Studio for MySQL, you can [easily write SQL code](https://www.devart.com/dbforge/mysql/studio/mysql-code-editor.html), [debug stored procedures](https://www.devart.com/dbforge/mysql/studio/debugging.html), [generate ER diagrams](https://www.devart.com/dbforge/sql/studio/database-diagram.html), and more.

To get started, launch the Studio to create a connection to the MySQL database. In the Database Explorer, click New Connection. In the Database Connection Properties window that opens, on the General tab, enter the information for the MySQL server you want to connect to:

- Host: IP address or hostname of the machine where the MySQL server you want to connect to is installed
- Login: Database username
- Password: Database user password
- Database: Database name
- Port: The number you specified in the Source port field in PuTTY

Then, go to the Security tab and do the following:

- Select SSL as the security protocol.
- Specify the client key, client certificate, and authority certificate in the corresponding fields.
- Click Connect to establish the connection to the server.
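Once the tunnel is up, any MySQL client, not only dbForge Studio, connects to the local end of the tunnel. A minimal sketch of the parameter mapping, with placeholder credentials; the PyMySQL call is shown commented out because it needs a live server:

```python
# Parameter mapping for a MySQL client going through the PuTTY tunnel.
# All credential values below are placeholders.
conn_params = {
    "host": "127.0.0.1",  # the tunnel's local end, not the remote host
    "port": 4567,         # must match the Source port configured in PuTTY
    "user": "db_user",
    "password": "secret",
    "database": "mydb",
}

# With the tunnel established, a client such as PyMySQL would connect via:
# import pymysql
# connection = pymysql.connect(**conn_params)
print(f"{conn_params['host']}:{conn_params['port']}")
```

The key point is that the client never sees the remote host directly; it talks to 127.0.0.1 on the forwarded port, and PuTTY relays the traffic through SSH.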
Benefits of using a GUI tool to connect to MySQL

A GUI tool provides many benefits over a command-line interface, for example:

- A user-friendly interface that makes it easier for newcomers to work with databases.
- Better productivity through automation of repetitive tasks and shortcuts or predefined templates for common operations.
- Visual editors that let users represent database structure and data, build relationships between tables, and create queries.
- Built-in features for managing databases, such as backup and restore, query optimization, performance monitoring, a query builder, etc.
- Cross-platform support that enables tools to run on multiple operating systems, including Windows, [Linux](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-linux/), and [macOS](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-macos/).

dbForge Studio for MySQL is a full-featured, cross-platform GUI tool packed with extensive visual database development and management features that ensure a great user experience and developer productivity, [optimized query performance](https://www.devart.com/dbforge/mysql/studio/mysql-performance-tips.html), easy testing and reporting analysis, and help maintain database integrity and data consistency. Its highlights include:

- Supported platforms: Windows, macOS, and Linux.
- Intelligent SQL coding features, such as code completion, formatting, MySQL prompts, [syntax check](https://www.devart.com/dbforge/mysql/studio/sql-coding.html), and code navigation, that speed up coding.
- Query Profiler that helps you identify errors and fine-tune query and database performance.
- Query Builder to quickly generate queries of any complexity and build relationships between tables.
- [Import and export tools](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) to export and import data in multiple data formats.
- Data Generator to populate databases with random test data in no time.
- Comparison and synchronization tools that make it easier to analyze differences between data and schemas.
- Backup and restore functionality to migrate databases between instances effectively.

And this is not the full list of advanced tools that dbForge Studio for MySQL offers, which is why the Studio holds a leading place among database development and management tools. [Download](https://www.devart.com/dbforge/mysql/studio/download.html) a free 30-day trial version to see the tool's features and functionality in action!

Conclusion

To sum up, connecting to a remote MySQL server through an SSH tunnel is a secure and reliable way to access your database and keep the data you transmit safe. In this article, we have explored how to create an SSH tunnel and [connect to your MySQL server](https://blog.devart.com/how-to-connect-to-mysql-server.html) using any MySQL client, and we have shown how to connect to a remote MySQL server using dbForge Studio for MySQL. Do not hesitate to start using the Studio and evaluate its advanced capabilities.

Tags [connect to remote mysql server](https://blog.devart.com/tag/connect-to-remote-mysql-server) [encrypted](https://blog.devart.com/tag/encrypted) [PuTTY](https://blog.devart.com/tag/putty) [ssh tunnel](https://blog.devart.com/tag/ssh-tunnel)

[Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys explaining complex tech in useful, accessible ways.
[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) Connecting to Shopify With Devart ODBC: A Guide to Data Analysis Using Power BI By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam)
August 28, 2024

Choosing the Devart ODBC Driver for Shopify enhances your ability to conduct in-depth data analysis and simplifies the integration process, offering a dependable and efficient connection to your Shopify data. In this article, we'll demonstrate how to use the Devart ODBC Driver to connect to your Shopify database and then use Power BI for data analysis. While multiple connectivity options are available, such as REST APIs, direct database connections, and other ODBC drivers, choosing the right method is essential for efficient data management and analysis. The Devart ODBC Driver for Shopify stands out due to its superior performance, ease of deployment, and extensive compatibility. Unlike some alternatives that may require complex configuration or additional software, Devart's ODBC driver streamlines the installation process and provides robust features tailored for integration with various tools and platforms. Before diving into the technical steps, let's take a moment to understand the products in use and the rationale behind choosing this setup.

Table of Contents

- Technical Background
  - Devart ODBC Driver for Shopify
  - Shopify
- Architecture of Connecting Shopify to Power BI
- Setting Up the ODBC Connection Between Shopify and Power BI
  - Generate Access Token
  - Install the Devart ODBC Driver for Shopify
  - Configure ODBC Driver
  - Create ODBC Connection in Power BI Desktop
- Conclusion

Technical Background

Before proceeding, let's make sure you have a clear understanding of the products in use.

Devart ODBC Driver for Shopify

The Devart ODBC Driver for Shopify is a connectivity solution designed to deliver high-performance access to Shopify data through ODBC-compliant reporting, analytics, BI, and ETL tools on both 32-bit and 64-bit Windows systems.
This driver supports the full range of standard ODBC API functions and data types, making it easy to access real-time Shopify data from any location.

Shopify

Shopify is a leading e-commerce solution that helps you start and grow your business with customizable online stores, marketing tools, sales channels, and more. Before connecting to the Shopify database with the Devart ODBC Driver for Shopify and building analytics, make sure to download the necessary tools and drivers:

- Power BI Desktop
- Devart ODBC Driver for Shopify
- A Shopify account and a Shopify database sample

Ready to elevate your data analysis capabilities? Download the [Devart ODBC Driver for Shopify](https://www.devart.com/odbc/shopify/) today and experience seamless integration with your favorite BI tools. With the background established, let's explore the overall architecture that connects Shopify with Power BI through the Devart ODBC Driver. This will give you a clear picture of how data flows and is managed in this integration setup.

Architecture of Connecting Shopify to Power BI

This setup connects your Shopify data to Power BI through the ODBC driver, ensuring a stable and efficient data flow for your analytics. The architecture primarily comprises three components: the Shopify platform, the Devart ODBC Driver, and Power BI. Here's how these components interact:

- Shopify serves as the source of your data, hosting all the essential information about your products, customers, orders, and more. All this data is stored in Shopify's cloud-based database and is accessible via APIs.
- The Devart ODBC Driver acts as the intermediary between Shopify and Power BI. It uses the ODBC (Open Database Connectivity) standard to connect Shopify's API with any ODBC-compliant application, translating queries from Power BI into Shopify API calls and enabling real-time data access and updates.
- Power BI is the tool used for data analysis and visualization.
Once connected to Shopify via the Devart ODBC Driver, Power BI can query, process, and display Shopify data in various visual formats, enabling dynamic reporting and analytics that help you make data-driven decisions. With this architecture in mind, you are now ready to move on to the practical steps of setting up the connection between Shopify and Power BI.

Setting Up the ODBC Connection Between Shopify and Power BI

Whether you are a data analyst, business owner, or IT professional, follow the steps described below for a smooth setup of a connection between Shopify and Power BI using the Devart ODBC Driver.

Generate Access Token

This step is required to generate a Shopify access token, which is then used when configuring the ODBC driver for Shopify. To do it, follow these instructions:

1. Access the Shopify admin page.
2. Click the Search textbox, then select the App option.
3. Select Apps and sales channel settings.
4. Navigate to the Settings page and select the Apps & Sales Channels tab. Then, click the Develop apps button to proceed.
5. On the next screen, click Allow custom app development to create a custom app yourself and generate the API access token.
6. Next, click the Create an app button.
7. Before you press Create app, come up with a unique name for it.
8. After creating your app, go to the Configure Admin API Scopes panel.
9. Select All or the specific API scopes you want to use and click Save.
10. Go to the top of the Settings page, where the Install app button is now visible. Click it to install your app.
11. Go to the API Credentials tab and check that the Admin API access token has been generated.
12. Click the Reveal token once option. You'll get the Admin API access token string; copy it to use in the next step.
13. Configure the URL address of your store: on the Settings page, get the URL under My Store.
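The token generated above is what the ODBC driver presents to Shopify on your behalf: Admin REST API requests carry it in the X-Shopify-Access-Token header. A hedged sketch, where the store name, token, and API version are placeholders and the request is built but not sent:

```python
# Build (but do not send) a Shopify Admin REST API request that uses the
# access token from the steps above. All values are placeholders.
store = "my-store"                  # from https://my-store.myshopify.com
access_token = "shpat_XXXXXXXX"     # the Admin API access token (placeholder)
api_version = "2024-07"             # pick the version your app targets

url = f"https://{store}.myshopify.com/admin/api/{api_version}/products.json"
headers = {"X-Shopify-Access-Token": access_token}

# Sending it would look like:
# import requests
# response = requests.get(url, headers=headers)
print(url)
```

If the token is missing or revoked, such a request is rejected with an authentication error, which is also the first thing to check when Test Connection fails later in the setup.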
With your access token and store URL ready, you're all set to configure the Devart ODBC Driver and establish a reliable connection.

Install the Devart ODBC Driver for Shopify

Before proceeding with the connection setup, you need to install the Devart ODBC Driver for Shopify. This section walks you through the installation process, ensuring that your system is properly configured to use the driver for accessing Shopify data from Power BI.

1. [Download](https://www.devart.com/odbc/shopify/) and run the installer file.
2. Follow the instructions provided in the installer window.
3. Choose the folder where you want to keep the driver for easy access.
4. Go to Select Components and choose Full Installation.
5. Click Next to start the installation.

After the installation is completed, let's configure the driver.

Configure ODBC Driver

Now that we have installed the ODBC driver, we are ready to configure the ODBC connection in your environment. Note that we are using Windows to demonstrate these steps. To start configuring the ODBC driver, follow these instructions:

1. Click Start, and in the Run box type C:\Windows\System32\odbcad32.exe (on a 64-bit system) to open the ODBC Data Source Administrator.
2. Click the Drivers tab and make sure Devart ODBC Driver for Shopify is in the list of drivers.
3. Select the User DSN or System DSN tab and click Add. You will see the Create New Data Source dialog.
4. Select the Devart ODBC Driver for Shopify driver and click Finish. The driver setup dialog opens.
5. Enter the connection information in the matching fields:
   - Data Source Name: ODBC_MyShopify
   - Store: the URL of your Shopify store (see the previous steps if you haven't retrieved it yet)
   - Access Token: the Admin API access token you generated in the previous steps
6. Click the Test Connection button and make sure the connection succeeds.
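The DSN created above is not tied to Power BI; any ODBC-capable client can use it. For example, connecting from Python with pyodbc would look like the sketch below; the connect call and query are commented out because they require the driver and DSN to be present, and the table name is illustrative.

```python
# Use the DSN configured in the ODBC Data Source Administrator above.
dsn_name = "ODBC_MyShopify"
connection_string = f"DSN={dsn_name}"

# With the driver and DSN in place:
# import pyodbc
# conn = pyodbc.connect(connection_string)
# rows = conn.cursor().execute("SELECT * FROM products").fetchall()
print(connection_string)
# → DSN=ODBC_MyShopify
```

This is the same mechanism Power BI uses in the next step: it simply looks the DSN up by name instead of taking a connection string from you.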
Create ODBC Connection in Power BI Desktop

In this step, you will use the Devart ODBC Driver for Shopify to establish the bridge between Power BI Desktop and the Shopify database. Make sure that Power BI Desktop is installed correctly to avoid any issues. Follow the guide below to proceed:

1. Open Power BI Desktop and click Get Data.
2. Choose Other in the dialog. On the left panel, you'll see the other connectors Power BI supports. Choose ODBC and click Connect.
3. Choose the DSN ODBC_MyShopify, created in the ODBC driver configuration step, from the Data Source dropdown list. Click OK.
4. Enter the username and password if required; in this case, the username is your Shopify account credentials. Click Connect.
5. The Navigator window appears, and you'll see the list of tables in the Shopify database. Select the Comments and Users collections to add them to Power BI Desktop.

Now we are ready to build the report with data from the Shopify database.

Conclusion

Using the Devart ODBC Driver for Shopify enables your applications to connect to your Shopify database, opening up a variety of possibilities. You can extract and analyze data from your stores, products, sales, and customers, utilizing powerful analytics tools like Power BI, TIBCO Spotfire, QlikView, and Tableau. Additionally, this connection supports the development of applications that interact directly with the Shopify API, enhancing the capabilities of your e-commerce platform. Managing your Shopify data becomes more straightforward with tools such as Microsoft SQL Server Management Studio, EMS SQL Management Studio, DBeaver, and RazorSQL. Once the connection is in place, accessing and utilizing any Shopify data for advanced analytics is simple and effective. Feel free to explore the [Devart ODBC Driver](https://www.devart.com/odbc/) further along with other integration options.
Stay tuned to learn more about additional integration features that can enhance your data management and analytics experience.

Tags [odbc](https://blog.devart.com/tag/odbc) [odbc driver](https://blog.devart.com/tag/odbc-driver) [odbc drivers](https://blog.devart.com/tag/odbc-drivers)

[Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations.

[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) Connecting to SQL Server from Android and iOS in Direct Mode Using SDAC By [DAC Team](https://blog.devart.com/author/dac) February 12, 2015

Using [SDAC in Direct Mode](https://www.devart.com/sdac/) to access SQL Server from the iOS and Android platforms does not differ significantly from doing so on the Mac OS X or Windows platforms. Let's see how this works in a sample below for RAD Studio XE7. Note: those who have already read our previous article [Connecting to SQL Server on Mac OS X in Direct Mode Using SDAC](https://blog.devart.com/connecting-to-sql-server-on-mac-os-x-in-direct-mode-using-sdac.html) can skip the description of creating a new project and move on to the project configuration for the Android and iOS platforms.

Design-time

Let's create a new Android / iOS application that will work with SQL Server. In the File | New menu, select Multi-Device Application – Delphi. In the dialog that appears, select Blank Application. Place the TMSConnection component onto the form; it will be named MSConnection1. Set up the MSConnection1 component by setting the Options.Provider property to prDirect. Open the MSConnection1 editor and fill in the required properties: Server, Port, Authentication (the SQL Server value), Username, Password, and Database.
The MSConnection1 settings for Direct Mode are now complete. Now we just have to write event handlers for the Connect and Disconnect buttons:

```delphi
procedure TMainForm.ConnectClick(Sender: TObject);
begin
  MSConnection1.Connect;
  MSQuery1.Open;
end;

procedure TMainForm.DisconnectClick(Sender: TObject);
begin
  MSQuery1.Close;
  MSConnection1.Disconnect;
end;
```

Setting up Direct Mode for Android

When developing Android applications that work with SQL Server in Direct Mode, SDAC doesn't require any additional settings for deployment and execution. For this sample, I have just selected Target Platform – Android and a device.

Application execution in Direct Mode on Android

In RAD Studio, press F9, and in a few seconds the application is running on Android, working with SQL Server in Direct Mode with the help of SDAC.

Setting up Direct Mode for iOS

When developing iOS applications that work with SQL Server in Direct Mode, SDAC doesn't require any additional settings for deployment and execution. For this sample, I have just selected Target Platform – iOS and a device.

Application execution in Direct Mode on iOS (Simulator)

The application running on the iOS Simulator looks like the following.

Application execution in Direct Mode on an iOS device

And finally, a screenshot of the application running on an iOS device.

In this way, we have demonstrated how easy it is to use SDAC's Direct Mode for SQL Server when developing applications for Android and iOS. The source code of this demo can be found in the SDAC Demo folder on your PC.
Tags [android development](https://blog.devart.com/tag/android-development-2) [delphi](https://blog.devart.com/tag/delphi) [direct mode](https://blog.devart.com/tag/direct-mode) [ios development](https://blog.devart.com/tag/ios-development) [rad studio](https://blog.devart.com/tag/rad-studio) [sdac](https://blog.devart.com/tag/sdac) [SQL Server](https://blog.devart.com/tag/sql-server) [DAC Team](https://blog.devart.com/author/dac)

7 COMMENTS

rahiche raouf (February 17, 2015 at 10:16 pm): thank you

FAhd (March 1, 2017 at 3:06 am): plez can u show me how 2 deploy this code step by step pleeeeez

DAC Team (March 1, 2017 at 11:21 am): Hello, FAhd! To get familiar with Direct Mode work in SDAC, open the demo project included in the samples supplied with SDAC – [SDAC install folder]\Demos\Mobile. Then, using Project Manager, select the required platform, go to the menu item "Project" and select "Deploy".

jens (May 27, 2018 at 8:35 pm): Hi, I'm trying to create a first mobile app by using SDAC MSConnection and MSQuery. Unfortunately I can not find any DBGrid component to present the selected data. Is there no DBGRID component available for mobile apps?

DAC Team (May 29, 2018 at 10:20 am): Hello, Jens! Unfortunately, TCRDBGrid is a VCL component, therefore, you won't be able to use it in FMX applications. To display tabular data, you can use the TGrid component from the standard set of FMX components.

Raul (September 7, 2018 at 12:10 am): Hello. We are contemplating buying SDAC. In fact, we have already downloaded the trial version for Delphi XE6; however, the component works fine with Android 4 but it doesn't work on other versions such as Android 7. Is there any solution for that? Thanks.

DAC Team (September 10, 2018 at 10:05 am): Hello, Raul!
Unfortunately, RAD Studio XE6 does not support Android 7.0 (API Level 24). You can read more about the compatibility of RAD Studio XE6 with Android at the following link: docwiki.embarcadero.com/RADStudio/XE6/en/Android_Devices_Supported_for_Application_Development

[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) Connecting to SQL Server from Mac OS X in Direct Mode Using SDAC By [DAC Team](https://blog.devart.com/author/dac) December 26, 2014

Devart released [SDAC](https://www.devart.com/sdac/) and [UniDAC](https://www.devart.com/unidac/) for RAD Studio with new functionality – Direct Mode for connecting to SQL Server. The existing Microsoft providers for SQL Server use the OLE DB interface, which restricts client applications to Windows platforms. In the absence of a native Microsoft solution for Mac OS X, SDAC's Direct Mode was the only way to support SQL Server on Mac OS X. Devart provides a ready-made solution for working with SQL Server from Mac OS X – Direct Mode. Let's consider a simple sample of using Direct Mode.

Design time

Here we create a new Mac OS X application in Delphi XE7 that works with SQL Server. In the File | New menu, select Multi-Device Application – Delphi. In the dialog that appears, select Blank Application.

Connection settings in Direct Mode

Place the TMSConnection component onto the form; it will be named MSConnection1. Set up the MSConnection1 component by setting the Options.Provider property to prDirect.
Open the MSConnection1 editor and fill in the required properties: Server, Port, Authentication (the SQL Server value), Username, Password, and Database. The MSConnection1 configuration for Direct Mode is finished. If you want to set up a connection for Direct Mode at run time, the code will look like the following:

```delphi
MSConnection1.Options.Provider := prDirect;
MSConnection1.Server := 'Server';
MSConnection1.Port := 1433;
MSConnection1.Authentication := auServer;
MSConnection1.Database := 'FISH_FACTS_DEMO';
MSConnection1.Username := 'sa';
MSConnection1.Password := '*************';
```

Opening the connection and dataset in Direct Mode

The complete project in the IDE looks as shown below. Now we just have to write event handlers for the Connect and Disconnect buttons:

```delphi
procedure TMainForm.ConnectClick(Sender: TObject);
begin
  MSConnection1.Connect;
  MSQuery1.Open;
end;

procedure TMainForm.DisconnectClick(Sender: TObject);
begin
  MSQuery1.Close;
  MSConnection1.Disconnect;
end;
```

Application execution in Direct Mode on Mac OS X

For application deployment and execution, Direct Mode requires no additional libraries. This is our example running on Mac OS X. The following post shows [how to use SDAC to develop applications for Android and iOS](https://blog.devart.com/connecting-to-sql-server-on-android-and-ios-in-direct-mode-using-sdac.html) working with SQL Server in Direct Mode. The source code of this demo can be found in the SDAC Demo folder on your PC.
Tags [delphi](https://blog.devart.com/tag/delphi) [direct mode](https://blog.devart.com/tag/direct-mode) [macos development](https://blog.devart.com/tag/macos-development) [rad studio](https://blog.devart.com/tag/rad-studio) [sdac](https://blog.devart.com/tag/sdac) [SQL Server](https://blog.devart.com/tag/sql-server) [DAC Team](https://blog.devart.com/author/dac)
Connectivity to Oracle 20c is supported in dbForge for Oracle tools
By [dbForge Team](https://blog.devart.com/author/dbforge) July 2, 2020

We are glad to inform our Oracle users that new versions of the [dbForge for Oracle](https://www.devart.com/dbforge/oracle/) tools have just been released. The new versions of all dbForge for Oracle tools feature connectivity support for Oracle 20c. The new versions of [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) and [dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/) also feature a brand-new data synchronization option.

Connectivity
To ensure that our Oracle users can work with the latest database engines, all tools of the dbForge for Oracle product line have been enhanced with an option to connect to and work with Oracle 20c databases.

Data Compare
We have implemented a brand-new data sync option, Reseed identity columns, which is now available in dbForge Studio for Oracle and dbForge Data Compare for Oracle. When selected, the option reseeds the current values of identity columns and properly adjusts the START WITH values for the IDENTITY columns.
Availability
Click the links below to get and try the new versions of the dbForge for Oracle tools:
[dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html)
[dbForge Compare Bundle for Oracle](https://www.devart.com/dbforge/oracle/compare-bundle/download.html)
[dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/download.html)
[dbForge Schema Compare for Oracle](https://www.devart.com/dbforge/oracle/schemacompare/download.html)
[dbForge Data Generator for Oracle](https://www.devart.com/dbforge/oracle/data-generator/download.html)
[dbForge Documenter for Oracle](https://www.devart.com/dbforge/oracle/documenter/download.html)
Consume a RESTful API Using RestSharp and C#
By [dotConnect Team](https://blog.devart.com/author/dotconnect) September 29, 2022

This article covers REST concepts, the RestSharp library, and how it can be used to work with RESTful APIs in ASP.NET Core 6. We'll use a PostgreSQL database, accessed through Devart dotConnect for PostgreSQL, to store and retrieve data. RestSharp is an open-source, portable, lightweight .NET library for working with RESTful web services. You can use it to perform CRUD (create, read, update, and delete) operations on your data through any RESTful API: it makes the HTTP requests and parses the responses for you.

Prerequisites
You'll need the following tools to work through the code examples:
Visual Studio 2022 Community Edition
dotConnect for PostgreSQL
PostgreSQL, which you can download from here: [https://www.postgresql.org/download/](https://www.postgresql.org/download/)

What is REST? Why do we need it?
The REST architectural style is used to create distributed applications that can communicate with each other. REST is neither a technology nor a set of standards. It is rather a set of constraints that can be used to define new architectural styles.
It is essentially a client-server architecture with stateless connections: REST uses an HTTP-based interface to expose data and services and is built around the concept of resources.

What is RestSharp?
RestSharp offers a developer-friendly interface for interacting with RESTful services while abstracting away the technical details of the underlying HTTP requests. It can handle both synchronous and asynchronous requests.

Create a new ASP.NET Core 6 Web API project
In this section, we'll create a new ASP.NET Core 6 Web API project in Visual Studio 2022. Follow the steps outlined below:
1. Open Visual Studio 2022.
2. Click Create a new project.
3. Select ASP.NET Core Web API and click Next.
4. Specify the project name and the location to store the project on your system. Optionally, check the Place solution and project in the same directory checkbox. Click Next.
5. In the Additional information window, select .NET 6.0 (Long-term support) as the framework version.
6. Uncheck the Configure for HTTPS and Enable Docker Support options.
7. Since we won't use authentication in this example, set the Authentication type to None.
8. Since we won't use OpenAPI in this example, uncheck the Enable OpenAPI support checkbox.
9. Since we won't use minimal APIs in this example, make sure the Use controllers (uncheck to use minimal APIs) checkbox is checked.
10. Leave the Do not use top-level statements checkbox unchecked.
11. Click Create to finish the process.

We'll use this project throughout the article.

Install NuGet Packages
Before you get started, you should install the dotConnect for PostgreSQL package in your project.
You can install it either with the NuGet Package Manager tool inside Visual Studio or from the NuGet Package Manager console using the following command:

```
PM> Install-Package Devart.Data.PostgreSql
```

Getting Started
Create a PostgreSQL database table named Author with the following columns:
Id
FirstName
LastName
Address

Next, insert some dummy records into this table. We'll use this table to store and retrieve data using RestSharp.

Create the Model Class
Create a solution folder in the Solution Explorer window and name it Models. Next, create a .cs file called Author.cs with the following code:

```csharp
public class Author
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Address { get; set; }
}
```

Create the AuthorRepository Class
The IAuthorRepository interface contains the declaration of two methods:

```csharp
public interface IAuthorRepository
{
    void Save(Author author);
    List<Author> GetAuthors();
}
```

The AuthorRepository class implements the methods of the IAuthorRepository interface and encapsulates all database operations.
```csharp
public class AuthorRepository : IAuthorRepository
{
    public List<Author> GetAuthors()
    {
        List<Author> authors = new List<Author>();
        using (PgSqlConnection pgSqlConnection =
            new PgSqlConnection("User Id=postgres;Password=sa123#;" +
                "host=localhost;database=postgres;"))
        {
            using (PgSqlCommand pgSqlCommand = new PgSqlCommand())
            {
                pgSqlCommand.CommandText = "SELECT * FROM public.Author";
                pgSqlCommand.Connection = pgSqlConnection;

                if (pgSqlConnection.State != System.Data.ConnectionState.Open)
                    pgSqlConnection.Open();

                using (PgSqlDataReader pgSqlReader = pgSqlCommand.ExecuteReader())
                {
                    while (pgSqlReader.Read())
                    {
                        Author author = new Author();
                        author.Id = int.Parse(pgSqlReader.GetValue(0).ToString());
                        author.FirstName = pgSqlReader.GetValue(1).ToString();
                        author.LastName = pgSqlReader.GetValue(2).ToString();
                        author.Address = pgSqlReader.GetValue(3).ToString();
                        authors.Add(author);
                    }
                }
            }
        }
        return authors;
    }

    public void Save(Author author)
    {
        using (PgSqlConnection pgSqlConnection =
            new PgSqlConnection("User Id=postgres;Password=sa123#;" +
                "host=localhost;database=postgres;"))
        {
            using (PgSqlCommand cmd = new PgSqlCommand())
            {
                cmd.CommandText = "INSERT INTO public.Author " +
                    "(id, firstname, lastname, address) VALUES " +
                    "(@id, @firstname, @lastname, @address)";
                cmd.Connection = pgSqlConnection;
                // The Id column is an int, so pass the author's Id here
                // rather than a GUID string.
                cmd.Parameters.AddWithValue("id", author.Id);
                cmd.Parameters.AddWithValue("firstname", author.FirstName);
                cmd.Parameters.AddWithValue("lastname", author.LastName);
                cmd.Parameters.AddWithValue("address", author.Address);

                if (pgSqlConnection.State != System.Data.ConnectionState.Open)
                    pgSqlConnection.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Create the AuthorController Class
Next, right-click the Controllers solution folder and
create a new controller class called AuthorController with the following code:

```csharp
[Route("api/[controller]")]
[ApiController]
public class AuthorController : ControllerBase
{
    private readonly IAuthorRepository _authorRepository;

    public AuthorController(IAuthorRepository authorRepository)
    {
        _authorRepository = authorRepository;
    }

    [HttpGet]
    public List<Author> Get()
    {
        return _authorRepository.GetAuthors();
    }

    [HttpPost]
    public void Post([FromBody] Author author)
    {
        _authorRepository.Save(author);
    }
}
```

Note how an instance of type IAuthorRepository is injected into the constructor of the AuthorController class. Remember that you must register IAuthorRepository in the services container with the following line in the Program.cs file:

```csharp
builder.Services.AddScoped<IAuthorRepository, AuthorRepository>();
```

Create the Client Application to Consume the RESTful API
Now create a console application to consume the RESTful API using RestSharp.

Install NuGet Packages
To take advantage of RestSharp, you should install the RestSharp package in your project. You can install it either with the NuGet Package Manager tool inside Visual Studio or from the NuGet Package Manager console using the following command:

```
PM> Install-Package RestSharp
```

Once RestSharp has been successfully installed in your project, you can start using the library.
First off, create an instance of RestClient and pass the base address of the URL to the constructor, as shown in the following code:

```csharp
RestClient client = new RestClient("http://localhost:5073/api/");
```

Now, create an instance of the RestRequest class as shown in the code given below:

```csharp
RestRequest request = new RestRequest("Author", Method.Get);
```

Lastly, call the Execute method on the RestClient instance to retrieve the data:

```csharp
var response = client.Execute<List<Author>>(request);
```

Here's the complete code listing of the client app:

```csharp
using RestSharp;
using System;
using System.Collections.Generic;

namespace RESTSharpClient
{
    class Program
    {
        private static RestClient client = new
            RestClient("http://localhost:5073/api/");

        static void Main(string[] args)
        {
            RestRequest request = new RestRequest("Author", Method.Get);
            var response = client.Execute<List<Author>>(request);
            Console.ReadKey();
        }
    }
}
```

To make POST requests, you can use the following code (note that the resource path is relative to the client's base address, so "Author" resolves to http://localhost:5073/api/Author):

```csharp
Author author = new Author();
var request = new RestRequest("Author")
    .AddJsonBody(author);
var response = await client.ExecutePostAsync(request);

if (!response.IsSuccessful)
{
    // Write code here to handle errors
}
```

Summary
RestSharp can be used in any .NET application that needs to interact with web services. RestSharp is a lightweight alternative to WebClient in cases where you need more control over how the request passes through your pipeline. It has full support for the HTTP specification and allows you to easily interact with any RESTful web service endpoint.
Continuous Delivery of Database Changes to SQL Server When Working Remotely
By [dbForge Team](https://blog.devart.com/author/dbforge) September 3, 2020

In software product development that involves a database, one of the key requirements is the ability to continuously deliver changes from the development environment to the production environment. Equally important, more and more people involved in IT processes now work remotely, so it is important to adjust the workflow to these fast-changing realities. In this article, we will briefly cover the main approach to delivering database changes by means of migrations using Devart tools, including the delivery of changes when working remotely.

Delivery of database changes in the deployment pipeline
Comparing two methods of database change delivery
To start with, there are two methods of database change delivery:
State-based database development stores database states, but not the scripts for transitioning from one state to another.
Migration-based database development stores the scripts for transitioning the database from one state to another.
Now, let's compare the benefits and pitfalls of these two approaches:

| | State-based database development | Migration-based database development |
|---|---|---|
| Main benefits | Changes can be made right in the required environment, which enables fast customization of any solution and minimal time for the release of changes (new functionality, edits, and updates of the current functionality). | There is a clear order of scripts for moving from one state to another; there is a predefined rollback scenario in case migration changes are undone; over time, there is no need to perform reverse engineering; the solution stands up to different kinds of tests much better, which minimizes serious bugs and risks in the future. |
| Main pitfalls | There is a high probability of new bugs and serious risks in the future; every time, you need to collect database states and compare them to the template, after which you need to generate migration scripts; there is often no rollback scenario in case migration changes are undone; you need to perform reverse engineering over time. | As any change has to pass through a chain of actions (development, testing, implementation), customizing the solution and releasing changes (introducing new functionality and updating the current functionality) can take up much time. |
| Use | In rare cases, when the release time of changes is much more expensive than the stability of the entire system (changes are usually introduced directly in the production environment; this is common for immature IT systems and rare for developed IT solutions). | When the stability of the current solution is more important than its new or updated functionality (more common in developed IT systems). |

Database states (schemas and reference data) and the scripts of changes are usually kept and versioned in version control systems such as Git, SVN, or Microsoft Azure DevOps.
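To make the migration-based method concrete, here is a minimal, tool-agnostic sketch in Python with SQLite. The `MIGRATIONS` list and the `author` table are hypothetical illustrations, not part of any dbForge workflow: each ordered script moves the schema one version forward, and a `schema_version` table records which migrations have already been applied, so re-running the runner is safe.

```python
import sqlite3

# Hypothetical ordered migration scripts: each entry moves the
# schema exactly one version forward.
MIGRATIONS = [
    (1, "CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE author ADD COLUMN address TEXT"),
]

def current_version(conn):
    # The version table itself is created lazily on first use.
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0

def migrate(conn):
    """Apply, in order, every migration above the database's current version."""
    for version, script in MIGRATIONS:
        if version > current_version(conn):
            conn.execute(script)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()
    return current_version(conn)

conn = sqlite3.connect(":memory:")
print(migrate(conn))  # applies both migrations -> 2
print(migrate(conn))  # idempotent: nothing left to apply -> 2
```

The key property, mirrored by real migration tooling, is that the database itself records its version, so the same ordered script set can bring any environment (dev, test, production) to the same state.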
The delivery of changes from a database directly to version control can be implemented through the dbForge Source Control SSMS add-in. To move from a state-based approach to a migration-based one, you first need to create a baseline schema of the existing database and make further schema changes with patches, each consisting of a migration script from one database version to another. To create such a migration script, you compare the previous database version with the database where the changes were made. Database comparators are a great help here (for instance, [SQL Server Schema Synchronization](https://www.devart.com/dbforge/sql/studio/sql-server-schema-synchronization.html)). From then on, follow migration-based database development and do not allow changes to be made directly in the target environments. Oftentimes it is impossible to get rid of the state-based approach completely, but we should strive to, so that the product lifecycle is organized and the system behaves in a more stable and predictable way after changes are made. Hence, further on, we will describe migration-based database development.

Continuous database delivery through migration
To implement database delivery through migrations, you can use the [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) tool:

Fig. 1 DevOps Automation for SQL Server

Notably, to implement this approach, the DevOps process needs to be activated, as all departments need to be involved:
Development.
Testing, including load testing.
Update.
Deployment.

It is important to note that you do not have to deliver all migrations from one environment to another. That is, you only need to deliver the difference between two databases, which is very easy to determine with the help of the Devart [SQL diff tool](https://www.devart.com/dbforge/sql/schemacompare/) called dbForge Schema Compare for SQL Server. Fig.
2 SQL Server Schema Synchronization

Another way to determine the difference between database schemas is to use the Visual Studio IDE:

Fig. 3 Schema Comparisons using Visual Studio SQL Data Tools

Additionally, it is very convenient to store and manage database schema changes with dedicated version control tooling such as [Source Control for SQL Server](https://www.devart.com/dbforge/sql/database-devops/source-control.html):

Fig. 4 Source Control for SQL Server

Still, whatever tool you select to control version changes, it has to meet the requirements of the entire product lifecycle, namely:
Roll back selected changes.
Roll forward selected changes.
View conflicts and resolve them.
Enable multiple users to work asynchronously on the same code snippet.
Track changes (date, time, and source: who introduced changes and where).

All the above-mentioned functionality is available both in [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) and in [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/):

Fig. 5 dbForge Studio for SQL Server

The first is built into SSMS, and the second is delivered as a separate visual system used for database development, testing, and administration. As this approach to the delivery of database changes allows setting up a more predictable and transparent software solution lifecycle, it is also better suited for organizing remote work. Next, we will briefly describe the main aspects of remote work. Additionally, watch these videos to discover how dbForge products can boost database development.
[How dbForge SQL Complete is involved in the Database DevOps process](https://youtu.be/RNgxe_8InU0)
[How to import data to SQL Server database with dbForge Data Pump during the DevOps process](https://youtu.be/R7nq351mlHo)
[Creating database documentation during the Continuous Integration workflow](https://youtu.be/S4W0ybixQII)
[How to automate database schema changes for the CI process during database deployment](https://youtu.be/hllTzoXvoO8)
[dbForge Source Control in the DevOps pipeline](https://youtu.be/reU4ALv2ctg)
[Test data generation in the Continuous Integration and Deployment processes](https://youtu.be/G3GNo0i03bk)
[Unit Testing for SQL Server Database in DevOps process](https://youtu.be/3A5JEs3Nz0I)

Delivering changes to databases in remote working conditions
As more and more IT company employees opt for remote work, it has become essential to guarantee secure work with databases in remote working conditions. For that reason, let us consider the scenarios of securing the delivery of changes at the database level. The following methods are normally used to secure databases in remote working conditions:
Locating the database server in a secure network so that it has no direct internet access, either inbound or outbound (special terminals are configured to access the server).
Establishing dedicated, encrypted channels between the employee's hardware and the corporate network (often with enhanced security in the form of a digital signature and/or certificates).
In certain instances of the previous point, installing special software on the employee's computer and issuing a personal USB key to access the system or corporate network.
The tool for working with databases also has to provide a wide range of secure ways to access data.
Through [Security Manager](https://docs.devart.com/studio-for-sql-server/managing-users-and-permissions/manipulating-user-logins.html), dbForge Studio for SQL Server offers rich capabilities for managing logins:

Fig. 6 Security Manager

In addition, it is worth mentioning that dbForge SQL Tools now support the new secure Active Directory authentication (with universal MFA):

Fig. 7 SQL Tools support Active Directory authentication (with universal MFA)

To clarify, Active Directory authentication (with universal MFA) is an interactive method that supports Azure Multi-Factor Authentication, among other things. Azure MFA helps secure access to data and applications while keeping sign-in simple for users. It provides reliable authentication through a number of simple options, such as a phone call, a text message, a smart card with a PIN code, or a mobile app notification, enabling users to choose the method they prefer. You can read more about [Active Directory authentication supported by dbForge SQL Tools](https://blog.devart.com/developer-bundle-supports-new-secure-authentication.html).

For security reasons, it is common practice to establish access to an employer's infrastructure through special terminal servers for the development, test, and production environments. Thus, properly organized access to a company's IT resources does not usually differ much between remote access and access from the office, as any connection is established through special terminal servers.

Conclusion
To properly apply new methods and technologies, database administrators have to consider multiple factors, one of which is whether to use a state-based or a migration-based approach to delivering changes and updates. We have compared the two methods and provided handy tools to organize the continuous delivery of changes in a DevOps environment.
By means of these tools, one can speed up the process, eliminate risks, and secure the database, which is especially crucial for remote working conditions.
Continuous Integration with TeamCity and dbForge
By [dbForge Team](https://blog.devart.com/author/dbforge) March 5, 2021

Business benefits from faster releases: end users get updates immediately, their applications and systems work seamlessly, and both the software manufacturers and their customers are happy. The idea, then, is to make software development and releases faster and safer. There are lots of tools to automate processes, accelerate performance, and eliminate the risk of errors. Application release automation solutions exist, but they are not enough. Those tools work with the application code; the databases underlying those applications behave differently, and issues that originate from databases are usually much more complicated. Should they get into a release, problems will start to arise. Enterprises cannot afford to ignore these database challenges. This brings us to the concept of Database Continuous Integration. According to this concept, every time developers commit changes to the database, a new build is created and tested. The team gets the necessary feedback immediately and thus does not allow bugs to accumulate: the more often they build and test, the fewer issues appear in production. To ensure all that, we need tools to test, package, and deploy database changes.
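The build-and-test loop at the heart of Database CI can be illustrated with a minimal, tool-agnostic sketch in Python with SQLite. This is not how the dbForge/TeamCity pipeline is implemented; the `SCRIPTS` list and the `author` table are hypothetical stand-ins for a source-controlled scripts folder: each commit rebuilds a fresh database from the scripts and runs a unit test against it, so a broken constraint fails the build immediately.

```python
import sqlite3

# Hypothetical "source-controlled" schema scripts, as they might
# sit in a repository folder.
SCRIPTS = [
    "CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT NOT NULL)",
    "CREATE UNIQUE INDEX ix_author_name ON author (name)",
]

def build_database(scripts):
    """Build phase: create a fresh database from the scripts.

    Any invalid script raises here, failing the build before
    anything reaches production.
    """
    conn = sqlite3.connect(":memory:")
    for script in scripts:
        conn.execute(script)
    return conn

def unit_test(conn):
    """Unit Test phase: verify the behavior the changes must not break."""
    conn.execute("INSERT INTO author (name) VALUES ('Ann')")
    try:
        conn.execute("INSERT INTO author (name) VALUES ('Ann')")
        return False  # duplicate slipped through: the unique constraint is broken
    except sqlite3.IntegrityError:
        return True

conn = build_database(SCRIPTS)
print(unit_test(conn))  # True: the build passes its test
```

Because each build starts from an empty database, the test result depends only on the committed scripts, which is exactly the feedback loop a CI server automates on every commit.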
Automatic Database Releases with TeamCity and dbForge
Before a database version reaches production, the process passes through several stages:
The Development step presupposes making changes to the database schema.
The Version Control step involves committing changes to the version control system (Git, SVN, etc.). You need to configure your workflow to run for one or more events that will trigger build creation.
The Build phase entails creating a database from scripts on the SQL Server instance. When this phase is over, you get a new database.
The Unit Test stage involves testing the created database with SQL unit tests. Running unit tests is crucial because they ensure that the changes won't break the required functionality.
The Publish Database phase completes the CI process. The changes are packed into an artifact (a NuGet package), which is then put into a dedicated folder or published to a NuGet repository.

In our case, these phases take place on the TeamCity server. TeamCity is a Continuous Integration server that is popular among developers mainly because it allows great flexibility for all sorts of workflows and development practices. The CI server drives the development processes: it analyzes the source code and its changes, manages builds, and performs other tasks to ensure that the CI phases are performed correctly. As a result, the process becomes faster and smoother, and all team members interact more effectively. TeamCity Server is a powerful solution with robust functionality out of the box. It runs in a Java environment and is compatible with Apache Tomcat and with Windows and Linux servers. When you use TeamCity to organize continuous integration, it covers the following stages:
Tracking changes in the version control system (VCS) linked to the build.
Detecting new changes, triggering the build, and adding it to the queue.
Finding a free build agent to assign the build to.
Running the build process according to the configured steps.
Producing the artifacts.
Uploading the artifacts to the server.
Generating a report for the operator.

As TeamCity is one of the most popular automation systems, Devart has developed an additional tool to make database release automation handier. The [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) solution includes TeamCity support with a dedicated plugin that lets users create the database on the server, test it, deploy it, and document the database along with all processes and changes. Additionally, watch these videos to discover how dbForge products can boost database development.

[How dbForge SQL Complete is involved in the Database DevOps process](https://youtu.be/RNgxe_8InU0)
[How to import data to SQL Server database with dbForge Data Pump during the DevOps process](https://youtu.be/R7nq351mlHo)
[Creating database documentation during the Continuous Integration workflow](https://youtu.be/S4W0ybixQII)
[How to automate database schema changes for the CI process during database deployment](https://youtu.be/hllTzoXvoO8)
[dbForge Source Control in the DevOps pipeline](https://youtu.be/reU4ALv2ctg)
[Test data generation in the Continuous Integration and Deployment processes](https://youtu.be/G3GNo0i03bk)
[Unit Testing for SQL Server Database in DevOps process](https://youtu.be/3A5JEs3Nz0I)

Configure the process with the dbForge DevOps Automation for SQL Server plugin
Before you start with TeamCity CI, you need to install the dbForge tools on the machine where you will run the pipeline. If you are already using [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), this multi-featured solution has all the necessary functionality. Or, you may choose the [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle.
It is also possible to install specific dbForge tools separately. For the DevOps tasks, you mostly need the following:

- [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/)
- [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/)
- [dbForge Unit Test for SQL Server](https://www.devart.com/dbforge/sql/unit-test/)

You will also need the PowerShell module that comes with [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/).

Install the plugin

1. In TeamCity, click Administration.
2. On the left pane, under Server Administration, select Plugins List.
3. Click Browse plugins repository to install the plugin directly from the plugin repository.
4. In the [JetBrains Plugins Repository](https://plugins.jetbrains.com/), find the necessary plugin: type dbForge DevOps Automation for SQL Server in the search field, and the repository will take you to the corresponding page. Next, click Get to open the menu and select Install to. This will install the required dbForge plugin on the TeamCity server.

Note: You can also install the plugin manually from a zip file, which can be downloaded from the [JetBrains Plugins Repository](https://plugins.jetbrains.com/) or from [the Devart website](https://www.devart.com/dbforge/sql/database-devops/).

Create a New TeamCity Project

1. Navigate to the TeamCity server start page and click Create a project.
2. Configure the necessary VCS settings and click Proceed. After checking the VCS settings, TeamCity will open a new window where you can configure your project.
3. Specify the Project Name and the Build configuration name. Click Proceed.
4. A new page with the settings for the created project will appear. Pay attention to the available options:

- On the General Settings tab, you can configure the basic project settings (name, ID, description), as well as the settings for artifacts.
- The Version Control Settings tab allows you to configure the VCS settings that will be linked to the project.
- On the Build Steps tab, you can set the build phases.
- The Triggers tab contains settings for configuring triggers that initiate the project build process.
- The Suggestions section provides additional recommendations on configuring TeamCity projects.

The basic logic of the CI process is configured in the Build Steps section by adding various build steps.

Configure the Build Process

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server.

Build is an essential phase of the Continuous Integration process. It syncs the script folders from the repository with the target database. If the sync process completes successfully, the system creates the NuGet package. That package has an ID that serves as an identifier for all the further steps. In this tutorial, we are going to explore how to arrange the Build stage of Continuous Delivery with the help of dbForge DevOps Automation for SQL Server.

1. On the Build Steps tab, click Add.
2. A window for configuring a new step will appear. There, specify the Build runner type by choosing dbForge DevOps Automation for SQL Server – Build from the drop-down menu.
3. In the New Build Step window that appears, note the following sections:

- Source-controlled database is the location of the folder where the database deployment scripts are stored.
- Package ID specifies the name of the NuGet package that will be generated as a result of this step.
- Temporary database server is the server where the scripts from the source folder are deployed for validation.

Note: It is better to choose the SQL LocalDB option. With this option, a LocalDB instance is created along with a temporary database on it. After the validation task completes successfully, the LocalDB instance deletes that database and closes.
Alternatively, you can choose a specific server to deploy the scripts to: select that option and configure the connection settings.

Test the database

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server, and dbForge Unit Test for SQL Server.

At this stage, we synchronize the object created at the Build stage with the target connection and optionally add test data. Then we run the tSQLt unit tests on a given server and thereby validate the SQL scripts. Navigate to Build Steps > Add build step > Choose build runner type > dbForge DevOps Automation for SQL Server – Test. The Test step configuration window will appear. Pay attention to the functional sections of its interface:

- Database package to test is the name of the NuGet package the system generated in the Build step. SQL scripts from this NuGet package will be deployed on the temporary server for the tSQLt unit tests.
- Temporary database server is the server where the tSQLt unit tests will be performed.
- The Run tests field defines the run mode (every test or selected tests).
- The Generate test data section enables test data generation before running the tSQLt unit tests. Here, you must create a dbForge Data Generator for SQL Server project file (*.dgen). This file contains the settings and rules for test data generation. It should be stored in the VCS, so the system will build the path relative to the checkout directory.

Synchronize the Database Project Configuration

Necessary tools: dbForge Studio for SQL Server or dbForge Schema Compare Pro for SQL Server.

At this stage, we sync the NuGet package we generated and tested with the specified server. Navigate to Build Steps > Add build step > Choose build runner type > dbForge DevOps Automation for SQL Server – Sync. In addition to the functional sections we're already familiar with, this interface has two more:

- Database package to sync specifies the name of the generated and tested NuGet package.
SQL scripts from it will be used for deployment on the specified server.
- Target database specifies the database to which the changes will be deployed.

There are other advanced options for synchronization. Among them, the ability to filter objects during synchronization deserves special mention. For that, we apply a filter file (*.scflt) containing the filtering rules. To generate this file, use dbForge Schema Compare for SQL Server. Once it is ready, place the filter file in the version control system so it is available to the CI processes.

Run the project

The project can be started either automatically or manually. Once you've configured the project, TeamCity will put it into the queue automatically as soon as it detects changes in the linked VCS repository. Alternatively, you can run the project yourself from Build Steps > Run (in the top-right corner). After that, the project gets into the build queue. When the system finds and assigns a free matching build agent, the build process starts at once.

Note: The Build Log tab provides real-time monitoring of the build process. There you can find the following information: updating source code from the VCS, forming a temporary directory, details about the execution of each build step, and publishing the artifacts.

When the build process is over, you can check and, if needed, download the artifact on the Artifacts tab.

Conclusion

With the [dbForge database tools](https://www.devart.com/dbforge/), you can quickly and easily set up Continuous Integration processes on TeamCity. Our tools let you tailor Continuous Delivery by flexibly arranging exactly those pipeline steps that are actually required. Embrace the best DevOps practices with dbForge tools.
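To recap, the three build runners configured above form a fail-fast sequence: Build produces a NuGet package, Test validates it with tSQLt, and Sync deploys it. The sketch below models that sequence in Python purely for illustration; it is not the dbForge plugin API, and the step names and lambdas are hypothetical stand-ins for the real runners.

```python
from typing import Callable, List, Tuple

def run_pipeline(steps: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run CI steps in order and stop at the first failure,
    mirroring how TeamCity fails a build when a step fails."""
    results = []
    for name, step in steps:
        if not step():
            results.append(f"{name}: FAILED")
            break
        results.append(f"{name}: OK")
    return results

# Hypothetical stand-ins for the real dbForge build runners:
steps = [
    ("Build", lambda: True),  # create a NuGet package from the script folder
    ("Test", lambda: True),   # run tSQLt unit tests on a temporary server
    ("Sync", lambda: True),   # deploy the tested package to the target database
]
print(run_pipeline(steps))    # -> ['Build: OK', 'Test: OK', 'Sync: OK']
```

The fail-fast behavior is the key property: a failed Test step prevents the Sync step from ever touching the target database.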
Tags [DevOps Automation](https://blog.devart.com/tag/devops-automation) [teamcity ci](https://blog.devart.com/tag/teamcity-ci) [teamcity integration](https://blog.devart.com/tag/teamcity-integration)

How to Convert MySQL Data to PostgreSQL

By [dbForge Team](https://blog.devart.com/author/dbforge) June 25, 2020

Transferring data between two different servers is no easy task: it requires a great deal of effort and accuracy, especially when migrating from one database management system to another. In this article, we provide a visual walkthrough of MySQL data migration to PostgreSQL. In this tutorial, we will move data from the address table in the sakila database on a MySQL server to the postgres database on a PostgreSQL server. To migrate MySQL to PostgreSQL, we will need [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) and [ODBC Driver for PostgreSQL](https://www.devart.com/odbc/postgresql/). It should be pointed out that migration tasks to PostgreSQL are commonly performed with pgloader, an open-source database migration tool designed to move data [from other RDBMSs](https://www.devart.com/what-is-rdbms/) and files to PostgreSQL. However, to succeed with pgloader you will have to struggle through fiddly configurations, which is not easy, especially for beginners. It is also worth mentioning that pgloader is a command-line tool and thus has no visual interface.

Data Export functionality of dbForge Studio for MySQL

dbForge Studio for MySQL and dbForge Fusion for MySQL come with an advanced Data Export Wizard. The tool is highly customizable and exports data to the most popular formats quickly and easily. Its undoubted advantage is a comprehensive GUI that allows non-professional users to work with it effectively. Another benefit of the [Data Export tool](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) is the ability to export the result of a query, which makes the data migration process more flexible.
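For comparison, here is roughly what a pgloader command file for a migration like ours might look like. This is a sketch based on pgloader's documented command syntax, not a tested configuration; the connection URIs, CAST rules, and schema names are illustrative, and the exact clauses vary by pgloader version.

```
LOAD DATABASE
     FROM mysql://root:secret@localhost/sakila
     INTO postgresql://postgres:secret@localhost/postgres
     CAST type datetime to timestamptz
          drop default drop not null using zero-dates-to-null
     ALTER SCHEMA 'sakila' RENAME TO 'public';
```

Even this minimal example shows the kind of type-casting and schema-renaming rules you have to write by hand, which is exactly the configuration burden a GUI wizard removes.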
Note: If you need to transfer data between databases on one MySQL server or between different MySQL servers, the reliable [Copy Database](https://www.devart.com/dbforge/mysql/studio/copy-database.html) functionality built into dbForge Studio for MySQL would be of great help, as it provides the fastest and easiest way to copy database structure and data from source to target within one DBMS.

Data migration procedure

Below are the steps you need to follow to [migrate MySQL to PostgreSQL](https://www.devart.com/dbforge/mysql/studio/migrate-database.html). Each step is illustrated and explained in detail. Please note that dbForge Studio for MySQL allows transferring between servers not only table data but also query results. To move a query result from MySQL to PostgreSQL, just execute the query, right-click its result, select the Export Data command from the context menu, and follow the procedure below.

Step 1. Select ODBC export format

In the Database Explorer, right-click the table you want to migrate and select the Export Data command from the context menu. Next, in the Data Export Wizard that opens, select the ODBC format.

Step 2. Select a source table for export

With the Data Export Wizard, you can select a connection, a database, and a table and/or view to be moved.

Step 3. Set options for the ODBC data provider

In this step, the Data Export Wizard will offer you to set the ODBC driver options. We recommend testing the specified connection before running an export job.

How to configure the ODBC driver

To convert MySQL data to PostgreSQL, you will need the ODBC Driver for PostgreSQL from Devart. You need to configure its options before proceeding with the data export task, which can be done right from the Data Export Wizard. First, click the Build button next to the Use a connection string field. Then, in the Select Data Source window that opens, go to the Machine Data Source tab and press the New button.
Then, in the Create New Data Source window, click Next to continue configuring the driver. Next, select the driver for which you want to set up a data source: select Devart ODBC Driver for PostgreSQL and click Next. Finally, click Finish. After that, the driver configuration window opens, where you need to specify:

- Data source name
- Description (optional)
- Server name and port
- User ID and password
- Database and schema

You can click Test Connection to verify the connection to the PostgreSQL server. Click OK to save your settings. After that, you will see the Select Data Source window, where the path for data migration to PostgreSQL has appeared. Click OK to finish.

Step 4. Select a destination table for your export task

On the Table tab of the wizard, choose a target table for the export. You can select it from the list of existing tables in the database you specified when configuring the driver, or create a new table and provide its name.

Step 5. Choose columns to be exported

On the Data formats tab, check the columns you want to migrate to PostgreSQL.

Step 6. Choose rows to be exported

You don't need to migrate the whole table. On the Exported rows tab, you can select to:

- export all rows
- export the selected rows only
- specify and export a range of rows

The selective export option significantly simplifies data migration and notably saves time.

Step 7. Configure error processing behavior

On the Errors handling tab of the Data Export Wizard, you can configure the application behavior when an error occurs. The following options are available: prompt a user for an action, ignore all errors, and abort at the first error. Also, in case you need to create a log file, you can set its path on this tab.

Step 8. Finish and save the template

dbForge Studio for MySQL allows saving templates for recurring export scenarios.
This eliminates the need to set up data export again and again: just use previously saved templates to apply configurations to any number of migration jobs.

Step 9. Check and enjoy the result

As a result of our MySQL to PostgreSQL migration efforts, the address table and the result of the query have appeared on the PostgreSQL server. To check the result, we will use [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/).

Conclusion

dbForge Studio for MySQL is an advanced IDE with powerful data transfer functionality. The tool's well-designed and intuitive interface makes it easy to move MySQL data between different databases, servers, and even DBMSs. It will definitely be of great assistance to teams in their data migration routines. For teams and individuals who work with databases across different DBMSs, another solution may be preferable: [dbForge Edge](https://www.devart.com/dbforge/edge/), which combines the features and powers of all the separate dbForge Studios. With its support for SQL Server, MySQL and MariaDB, Oracle, and PostgreSQL, Edge lets you perform all database tasks without switching between various dedicated tools. The Free Trial of dbForge Edge is available for 30 days.
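As a side note for scripted scenarios, the options collected in the driver configuration dialog above can also be expressed as a DSN-less ODBC connection string. The sketch below is illustrative only: the keyword names (Server, Port, Database, Schema, User ID) are assumptions based on the dialog fields, so verify them against the Devart ODBC Driver for PostgreSQL documentation before relying on them.

```python
def odbc_connection_string(options: dict) -> str:
    """Join option pairs into the key=value;key=value ODBC format."""
    return ";".join(f"{key}={value}" for key, value in options.items())

conn_str = odbc_connection_string({
    "Driver": "{Devart ODBC Driver for PostgreSQL}",
    "Server": "localhost",   # PostgreSQL server name
    "Port": "5432",          # default PostgreSQL port
    "Database": "postgres",
    "Schema": "public",
    "User ID": "postgres",   # placeholder credentials
    "Password": "secret",
})
print(conn_str)
```

A string like this is what an ODBC-aware tool or library ultimately receives, whether you build it in the wizard's Use a connection string field or in code.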
Tags [data export](https://blog.devart.com/tag/data-export) [mysql migration](https://blog.devart.com/tag/mysql-migration) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [PostgreSQL Tutorial](https://blog.devart.com/tag/postgresql-tutorial)

How to Convert MySQL Databases to SQL Server: Step-by-Step Process

By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) January 10, 2025

Database conversion from MySQL to SQL Server means migrating the data and schema of a MySQL database to a SQL Server database. This process typically includes transferring tables, views, stored procedures, and other database objects while ensuring data integrity and compatibility with the target SQL Server environment. This article explains the fundamental differences between MySQL and SQL Server and outlines potential issues that might occur during migration. It also provides a step-by-step guide to migrating data from a MySQL table to SQL Server using the ODBC Driver for MySQL and [Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/).

Contents

- MySQL vs. SQL Server: understanding the key distinctions
- Key issues in the MySQL-to-MS SQL migration process
- Prerequisites
- Introduction to ODBC drivers
- Install and configure ODBC driver for MySQL
- Overview of Data Pump for SQL Server
- Convert your data from MySQL to SQL Server effortlessly

MySQL vs. SQL Server: understanding the key distinctions

MySQL and SQL Server are popular relational database management systems, and understanding their differences helps anticipate and address compatibility issues during migration. MySQL is an open-source, simple, and scalable database for managing and organizing data. Though it is free, users can get premium support services through a commercial license. MySQL runs on most operating systems, including Linux, Windows, and macOS. SQL Server, on the other hand, is a commercial product available in multiple editions with advanced features for efficient storage, management, and data retrieval.
Designed primarily for Windows, the product is well known for its scalability, security, and integration with other Microsoft products.

Key issues in the MySQL-to-MS SQL migration process

Migrating data from MySQL to MS SQL can be a complex process, so keep in mind the challenges that might appear when you convert MySQL to SQL Server, including compatibility issues, data integrity concerns, and performance implications.

- Compatibility: Account for syntax variations, data type differences, and differently supported features between MySQL and SQL Server. Many of these compatibility issues can be easily resolved with a [MySQL migration tool](https://www.devart.com/dbforge/mysql/studio/migrate-database.html).
- Data Integrity: Review and adjust foreign key constraints, data types, and the sequence of data migration. Ensure dependencies are properly maintained.
- Performance Optimization: Evaluate and optimize indexes and queries for the target database. Examine the current MySQL database to identify what should be improved, and then implement suitable strategies within the SQL Server environment.

Prerequisites

For data conversion, we'll download and install the following tools, which help migrate a MySQL database to SQL Server efficiently:

- [Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/download.html), a robust tool for easy and quick data export and import between different servers and instances.
- [Devart ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/download.html), a reliable and easy-to-use tool for accessing MySQL databases from ODBC-compatible tools on 32-bit and 64-bit Windows, macOS, and Linux. In addition, the ODBC driver for MySQL lets you access live MySQL data directly from [SQL Server Management Studio](https://www.devart.com/odbc/mysql/integrations/mysql-ssms-connection.html).

Introduction to ODBC drivers

Before we start, let's find out what ODBC drivers are and what their role in data conversion is.
ODBC (Open Database Connectivity) drivers serve as a bridge between a source and a target database, establishing a connection to transfer data between them. The Devart ODBC drivers support a wide range of databases, including Oracle, MySQL, PostgreSQL, SQL Server, and SQLite. ODBC drivers help migration tools access data, send SQL queries to the source database, retrieve the results, and then insert or update the data with the mapped data types in the target database, allowing users to migrate MySQL to SQL Server efficiently.

Install and configure ODBC driver for MySQL

For data conversion from a MySQL database to a SQL Server database, first [download](https://www.devart.com/odbc/mysql/download.html) the ODBC Driver for MySQL on the computer where you want to perform the migration. Once done, go to the Downloads folder and double-click the DevartODBCMySQL.exe setup file to install the driver. In the Setup wizard, go through all the installation steps and click Finish to close the wizard.

Next, you must set up a data source name (DSN) for the ODBC driver. To do this, open the ODBC Data Source Administrator utility by typing ODBC Data Sources in the search bar. Select the 32-bit or 64-bit version of the utility, depending on your machine. In the ODBC Data Source Administrator window, you can create a User DSN, which is accessible only to the user who created it, or a System DSN, which is accessible to anyone logged in to the machine. To proceed, on the User DSN or System DSN tab, click Add to create a new data source. In the dialog that opens, select Devart ODBC Driver for MySQL and click Finish. This opens the Devart ODBC Driver for MySQL Configuration dialog, where you need to specify the required details, including the data source name, MySQL server, username, password, and target database. The default port number is 3306; you can change it in the corresponding field.
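For automated setups, the same details entered in the configuration dialog can be assembled into a DSN-less connection string instead of a saved DSN. The sketch below is illustrative: the keyword names are assumptions drawn from the dialog fields, so check the driver documentation for the exact spelling before using them.

```python
def mysql_odbc_conn_str(server: str, database: str, user: str,
                        password: str, port: int = 3306) -> str:
    """Build a DSN-less connection string for the Devart MySQL ODBC driver."""
    options = {
        "Driver": "{Devart ODBC Driver for MySQL}",
        "Server": server,
        "Port": str(port),     # 3306 is the MySQL default noted above
        "Database": database,
        "User ID": user,
        "Password": password,
    }
    return ";".join(f"{k}={v}" for k, v in options.items())

print(mysql_odbc_conn_str("localhost", "sakila", "root", "secret"))
```

The advantage of a saved DSN, as described next, is that it appears by name in tools like Data Pump; a DSN-less string is handier when the connection is configured entirely in scripts.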
Note that the data source name you specify will subsequently appear in the list of data source names in dbForge Data Pump for SQL Server. Click Test Connection to verify that the connection has been properly configured. If the configuration is correct, a notification window will appear; click OK to close it. To save the DSN settings, click OK. The driver you've configured will appear on the corresponding DSN tab. Click OK to save the changes and close the dialog.

Overview of Data Pump for SQL Server

For data migration from MySQL to SQL Server, we'll use dbForge Data Pump for SQL Server, which is part of the [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) pack. Data Pump is a SQL Server Management Studio (SSMS) add-in designed to facilitate data migration from third-party databases to SQL Server. The tool also supports data import and export in popular formats, including Excel, CSV, XML, JSON, Text, and MS Access. Data Pump allows saving templates for repetitive export scenarios, as well as automating and scheduling export tasks from the command line.

Convert your data from MySQL to SQL Server effortlessly

Everything has been prepared, so we can start the data conversion from MySQL to SQL Server. First, let's look at the data we want to transfer from a MySQL table to a SQL Server table. To migrate the data, you will need to export it from MySQL, which can be done with solutions like [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/migrate-database.html). This tool simplifies data extraction, allowing you to generate a structured export file that can be seamlessly imported into SQL Server. Now, open SSMS, and in Object Explorer, right-click the required database and select Data Pump > Import Data to open the Data Import wizard, which acts as a MySQL to SQL Server converter. On the Source file page of the wizard, select the ODBC source file format and click Next.
On the ODBC options page, specify the configured ODBC data provider options for the imported data:

- Under Data source specification, select ODBC Driver for MySQL from the Use system or user data source name dropdown list.
- Under Login information, enter the username and password for the MySQL server to which the ODBC driver is connected. Click Test Connection to verify that the connection has been successfully established. Then, click OK to close the pop-up window.

On the Destination page, select a target table for the data import:

- Under Source, you can view all the tables located in the sakila database of the connected MySQL server.
- Under Target, you can view the preselected SQL Server connection, database, schema, and table to which the data can be imported.

In our example, we select sakila.customer as the source table and import its data into a new table with the same name on the target server. You could also import data into an existing table. Switch to the Mapping page, where you can see the columns with the assigned data types in the upper grid and a preview in the lower grid. You can also change the data type of the selected column by clicking Edit on the toolbar. To proceed, click Next. The Modes page displays the import modes. Only the Append mode is available if you migrate data to a new table. However, if you transfer data into an existing table, all modes become available, allowing you to choose the most suitable option for your specific requirements. On the Output page, the Import data directly to the database option is selected by default, and we leave it as is. You can also save the import script to a file for further use or open it in the internal editor to make changes. Finally, specify the error processing behavior and logging options and click Import. When the data import process has finished, you will see the result of the import. Click Finish to close the wizard.
Let's now refresh Object Explorer to check that the data from the MySQL table has been imported to the SQL Server table. In Object Explorer, expand the sakila database and retrieve data from the imported dbo.sakila.customer table. Great! The MySQL table data has been successfully transferred to the SQL Server table.

dbForge Edge brings together the power of four database management Studios (dbForge Studio for SQL Server, MySQL, Oracle, and PostgreSQL) in a single, integrated environment. With built-in support for ODBC drivers, users can easily establish connections and convert a MySQL database to SQL Server, streamlining cross-platform migration along with all associated data.

Key benefits of using dbForge Edge for data migration

[dbForge Edge](https://www.devart.com/dbforge/edge/) supports MySQL, SQL Server, PostgreSQL, and Oracle, making cross-platform migrations seamless:

- Multi-platform support through four powerful dbForge Studios in one environment
- ODBC connectivity for seamless database conversion and integration
- Data Export for exporting data between databases
- Data Import for importing data between databases
- Streamlined data integration between systems with minimal manual effort

This unified toolset helps ensure accurate, efficient, and scalable migrations across diverse database systems.

Conclusion

In this article, we have covered the fundamentals of the data conversion process between MySQL and SQL Server. We have also delved into the main differences between these databases, explored potential issues that might arise during data migration, and provided a detailed, step-by-step guide to transferring data from a MySQL table to a SQL Server table. The [Devart ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/download.html) and [Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/download.html) ensure a quick, easy, and efficient data migration experience.
Gain access to a [30-day free trial of the SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) pack to experience the full range of powerful tools, including dbForge Data Pump for SQL Server. You may also find the following articles useful for exploring how to transfer data between different servers:

[How to convert a Microsoft Access database to a SQL Server database](https://blog.devart.com/convert-microsoft-access-to-sql-server.html)
[How to migrate data from Oracle to MySQL: step-by-step guide](https://blog.devart.com/migrate-from-oracle-to-mysql.html)
[How to convert MySQL data to PostgreSQL](https://blog.devart.com/convert-from-mysql-to-postgresql.html)
[How to easily convert your MS Access data to MySQL](https://blog.devart.com/how-to-convert-a-database-from-microsoft-access-to-mysql.html)
[How to migrate from MySQL to Oracle: a comprehensive guide](https://blog.devart.com/data-migration-from-mysql-to-other-dbms.html)
[Migrating data from Oracle to PostgreSQL](https://blog.devart.com/oracle-to-postgresql-migration.html)
[Automating bulk data import from MS Access to SQL Server](https://blog.devart.com/bulk-import-from-ms-access-to-sql-server.html)

FAQ

Why migrate from MySQL to SQL Server?

SQL Server offers advanced security, better performance optimization, and seamless integration with Microsoft technologies like Azure and Power BI. It also provides superior scalability and comprehensive transaction management for enterprise-level applications.

How does MySQL differ from SQL Server in terms of database structure and management?

MySQL employs a file-based storage model, whereas SQL Server uses a unified storage engine complete with transaction logging. SQL Server also supports T-SQL for procedural programming, while MySQL relies on structured queries with more limited procedural features.
MySQL is available as a free Community Edition under the GPL license or as a subscription-based Commercial Edition (Oracle MySQL), which includes support, advanced features, and additional tools. SQL Server is a proprietary licensed RDBMS, with pricing based on server cores or per-user/per-device, while its free Express Edition is suitable for small-scale applications and development.

Tools for MySQL include the command-line client for administration, MySQL Workbench for database design and development, and phpMyAdmin for web-based management, along with third-party tools such as DBeaver, Navicat, and dbForge. Tools for SQL Server include SQL Server Management Studio (SSMS) for comprehensive administration and development, Azure Data Studio for cross-platform use with growing capabilities, and command-line tools such as sqlcmd and PowerShell cmdlets for scripting and automation; third-party tools such as DBeaver, Navicat, and dbForge extend this functionality further.

How can one assess the compatibility of MySQL data types with SQL Server?

A compatibility check involves mapping MySQL data types to their closest SQL Server equivalents while accounting for differences in precision, length limits, and behavior; the Devart ODBC Driver for MySQL simplifies this process by automatically mapping MySQL data types to SQL Server-compatible formats, ensuring smooth data migration. For example, TINYINT(1) in MySQL is often used as a boolean and maps to BIT in SQL Server. VARCHAR types are generally compatible but may differ in maximum length handling. MySQL’s DECIMAL supports higher precision than SQL Server’s limit of 38 digits, so adjustments may be needed. For date and time values, DATETIME and TIMESTAMP in MySQL are better mapped to DATETIME2 or DATETIMEOFFSET in SQL Server.
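The mappings above can be sketched as a pair of table definitions. This is a minimal illustration, not part of the original article; the `orders` table and its columns are hypothetical names chosen only to show each conversion:

```sql
-- MySQL source table (illustrative)
CREATE TABLE orders (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    is_paid    TINYINT(1),       -- commonly used as a boolean
    note       VARCHAR(255),
    amount     DECIMAL(40,10),   -- precision above SQL Server's 38-digit limit
    created_at DATETIME,
    payload    LONGTEXT
);

-- SQL Server equivalent after mapping (illustrative)
CREATE TABLE orders (
    id         INT IDENTITY(1,1) PRIMARY KEY,
    is_paid    BIT,              -- TINYINT(1) -> BIT
    note       VARCHAR(255),     -- generally compatible as-is
    amount     DECIMAL(38,10),   -- precision reduced to the 38-digit maximum
    created_at DATETIME2,        -- DATETIME -> DATETIME2
    payload    NVARCHAR(MAX)     -- LONGTEXT -> NVARCHAR(MAX)
);
```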
Larger data types like TEXT, BLOB, or LONGTEXT should be converted to VARCHAR(MAX), VARBINARY(MAX), or NVARCHAR(MAX) to maintain proper storage and compatibility.

What tools are available for migrating MySQL databases to SQL Server?

Key tools include Data Pump for SQL Server by Devart, which enables bulk data import/export, and the Devart ODBC Driver for MySQL, which allows configuring data formats and mapping data types to ensure compatibility between MySQL and SQL Server. Data Pump helps transfer data and table structures while automatically converting MySQL data types to their SQL Server equivalents; it works with the Devart ODBC Driver for MySQL and still requires manual conversion of objects such as triggers, procedures, and events, as well as re-adding foreign keys.

What are the common schema conversion issues when migrating from MySQL to SQL Server?

Typical challenges include mismatched data types that require careful mapping to compatible SQL Server equivalents, differing index structures, and foreign key constraints that need direct alterations. Database objects such as views, stored procedures, and triggers often have to be rewritten to match SQL Server syntax, identity columns and event-driven actions may behave differently, and MySQL events must be converted to SQL Server Agent jobs, which serve a similar role in task automation.

What are the key performance differences between MySQL and SQL Server?

SQL Server achieves high performance with In-Memory OLTP, concurrent query processing, and advanced indexing methods that speed up data retrieval.
MySQL, while lightweight and fast for read-heavy workloads, lacks SQL Server’s sophisticated query optimization and resource governance. SQL Server generally delivers faster response times than MySQL in most query operations, with INSERT queries as a notable exception. As data volume increases, SQL Server also scales more efficiently, showing a smaller performance drop than MySQL, which often sees response times double in similar scenarios.

Tags: [data migration](https://blog.devart.com/tag/data-migration) [dbForge Data Pump for SQL Server](https://blog.devart.com/tag/dbforge-data-pump-for-sql-server) [odbc drivers](https://blog.devart.com/tag/odbc-drivers) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial)

By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans). Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys making complex tech accessible in useful ways.
How to Convert a Microsoft Access Database to a SQL Server Database

By [Nataly Smith](https://blog.devart.com/author/nataly-smith), June 13, 2023

In this article, we will provide you with a screenshot-infused, step-by-step tutorial on how to convert a Microsoft Access database to SQL Server using [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). This GUI tool can enhance nearly every aspect of working with databases, including database design, SQL coding, database comparison, schema and data synchronization, generation of useful test data, and many other functionalities.

Contents: Importing Data, Setting Up Constraints, Conclusion

Microsoft Access is a widely used relational database management system that allows users to store and manipulate data simply and intuitively. However, as the amount of data grows and more complex queries are needed to handle it, MS Access databases can become slow and inefficient. In such cases, it might become crucial to migrate such databases to a more robust and scalable database system like SQL Server, which can improve performance and make data management more efficient.
Importing Data

In order to import data using dbForge Studio for SQL Server:

1. On the Database menu, click Import Data. Alternatively, you can right-click the target database, point to Tools, and select Import Data.

2. In the Data Import wizard that opens, select MS Access as the import format and specify the location of the Source data. Click Next. If the Source data is protected with a password, the Open MS Access Database dialog box appears where you should enter the password.

NOTE: To perform the transfer, you should install Microsoft Access Database Engine beforehand. It installs components that facilitate the transfer of data between Microsoft Access files and non-Microsoft Office applications. Otherwise, the Import wizard will show an error. Should you face this issue, download the missing components [here](https://www.microsoft.com/en-us/download/confirmation.aspx?id=54920). Make sure that the bit versions of both your Windows operating system and Microsoft Access Database Engine match. If you have a 64-bit system, you’ll need to use the 64-bit installer accordingly. However, there may be instances where 32-bit Microsoft Access is installed on a 64-bit Windows OS. If this is the case, follow these steps before proceeding with the installation:

- Click Start and type cmd.
- Right-click the Command Prompt, and then select Run as Administrator.
- Type the file path where the installer is located on your computer, followed by /passive.

In the case above, the Windows OS is 64-bit, but the installed version of Microsoft Access is 32-bit. That is why the 64-bit installer is required.

3. Select a source table. To quickly find a table in the list, start entering its name into the Search field. The list will be filtered to show only matching tables.

4. Specify a Target SQL Server connection and a database to convert the data to. If you need to import the MS Access data into a new table, select New table and specify its name.
Click Next.

5. Adjust the data formats if necessary.

6. Map the Source columns to the Target ones. Since we are creating a new table in SQL Server, dbForge Studio for SQL Server will automatically create and map all the columns, as well as the data types for each column. If the automatic match of the columns’ data types is incorrect, you can edit them manually: click the Source column fields and select the required columns from the drop-down list.

NOTE: To cancel the mapping of all the columns, click Clear Mappings on the toolbar. To restore it, click Fill Mapping.

7. To edit the Column Properties, double-click the required column, or right-click it and select Edit Column.

8. Choose an import mode:

- Append – add records to the destination table.
- Update – update a record in the destination table with a matching record from the Source.
- Append/Update – if a record exists in the destination table, update it; otherwise, add it.
- Delete – delete records in the destination table that match the records in the Source.
- Repopulate – delete all records in the destination table and repopulate it from the Source.

Here, you can also select to use a single transaction and/or bulk insert for this migration.

9. On the Output tab of the wizard, select output options to manage the data import script:

- Open the data import script in the internal editor.
- Save the data import script to a file.
- Import data directly to the database.

10. Configure the error processing behavior and logging options.

11. Click Import and watch the import progress. dbForge Studio for SQL Server will notify you whether the conversion completed successfully or failed. If you have chosen to write a report to a log file earlier, click Show log file to open it.

12. Click Finish.

NOTE: You can save the import settings as a template for future use. Click the Save Template button on any wizard page to save the selected settings.
Next time, you will only have to select a template and specify the location of the Source data – all the settings will already be there. Note that the Data Import wizard typically imports one table at a time; to import data from multiple tables simultaneously, follow the approach outlined in [Automating Bulk Data Import from MS Access to SQL Server](https://blog.devart.com/bulk-import-from-ms-access-to-sql-server.html).

Setting Up Constraints

After importing all the necessary tables, you can set up new (or update the existing) relations between the converted tables by creating or editing foreign keys, if required. You may also create primary keys if you skipped this step during table creation.

Creating Foreign Key

1. Right-click the table you need and select Edit Table.
2. Switch to the Constraints tab.
3. Click Add Foreign key.

NOTE: To create a foreign key, the referenced table should have a unique index. Otherwise, dbForge Studio will prompt you to create it. Click Yes in the dialog, and the unique index will be added.

Creating Primary Key

1. Right-click the table you need and select Edit Table.
2. Switch to the Constraints tab.
3. Right-click the empty area and select Add Primary Key.
4. Add the required columns to the key and click OK.

Conclusion

As promised at the beginning of this article, we have provided a detailed step-by-step tutorial on converting a Microsoft Access database to SQL Server using dbForge Studio for SQL Server. With its user-friendly GUI and wide range of functionalities, including database design, SQL coding, schema and data synchronization, and test data generation, the IDE can significantly enhance your database management experience. Additionally, an interested user can take advantage of the [30-day free trial](https://www.devart.com/dbforge/sql/studio/download.html) to test the tool’s capabilities firsthand.
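For reference, the foreign and primary keys created through the Edit Table dialog in the Setting Up Constraints section can also be added in plain T-SQL. This is a hedged sketch; the table and column names (`Customers`, `Orders`, `CustomerID`) are illustrative, not from the tutorial:

```sql
-- Add a primary key to an imported table (names are illustrative)
ALTER TABLE dbo.Customers
    ADD CONSTRAINT PK_Customers PRIMARY KEY (CustomerID);

-- Add a foreign key referencing it; as the wizard also requires, the
-- referenced column must be covered by a primary key or unique index
ALTER TABLE dbo.Orders
    ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);
```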
How to Use a CONVERT Function in SQL Server

By
[dbForge Team](https://blog.devart.com/author/dbforge), September 12, 2022

When working with databases, data conversion is an important aspect. It is an efficient way to manage and organize data across different data types so that it can be synchronized on multiple platforms, allowing users to access, view, or modify data when required. In this article, you will learn how to use the CONVERT() function in SQL Server with syntax and examples, and we will also discuss the alternatives that exist to that function.

Contents

- What is a CONVERT function in SQL Server?
- SQL CONVERT function: Syntax
- Implicit and explicit data type conversion
- SQL CONVERT function example (date and time, numeric, and money data types)
- Use the CONVERT function in the WHERE clause
- Alternatives to the CONVERT function (CAST, FORMAT, and PARSE)

What is a CONVERT function in SQL Server?

SQL CONVERT explicitly converts an expression of one data type to another, with formatting. The function returns the converted value if the conversion succeeds, or returns an error if it fails. Depending on the SQL Server version you use, the CONVERT function works differently. In earlier SQL Server versions, such as 2005, 2008, and 2012, the CONVERT function returned the data type specified in the expression and returned NULL when the given data type argument was null. In later SQL Server versions, such as 2016, the function returns NULL if the third parameter is null. Next, we’ll see how to use the SQL CONVERT function and discuss its syntax, examples, and alternatives.

SQL CONVERT function: Syntax

Now, let’s start with the syntax of the CONVERT function and have a detailed look at each argument.
The syntax of the function is as follows:

```sql
CONVERT ( data_type [ ( length ) ] , expression [ , style ] )
```

The table below describes the arguments used in the CONVERT function.

| Argument | Description |
| --- | --- |
| data_type | A data type you want to get in the result. |
| length (optional) | An integer that specifies the length of the destination data type. |
| expression | A valid value to be converted. |
| style | An integer expression that instructs how the function will convert the expression. The specified data type defines the range of values for the style argument. For more information about style values, see the [Microsoft documentation](https://docs.microsoft.com/en-us/sql/t-sql/functions/cast-and-convert-transact-sql?view=sql-server-ver15#date-and-time-styles). |

Implicit and explicit data type conversion

Data types can be converted either implicitly or explicitly.

Implicit conversion occurs when data is converted from one data type to another automatically. In this case, you do not need to explicitly specify the CONVERT keyword in the query. Let’s compare the results of these two SQL queries.

```sql
-- the first query
SELECT
  p.ProductId
 ,p.ProductName
 ,p.Price
FROM Production.Product p
WHERE p.Price > 1000
ORDER BY p.Price ASC;

-- the second query
SELECT
  p.ProductId
 ,p.ProductName
 ,p.Price
FROM Production.Product p
WHERE p.Price > '1000'
ORDER BY p.Price ASC;
```

As you can see, both queries have the same output. It means that SQL Server has automatically converted '1000' as text from the second query into the integer data type.

Explicit conversion takes place when data is converted from one data type to another manually. It means that you need to explicitly specify the CONVERT keyword in your query. Let’s check this in the following SQL queries.
```sql
-- the first query
SELECT
  p.ProductId
 ,p.ProductName
 ,p.Price
FROM Production.Product p
ORDER BY p.Price ASC;

-- the second query with the explicitly specified CONVERT function
SELECT
  p.ProductId
 ,p.ProductName
 ,CONVERT (INT, p.Price) AS NewPrice
FROM Production.Product p
ORDER BY p.Price ASC;
```

The results of the queries differ. In the second query, the function has converted the decimal data type of the Price column into an integer data type, as specified in the query. In the Microsoft documentation, you can find the table that illustrates which data type conversions are done implicitly or explicitly.

Note: The table has been taken from the official [Microsoft documentation](https://docs.microsoft.com/en-us/sql/t-sql/functions/cast-and-convert-transact-sql?view=sql-server-ver15#implicit-conversions).

SQL CONVERT function example

The next step is to describe how the SQL CONVERT function works. For demo purposes, we are going to convert the following data types:

- Date
- Numeric
- Money

Example #1: Convert date and time data types

This example illustrates how to convert the current database system date and time into several formats. The GETDATE() function will be used to retrieve the current date and time. So, execute the following SELECT statement:

```sql
SELECT GETDATE() AS "current_date",
       CONVERT(VARCHAR, GETDATE(), 0)   AS current_date_0,
       CONVERT(VARCHAR, GETDATE(), 104) AS current_date_104,
       CONVERT(VARCHAR, GETDATE(), 110) AS current_date_110,
       CONVERT(VARCHAR, GETDATE(), 113) AS current_date_113,
       CONVERT(VARCHAR, GETDATE(), 120) AS current_date_120;
```

Note: You can view the full list of date and time styles in the [Microsoft documentation](https://docs.microsoft.com/en-us/sql/t-sql/functions/cast-and-convert-transact-sql?view=sql-server-ver15#date-and-time-styles).

In the output, the SELECT statement returned the date in different formats according to the specified styles.
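Since the original result grid is not reproduced here, the styles used above correspond to the following layouts. The mapping comments are a sketch based on SQL Server's documented style codes; actual values depend on the current date:

```sql
-- Layouts produced by CONVERT(VARCHAR, GETDATE(), style):
--   0   : mon dd yyyy hh:miAM        (default)
--   104 : dd.mm.yyyy                 (German)
--   110 : mm-dd-yyyy                 (USA)
--   113 : dd mon yyyy hh:mi:ss:mmm   (Europe default with milliseconds)
--   120 : yyyy-mm-dd hh:mi:ss        (ODBC canonical)
SELECT CONVERT(VARCHAR, GETDATE(), 104) AS german_style;
```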
Depending on the data type you use, the resulting data can be either truncated or rounded. Let’s demonstrate this in the examples with numeric and money values.

Example #2: Convert numeric data types

When the precision of a numeric value is too high, the output data can be rounded. For example, execute the following query:

```sql
SELECT CONVERT(NUMERIC(10,3), '123.456789');
```

In the result, the value has been rounded to 123.457. You can check a full list of data types that can be truncated or rounded in the [Microsoft documentation](https://docs.microsoft.com/en-us/sql/t-sql/functions/cast-and-convert-transact-sql?view=sql-server-ver15#truncating-and-rounding-results).

Example #3: Convert money data types

Suppose that the input data is $123.456789777, and you want to display only integer values in your table. To truncate the 0.456789777 cents, you can run the following query:

```sql
SELECT CONVERT(INT, ROUND(123.456789777, 0));
```

Use the CONVERT function in the WHERE clause

The CONVERT function can also be used in the WHERE clause in SQL Server. However, it is recommended that you do not wrap the column name in the function, because in that case an index assigned to this column won’t be used. Instead, apply the function to the value the column is compared against in the condition. Let’s convert a string specifying the date to a DATETIME value and filter the result by the ShippedDate column in the Sales.Order table. To do that, in the WHERE clause, we’ll use the CONVERT function in the condition as follows:

```sql
SELECT
  o.OrderId
 ,o.OrderDate
 ,o.ShippedDate
 ,o.StoreId
FROM Sales.[Order] o
WHERE o.ShippedDate > CONVERT(DATETIME, '30-Apr-21', 11)
ORDER BY o.ShippedDate ASC;
```

As you can see, the result has been filtered according to the specified condition in the WHERE clause.

Alternatives to the CONVERT function

When dealing with data types, you can use alternatives to the CONVERT function that may better suit your goals.
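To illustrate the indexing advice above, here is a short sketch contrasting a non-sargable filter, where the function wraps the indexed column, with the sargable form used in the article. It assumes the same Sales.[Order] table and an index on ShippedDate:

```sql
-- Non-sargable: CONVERT is applied to the ShippedDate column itself, so
-- SQL Server must evaluate it for every row and cannot seek the index
SELECT o.OrderId, o.ShippedDate
FROM Sales.[Order] o
WHERE CONVERT(VARCHAR(10), o.ShippedDate, 112) > '20210430';

-- Sargable: CONVERT is applied to the literal instead; the column is left
-- untouched, and an index on ShippedDate can be used for a seek
SELECT o.OrderId, o.ShippedDate
FROM Sales.[Order] o
WHERE o.ShippedDate > CONVERT(DATETIME, '30-Apr-21', 11);
```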
The alternatives to the CONVERT function include the following functions:

- CAST
- FORMAT
- PARSE

CAST function

The CAST function converts a value of any data type into the specified data type. The CAST function is similar to the CONVERT function, but there is a slight difference. As mentioned above, the CONVERT function lets you convert the data type and, at the same time, define how to do it using a style argument; the CAST function cannot do this. Also, the CAST function is part of the ANSI SQL standard, while the CONVERT function is specific to SQL Server. Consequently, if you work with different database management systems, CAST is the portable choice.

The syntax of the CAST function is as follows:

```sql
CAST(expression AS data_type(length))
```

The table below describes the arguments used in the CAST function.

| Argument | Description |
| --- | --- |
| expression | A valid value to be converted. |
| data_type | A data type of the expression you want to get in the result. |
| length (optional) | An integer that specifies the length of the target data type. |

Let’s see in an example how the CAST function works. In the output, you can see the products whose price, converted to an integer value, starts with the digits 25.

FORMAT function

The FORMAT function returns a string value in the specified data type. Preferably, it is used to format date/time and numeric string values.

The syntax of the FORMAT function is the following:

```sql
FORMAT (value, format [, culture])
```

The table below describes the arguments used in the FORMAT function.

| Argument | Description |
| --- | --- |
| value | A valid value to be converted. |
| format | The NVARCHAR format of the value you want to get in the result. The argument should include a valid .NET format string in the NVARCHAR data type. |
| culture (optional) | A string that specifies the culture of the target data type. Keep in mind that the culture must be supported by the .NET Framework; otherwise, an error occurs. |
In the working example, we’ll format the value of the OrderDate column in the Sales.Order table. As you can see, the query returns the formatted date in the Great Britain English date format dd/MM/yyyy.

PARSE function

The PARSE function returns a string value of an expression in either a date/time or numeric data type.

The syntax of the PARSE function is as follows:

```sql
PARSE ( string_value AS data_type [ USING culture ] )
```

The table below describes the arguments used in the PARSE function.

| Argument | Description |
| --- | --- |
| string_value | A valid NVARCHAR value to be parsed. If it is invalid, an error occurs. |
| data_type | A data type that you want to get in the result. You can use the [Microsoft documentation](https://docs.microsoft.com/en-us/sql/t-sql/functions/parse-transact-sql?view=sql-server-ver16#return-types) to view the supported data_type parameters along with styles. |
| culture (optional) | A string that specifies the culture into which the target data type is parsed. You can use any valid culture supported by the .NET Framework; otherwise, the function fails. |

In the example below, we have converted the string value 31-Mar-17 from the Sales.Order table and applied the DATETIME2 data type to the value.

Comparison of the CONVERT, CAST, FORMAT, and PARSE functions

Here is a short comparison table of the CONVERT, CAST, FORMAT, and PARSE functions.
| Item to be compared | CONVERT | CAST | FORMAT | PARSE |
| --- | --- | --- | --- | --- |
| Argument | SQL expression | SQL expression | SQL expression | string |
| Target value | Specified by an argument | Specified by an argument | Specified by an argument | Specified by an argument |
| Style/Culture | Yes (style) | No | Yes (culture) | Yes (culture) |
| Supported data types for conversion | Any | Any | Any | From a string to date/time and numeric values |
| Servers in which they work | SQL Server (starting with 2008), Azure SQL Database, Azure SQL Data Warehouse, Azure SQL Managed Instance, Azure Synapse Analytics, Analytics Platform System (PDW) | SQL Server (starting with 2008), Azure SQL Database, Azure SQL Data Warehouse, Azure SQL Managed Instance, Azure Synapse Analytics, Analytics Platform System (PDW) | SQL Server (starting with 2012), Azure SQL Database, Azure SQL Managed Instance, Azure Synapse Analytics | SQL Server (all supported versions), Azure SQL Database, Azure SQL Managed Instance |

To demonstrate the examples with the CONVERT function, we used one of the best tools for database development and administration – [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). Though there are plenty of similar tools, dbForge Studio stands out. This IDE makes database development easier and much more convenient. dbForge Studio is designed not only to simplify and optimize your database administration, testing, and deployment, but also to keep you productive and efficient while performing routine tasks.

Conclusion

In this article, we have reviewed how to use the CONVERT function in SQL Server and discussed possible alternatives. We have also seen that converting data from one data type to another with dbForge Studio for SQL Server is simple and enjoyable. Want to know more about other cutting-edge functionalities of dbForge Studio for SQL Server? Or want to see the tool in action? Do not hesitate to [download](https://www.devart.com/dbforge/sql/studio/download.html) a free 30-day trial version of the Studio tool!
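As a quick recap of the comparison above, the four functions can be applied to the same date value side by side. This is a hedged sketch; the literal value and the en-gb culture are illustrative choices, not from the article:

```sql
DECLARE @d DATETIME2 = '2021-04-30';

SELECT
  CONVERT(VARCHAR(10), @d, 104)                 AS with_convert, -- style 104: dd.mm.yyyy
  CAST(@d AS DATE)                              AS with_cast,    -- no style argument available
  FORMAT(@d, 'dd/MM/yyyy')                      AS with_format,  -- .NET format string
  PARSE('30-Apr-21' AS DATETIME2 USING 'en-gb') AS with_parse;   -- culture-aware parsing
```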
Oracle COUNT Function: From Basics
to Advanced

By [Nataly Smith](https://blog.devart.com/author/nataly-smith), September 5, 2024

Oracle is one of the most widely used database management systems for a reason. It provides an extensive set of tools that allow you to accomplish even the most intricate tasks. Today, we will focus on the Oracle COUNT function, exploring what it is, how it works, and its structure, and addressing common misconceptions and best practices. [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) will serve as our trusted wingman, supporting us throughout this process.

Contents: Understanding the COUNT function, COUNT function variants, Practical usage of the COUNT function in Oracle, COUNT with different clauses, Common misconceptions, Best practices and tips, Conclusion

Understanding the COUNT function

Our first stop will traditionally be the essentials: the definition and basic syntax. The Oracle COUNT function counts the number of rows in a table or the number of times a specific value appears. It provides a way to find out how many items meet certain conditions in your data. The syntax of the function looks like this:

```sql
COUNT({ * | [ DISTINCT | ALL ] expression }) [ OVER (analytic_clause) ]
```

Note: Even though this article focuses on Oracle, you can also find information about the [COUNT function in MySQL](https://blog.devart.com/mysql-count.html) and the [COUNT function in SQL Server](https://www.devart.com/dbforge/sql/studio/sql-server-count-function.html) on our blog.

COUNT function variants

Let’s break down the syntax into smaller parts. COUNT accepts a clause which can be either *, DISTINCT, or ALL:

- * returns the number of items in a group, including NULL and duplicate values.
- DISTINCT expression returns the number of unique and NOT NULL items in a group.
- ALL expression evaluates the expression and returns the number of NOT NULL items in a group, including duplicate values.
If you do not explicitly specify the parameter, the function uses ALL by default.

Note: COUNT(*) never ignores NULL values, whereas COUNT(expression), like other aggregate functions such as AVG or SUM, skips rows where the expression is NULL.

Practical usage of the COUNT function in Oracle

Now that the express tour through the theoretical part of the article is over, it's time for the fun part: how do we apply the COUNT function in the real world? In this article, we will be using dbForge Studio for Oracle, a powerful integrated development environment (IDE) that helps Oracle developers increase their PL/SQL coding speed and provides versatile data editing tools for managing in-database and external data. With dbForge Studio for Oracle, you benefit from many features tailored specifically to streamline your workflow. The [smart PL/SQL Formatter](https://www.devart.com/dbforge/oracle/studio/plsql-formatter.html) ensures code readability and adherence to best practices, while code completion expedites the coding process by suggesting context-sensitive options. [Code snippets](https://www.devart.com/dbforge/oracle/studio/plsql-developer-tools.html) provide reusable code templates, simplifying the creation of common code structures, and the [SQL Editor](https://www.devart.com/dbforge/oracle/studio/oracle-sql-editor.html) offers a comprehensive environment for writing, testing, and debugging SQL queries. These features collectively enhance productivity, reduce development time, and facilitate seamless database management.
Basic usage

Before we begin, we need to prepare the playground for our experiments by creating a new films table and filling it with some test data:

```sql
CREATE TABLE films(val VARCHAR2(50));
INSERT INTO films(val) VALUES('The First Film');
INSERT INTO films(val) VALUES('The First Film');
INSERT INTO films(val) VALUES('The Second Film');
INSERT INTO films(val) VALUES(NULL);
INSERT INTO films(val) VALUES('The Third Film');
INSERT INTO films(val) VALUES(NULL);
INSERT INTO films(val) VALUES('The Fourth Film');
```

To see the results of our hard work, execute the following SELECT statement:

```sql
SELECT * FROM films;
```

Now, let's count all the rows in the films table as an example of basic COUNT usage:

```sql
SELECT COUNT(*) FROM films;
```

As expected, we get 7 as a result, since there are 7 rows in the table. Next, let's count only the NOT NULL values:

```sql
SELECT COUNT(ALL val) FROM films;
```

Here, you will see 5 in the output, as 2 of the 7 rows in the films table are NULL.

Advanced usage

Moving on to examples that are a bit more complicated than the previous ones. In this section, we explore how COUNT behaves both as an aggregate and as an analytic function.

Aggregate usage

In the basic examples above, you already saw COUNT operate as an aggregate function, where multiple values are processed together to form a single summary statistic. It allows you to count the total number of rows in your table, the number of NULL, NOT NULL, or unique values, and so on. However, there are more applications for COUNT out there.

Analytic usage

Let's say you want to count how many times each film title appears in the table, but instead of grouping the results, you want to keep the original rows and include the count alongside each row:

```sql
SELECT val,
       COUNT(val) OVER (PARTITION BY val ORDER BY val) AS FILM_COUNT
FROM films
ORDER BY val;
```

In the query above:

- PARTITION BY divides the result set into partitions.
- Each partition contains rows with the same film title, and COUNT is applied within each partition.
- ORDER BY orders the rows within each partition; in this case, it doesn't affect the counting, since all rows in a partition are identical.
- FILM_COUNT shows how many times each film title appears in the table, without aggregating or collapsing the rows.

COUNT with different clauses

This section explores how the COUNT function interacts with the DISTINCT, WHERE, JOIN, and HAVING clauses and how to group the results we get from all those manipulations.

COUNT with DISTINCT

For this example, we add the DISTINCT clause to the equation:

```sql
SELECT COUNT(DISTINCT val) FROM films;
```

The result is 4: the two NULL rows are ignored, and one film title is duplicated.

COUNT with WHERE

Here, the WHERE clause restricts COUNT to rows where val is NULL, so the query counts only the empty entries:

```sql
SELECT COUNT(*) AS EMPTY_FILMS
FROM films
WHERE val IS NULL;
```

COUNT with JOIN

We need another table besides films to demonstrate how COUNT and JOIN work together. Let it be actor:

```sql
SELECT F.VAL,
       COUNT(A.ACTOR_ID) AS ACTORS_COUNT
FROM FILMS F
LEFT JOIN ACTOR A
  ON F.VAL = A.FILM_TITLE
GROUP BY F.VAL;
```

This query counts the number of actors associated with each film title, using a LEFT JOIN to include all films, even those without any associated actors.

Grouping COUNT results with HAVING

The HAVING clause filters groups after they have been formed with GROUP BY, allowing you to apply a condition to the aggregate results:

```sql
SELECT VAL,
       COUNT(*) AS COUNT_PER_FILM
FROM FILMS
GROUP BY VAL
HAVING COUNT(*) > 1;
```

The films are grouped by title, and the occurrences of each one are counted. The HAVING COUNT(*) > 1 clause filters the results to include only film titles that appear more than once.
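The Oracle examples above rely only on standard COUNT semantics, so they can be sanity-checked on any SQL engine. A minimal sketch using Python's built-in sqlite3 module (SQLite rather than Oracle, but the NULL-handling and grouping behavior exercised here is the same):

```python
import sqlite3

# Rebuild the films table from the article in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE films (val TEXT)")
conn.executemany(
    "INSERT INTO films (val) VALUES (?)",
    [("The First Film",), ("The First Film",), ("The Second Film",),
     (None,), ("The Third Film",), (None,), ("The Fourth Film",)],
)

total = conn.execute("SELECT COUNT(*) FROM films").fetchone()[0]
non_null = conn.execute("SELECT COUNT(val) FROM films").fetchone()[0]
distinct = conn.execute("SELECT COUNT(DISTINCT val) FROM films").fetchone()[0]
empty = conn.execute("SELECT COUNT(*) FROM films WHERE val IS NULL").fetchone()[0]
# NULLs form their own group under GROUP BY, so exclude them to list
# only duplicated titles, as in the article's HAVING example.
duplicates = conn.execute(
    "SELECT val FROM films GROUP BY val "
    "HAVING COUNT(*) > 1 AND val IS NOT NULL"
).fetchall()

# 7 rows in total; 5 non-NULL; 4 distinct titles; 2 NULL rows;
# only 'The First Film' appears more than once.
print(total, non_null, distinct, empty, duplicates)
```

Note that SQLite accepts `COUNT(val)` but not the `ALL` keyword; in Oracle, `COUNT(val)` and `COUNT(ALL val)` are equivalent.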
Common misconceptions

Misconceptions about the COUNT function in Oracle are surprisingly common and can lead to inefficient queries and inaccurate results. To avoid these pitfalls, it's essential to understand and address the most prevalent misunderstandings:

| Misconception | Reality | Example |
| --- | --- | --- |
| COUNT(column_name) includes NULLs | COUNT(column_name) only counts rows where the specified column is not NULL. A common mistake is expecting it to count all rows, including those with NULL. | If you have a table with 7 rows and 2 rows contain NULL in the specified column, the query will return 5, not 7. |
| COUNT(1) is faster than COUNT(*) | A popular myth suggests that COUNT(1) is faster because it counts a constant value (1) rather than all columns. In fact, there's no performance difference between the two, because Oracle optimizes both the same way. | Whether you use COUNT(1) or COUNT(*), Oracle performs the same internal row-counting operation, so choose based on readability rather than performance. |
| COUNT(*) and COUNT(column_name) are interchangeable | COUNT(*) counts all rows, regardless of NULL values in any column, while COUNT(column_name) only counts rows where column_name is not NULL. | If a column has NULL values, COUNT(column_name) will yield a lower count than COUNT(*). |
| COUNT(DISTINCT column_name) is always efficient | COUNT(DISTINCT column_name) can be resource-intensive, particularly on large datasets, as it requires sorting or hashing to remove duplicates before counting. This operation can be slow without proper indexing or when used excessively. | For large datasets, using COUNT(DISTINCT column_name) without indexes on the column can lead to performance degradation. Consider approximate functions like APPROX_COUNT_DISTINCT for quicker results when exact precision is not necessary. |
| COUNT ignores performance considerations | When used with complex JOINs or GROUP BY clauses, COUNT can lead to slow query performance if not optimized with indexing and careful query structuring. Many users assume that simply adding COUNT won't significantly impact performance, which isn't always true. | Counting results from a large, joined dataset without indexes can result in long execution times. Ensuring that join columns and grouping keys are indexed is crucial for maintaining performance. |

Best practices and tips

Keeping in mind the misconceptions we mentioned in the previous section, here are some tips to help you avoid them and get a few steps closer to perfecting your Oracle database:

- Don't fall for the myth that COUNT(1) is inherently faster than COUNT(*); Oracle treats them the same.
- Use DISTINCT judiciously, and be aware of its potential performance impact on large datasets.
- Be mindful of how JOINs and groupings can affect performance, and optimize with indexes where possible.
- Always test and validate the behavior of COUNT in your queries to avoid unexpected results or performance issues.
- Understand how COUNT handles NULL values, especially when counting specific columns versus all rows.

Conclusion

The Oracle COUNT function is a versatile tool capable of functioning as both an aggregate and an analytic function. This article discussed its practical applications and how it interacts with various clauses, such as DISTINCT, WHERE, JOIN, and HAVING. We have also provided tips and best practices for handling some common misconceptions. For all these tasks and more, dbForge Studio for Oracle is an invaluable asset. It offers a robust set of tools to streamline your database routine. Experience its power firsthand by [downloading a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html) and see how it can elevate your Oracle database management to the next level.
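The first three misconceptions above are easy to verify empirically. SQLite shares Oracle's NULL-counting semantics, so a quick in-memory check (a sketch, not an Oracle session) reusing the films data from earlier confirms them:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE films (val TEXT)")
conn.executemany(
    "INSERT INTO films (val) VALUES (?)",
    [("The First Film",), ("The First Film",), ("The Second Film",),
     (None,), ("The Third Film",), (None,), ("The Fourth Film",)],
)

count_star, count_one, count_col = conn.execute(
    "SELECT COUNT(*), COUNT(1), COUNT(val) FROM films"
).fetchone()

# COUNT(*) and COUNT(1) both count every row, NULLs included;
# COUNT(val) skips the two NULL rows.
print(count_star, count_one, count_col)  # 7 7 5
```

COUNT(*) and COUNT(1) return the same result; the claimed performance difference between them is about the optimizer, not the answer, and per the table above Oracle treats them identically there as well.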
Tags [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle](https://blog.devart.com/tag/oracle) [Oracle Tutorial](https://blog.devart.com/tag/oracle-tutorial) By [Nataly Smith](https://blog.devart.com/author/nataly-smith)

[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Create and Connect to an Azure SQL Database on Heroku: A Step-By-Step Guide By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) September 4, 2023
In the digital world of today, characterized by rapid technological advancements and a multitude of innovative solutions, the seamless integration of diverse platforms is paramount for efficient application development. This guide delves into the process of creating and connecting an Azure SQL Database on the Heroku platform, offering step-by-step instructions and insights to help you bridge the gap between two powerful solutions. In this article, we will demonstrate how to create an Azure SQL Database on the Heroku platform and then establish a remote connection to it using the powerful SQL Server client tool, dbForge Studio for SQL Server. Join us as we navigate the intricacies of setting up a connection between Azure SQL Database and Heroku, unlocking new possibilities for your projects.

Contents
- What is Heroku?
- Supported databases on Heroku
- How to create and connect to an Azure SQL Database on Heroku
- Step 1: Create a new app on Heroku
- Step 2: Add the MSSQL add-on
- Step 3: Create Azure SQL database on Heroku
- Step 4: Connect to the Azure SQL database on Heroku
- Conclusion

What is Heroku?

Heroku is a cloud platform that provides a Platform as a Service (PaaS) to help businesses deploy, manage, and scale applications. It allows developers to focus on writing code without worrying about infrastructure management. Heroku supports various programming languages and frameworks, making it versatile for different types of applications. Users can deploy web applications, APIs, microservices, and other software projects easily using Heroku's streamlined deployment process. It abstracts away much of the underlying infrastructure complexity, allowing for quick deployment and management of applications in a scalable and efficient manner.
Supported databases on Heroku

Heroku offers three primary managed data services: Heroku Postgres, Heroku Redis, and Apache Kafka on Heroku. In addition, Heroku provides a diverse selection of integrations and add-ons that allow you to customize your application's database solutions. This enables you to seamlessly integrate various databases, including Amazon RDS, Azure SQL Server, and ClearDB, into your Heroku app according to your specific requirements.

How to create and connect to an Azure SQL Database on Heroku

Prerequisites:
- [Active Heroku account](https://signup.heroku.com/)
- Installed dbForge Studio for SQL Server

Step 1: Create a new app on Heroku

Log in to your Heroku account and navigate to the dashboard. Then click Create new app. Next, provide a name for the new app and select a region where your app server is located. Once done, click Create app.

Note: The app name must be unique and composed solely of lowercase letters, numbers, or dashes.

Step 2: Add the MSSQL add-on

[Softtrends MSSQL](https://elements.heroku.com/addons/mssql) allows you to seamlessly integrate a high-performing Azure SQL database into your Heroku application. With the add-on, you can provision both single Azure SQL databases and database pools, manage database credentials, and build applications on Heroku using Azure SQL. To add the MSSQL add-on to your Heroku application:

1. Enter your application and select Configure add-ons.
2. In the field under Add-ons, start typing MSSQL and select the corresponding add-on once it becomes available.
3. In the dialog that appears, select the pricing plan and click Submit Order Form. This will add MSSQL to your application.

Step 3: Create Azure SQL database on Heroku

After you have installed the MSSQL add-on, you'll find it listed under the Installed add-ons section within your application. Simply click it to access the add-on's dashboard. Next, navigate to Administration -> Add New Database.
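In Step 4 below, the credentials exposed on the add-on's View Config page are pasted into dbForge Studio as a connection string. As a preview, the sketch shows how such an ADO.NET-style string fits together; every value in it (server, database, user, password) is a placeholder invented for illustration, not real Heroku output, and the exact keys your add-on emits may differ:

```python
# Hypothetical helper: assembles an ADO.NET-style connection string of the
# kind accepted on dbForge Studio's Advanced tab. All values are placeholders.
def build_connection_string(server: str, database: str,
                            user: str, password: str) -> str:
    return (
        f"Server={server};Database={database};"
        f"User Id={user};Password={password};Encrypt=True;"
    )

conn_str = build_connection_string(
    server="example.database.windows.net",
    database="heroku_db",
    user="db_admin",
    password="placeholder-password",
)
print(conn_str)
```

In practice you would copy the ready-made string from View Config rather than assemble it by hand; the helper only illustrates the shape of what you are pasting.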
On the Create New Azure Sql Database page, provide the new database name, user ID, and password. Once done, click Create Database. The process takes a few minutes to complete, after which the newly created database appears in the list of databases.

Step 4: Connect to the Azure SQL database on Heroku

With the Azure SQL database now hosted on Heroku, let's proceed to establish an external connection to it. For this, we'll use [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), an all-in-one IDE known for its robust SQL Server development, management, and administration features. To establish a connection with the Azure SQL database we have just created, you need the database credentials. You can find them by clicking View Config for the specific database in the list of databases. Now, let's open dbForge Studio for SQL Server and establish a connection:

1. Navigate to Database -> New Connection.
2. In the Database Connection Properties dialog that opens, go to the Advanced tab.
3. In the designated field, enter the connection string details. You can copy the connection string from the View Config page for your Azure SQL database on Heroku.
4. To verify the settings, click Test Connection; once done, click Connect.

Once successfully connected to the Azure SQL database hosted on Heroku, it will appear in the Database Explorer. Now, you can comfortably proceed with your tasks within dbForge Studio, an IDE equipped with [all the essential tools](https://docs.devart.com/studio-for-sql-server/) for seamless work with SQL Server databases. Learn how to perform an [Azure SQL database export](https://blog.devart.com/export-azure-sql-database.html) in this article.

Conclusion

In the modern, rapidly evolving tech world, staying ahead means adopting new solutions. The synergy of Azure SQL and Heroku will enable your applications to flourish within a robust, scalable, and efficient environment.
To continue your exploration and supercharge your SQL Server experiences, we invite you to download a [30-day free trial of dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html). This exceptional tool will boost your capabilities, giving you an advantage in the dynamic field of application development.

Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial)

By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander): Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.
[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Exploring ChatGPT's Capabilities in Creating and Optimizing SQL Queries for Oracle By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) April 19, 2023

SQL (Structured Query Language) is a widely used programming language for managing and manipulating data stored in relational databases. As the need for data analysis and database management continues to grow, an increasing number of job roles demand SQL proficiency. However, grasping SQL queries can be daunting, particularly for novices. Fortunately, cutting-edge technology such as the ChatGPT-4 language model can help you master SQL queries. In this article, we will delve into how ChatGPT-4 can aid in creating SQL queries and assessing their performance. By using ChatGPT-4, you can not only enhance your SQL query writing skills but also leverage its capabilities to evaluate and optimize query performance.
Contents
- ChatGPT-4 vs ChatGPT-3
- Solve SQL knowledge tasks with ChatGPT-4
- Convert MySQL syntax to Oracle
- Run the generated query with dbForge Studio for Oracle
- Generate query solutions
- Task 1: Return a list of middle managers
- Task 2: Return the id of managers who have more than two employees
- Task 3: Return the department with the highest average salary
- Task 4: Return the name of the branch's head, the location of the branch, and the total salary for the branch
- Analyze query performance
- Check the AI-generated queries with Query Profiler
- Conclusion

ChatGPT-4 vs ChatGPT-3

As an AI language model, ChatGPT is constantly evolving through new iterations and improvements. The latest version, ChatGPT-4, is more advanced than its predecessor, ChatGPT-3, in several ways. One of the key differences between the two is the model size. ChatGPT-4 is larger than ChatGPT-3, with more parameters, which enables it to understand users better and generate more complex and nuanced responses. Additionally, ChatGPT-4 has been trained on a more extensive and diverse dataset than ChatGPT-3, allowing it to have a better contextual understanding and more accurate responses to a broader range of topics. Moreover, ChatGPT-4 exhibits improved performance in terms of accuracy, response generation, and comprehension of context due to enhancements in architecture, training techniques, and optimization. It has also been fine-tuned on a more diverse set of tasks and prompts, which further enhances its ability to handle different types of queries and provide accurate and contextually relevant responses. Although ChatGPT-4 has its own set of limitations, it is designed to address some of the shortcomings of ChatGPT-3 and may have better control over its verbosity and relevance in responses. Overall, the advancements in ChatGPT-4 make it a powerful tool for improving language skills and providing accurate and relevant responses to a wide range of queries.
Solve SQL knowledge tasks with ChatGPT-4

In this article, we will showcase how ChatGPT-4 can be a valuable tool for SQL developers, DBAs, and data analysts by demonstrating its ability to efficiently solve common SQL knowledge tests encountered in job interviews, courses, and other contexts. We will also evaluate the AI's ability to optimize queries and compare its predictions with the actual results obtained using the industry-leading tool, [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/).

Convert MySQL syntax to Oracle

The first SQL conversion task we have for ChatGPT-4 is to convert MySQL syntax to [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/). Let's see how well the AI can handle this challenge. The source script is the following:

```sql
CREATE TABLE dept (
  deptno decimal(2, 0) DEFAULT NULL COMMENT 'Department number',
  dname varchar(14) DEFAULT NULL COMMENT 'Department name',
  loc varchar(13) DEFAULT NULL COMMENT 'Department location'
)
COMMENT = 'Company departments';

ALTER TABLE dept
ADD UNIQUE INDEX UK_dept_deptno (deptno);

-- Create table emp
CREATE TABLE emp (
  empno decimal(4, 0) NOT NULL COMMENT 'Employee identification number',
  ename varchar(10) DEFAULT NULL COMMENT 'Last name',
  job varchar(9) DEFAULT NULL COMMENT 'Position',
  mgr decimal(4, 0) DEFAULT NULL COMMENT 'Manager identification number',
  hiredate date DEFAULT NULL COMMENT 'Hire date',
  sal decimal(7, 2) DEFAULT NULL COMMENT 'Salary',
  comm decimal(7, 2) DEFAULT NULL COMMENT 'Comment',
  deptno decimal(2, 0) DEFAULT NULL COMMENT 'Department number'
)
COMMENT = 'Employees';

ALTER TABLE emp
ADD UNIQUE INDEX UK_emp_empno (empno);

-- Create foreign key
ALTER TABLE emp
ADD CONSTRAINT FK_emp_deptno FOREIGN KEY (deptno)
REFERENCES dept (deptno) ON DELETE NO ACTION;

INSERT INTO dept VALUES ('10','ACCOUNTING','NEW YORK');
INSERT INTO dept VALUES ('20','RESEARCH','DALLAS');
INSERT INTO dept VALUES ('30','SALES','CHICAGO');
INSERT INTO dept VALUES ('40','OPERATIONS','BOSTON');

INSERT INTO emp VALUES ('7369','SMITH','CLERK','7902','1980-12-17','800.00',NULL,'20');
INSERT INTO emp VALUES ('7499','ALLEN','SALESMAN','7698','1981-02-20','1600.00','300.00','30');
INSERT INTO emp VALUES ('7521','WARD','SALESMAN','7698','1981-02-22','1250.00','500.00','30');
INSERT INTO emp VALUES ('7566','JONES','MANAGER','7839','1981-04-02','2975.00',NULL,'20');
INSERT INTO emp VALUES ('7654','MARTIN','SALESMAN','7698','1981-09-28','1250.00','1400.00','30');
INSERT INTO emp VALUES ('7698','BLAKE','MANAGER','7839','1981-05-01','2850.00',NULL,'30');
INSERT INTO emp VALUES ('7782','CLARK','MANAGER','7839','1981-06-09','2450.00',NULL,'10');
INSERT INTO emp VALUES ('7788','SCOTT','ANALYST','7566','1982-12-09','3000.00',NULL,'20');
INSERT INTO emp VALUES ('7839','KING','PRESIDENT',NULL,'1981-11-17','5000.00',NULL,'10');
INSERT INTO emp VALUES ('7844','TURNER','SALESMAN','7698','1981-09-08','1500.00','0.00','30');
INSERT INTO emp VALUES ('7876','ADAMS','CLERK','7788','1983-01-12','1100.00',NULL,'20');
INSERT INTO emp VALUES ('7900','JAMES','CLERK','7698','1981-12-03','950.00',NULL,'30');
INSERT INTO emp VALUES ('7902','FORD','ANALYST','7566','1981-12-03','3000.00',NULL,'20');
INSERT INTO emp VALUES ('7934','MILLER','CLERK','7782','1982-01-23','1300.00',NULL,'10');

ALTER TABLE emp
ADD CONSTRAINT FK_emp_mgr FOREIGN KEY (mgr)
REFERENCES emp (empno) ON DELETE NO ACTION;
```

Let's ask ChatGPT-4 for help. That's the result we got. Please note that ChatGPT-4 has made the following conversions: decimal(4, 0) to NUMBER(4, 0), varchar(10) to VARCHAR2(10), '1981-02-20' to TO_DATE('1981-02-20', 'YYYY-MM-DD'), and ALTER TABLE emp ADD UNIQUE INDEX UK_emp_empno (empno) to CREATE UNIQUE INDEX UK_EMP_EMPNO ON EMP(EMPNO).
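The MySQL script above attaches comments to the table and its columns inline; Oracle's CREATE TABLE has no such clause, so a converted script needs separate COMMENT statements. A short sketch for the dept table, reusing the comment texts from the script (the emp table would be handled the same way):

```sql
COMMENT ON TABLE dept IS 'Company departments';
COMMENT ON COLUMN dept.deptno IS 'Department number';
COMMENT ON COLUMN dept.dname IS 'Department name';
COMMENT ON COLUMN dept.loc IS 'Department location';
```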
However, it should be noted that ChatGPT-4 did not restore comments to the table and columns, even though Oracle has a special [COMMENT statement](https://docs.oracle.com/en/database/oracle/oracle-database/19/sqlrf/COMMENT.html) . It’s important to mention that when using ChatGPT-4 to convert MySQL scripts to Oracle, it may not always produce the correct result on the first try. To achieve the desired output, you may need to provide additional guidance such as specifying the Oracle version or which constraints to use. Run the generated query with dbForge Studio for Oracle Let us now validate the query using dbForge Studio for Oracle – a comprehensive solution for developing, managing, and maintaining Oracle databases. To get a deeper understanding of the capabilities and benefits of dbForge Studio for Oracle, take a look at our [recent release notes blog post](https://blog.devart.com/here-comes-the-great-big-update-of-dbforge-tools-for-oracle.html) . Whether you’re a developer, DBA, or data analyst, dbForge Studio for Oracle can help you streamline your workflow and optimize your Oracle database management. The query has been executed successfully, as evidenced by the results. Generate query solutions Let’s now assess ChatGPT-4’s SQL proficiency by using that Oracle script to task the AI with writing SQL queries. Afterward, we will verify the accuracy and effectiveness of the provided queries by utilizing dbForge Studio for Oracle to ensure that they return the desired result. Task 1: Return a list of middle managers Suppose that we need to retrieve a list of mid-level employees from the database. A mid-level employee is an employee who has subordinates and is himself a subordinate. Let’s ask ChatGPT-4 to provide two variants of the query for this. The following is what it suggested. Let us now run those queries in dbForge Studio for Oracle to see if they are valid. Both queries have been executed successfully and returned the same result. This is truly impressive! 
Let’s proceed to further explore ChatGPT-4’s capabilities. Task 2: Return the id of managers who have more than two employees Now, let us ask ChatGPT-4 to write two variants of a query to return the identification number of managers who have more than two employees. Below is what we got. Great! Now, let’s execute the queries in dbForge Studio for Oracle to verify if they produce the desired results, just like we did for the previous task. This will help us ensure that the queries are functional and reliable. And again the result is perfect. The queries have been executed successfully and returned the output we expected. Shall we move forward? Task 3: Return the department with the highest average salary Suppose we want to find out which department has the highest average salary. Let’s request the AI to generate an SQL query that can retrieve the name of the branch with the highest average salary and the amount of that salary. As before, we will ask the AI to provide two different solutions to this task, which will help us explore different approaches to the problem. Below are the queries that the AI has created for us. Shall we validate the queries? Let’s execute them in dbForge Studio for Oracle to verify their accuracy. Task 4: Return the name of the branch’s head, the location of the branch, and the total salary for the branch Let’s ask ChatGPT-4 to create an SQL query to retrieve the name of the head of a branch, the location of the branch, and the total salary of their direct subordinates. In other words, we want the output to display the name of the head, the location of the branch, and the total salary for the branch. For this task, the head is defined as an employee who reports directly to the president of the company. Once again, let’s check the queries the AI has created in dbForge Studio, to make sure they are working properly. 
Analyze query performance

Having checked ChatGPT-4's query-building powers, let's now test its query optimization capabilities by asking the AI to determine which of the two queries it provided will be the fastest. As you remember, in Task 1 we asked the AI to create two variants of the query to return a list of middle managers. The following are the queries it created.

Task 1: Variant 1

```sql
SELECT DISTINCT e1.ENAME
FROM EMP e1
WHERE e1.MGR IS NOT NULL
  AND EXISTS (
    SELECT 1
    FROM EMP e2
    WHERE e1.EMPNO = e2.MGR
      AND e2.EMPNO IS NOT NULL
  )
ORDER BY e1.ENAME;
```

Task 1: Variant 2

```sql
SELECT E1.ENAME
FROM EMP E1
JOIN EMP E2
  ON E1.EMPNO = E2.MGR
WHERE E1.MGR IS NOT NULL
GROUP BY E1.EMPNO,
         E1.ENAME
HAVING COUNT(*) > 0
ORDER BY E1.ENAME;
```

Let us ask ChatGPT-4 which query is more optimized. In the second task, we asked the AI to create two variants of the query to return the identification number of managers who have more than two employees. Below are the queries it has written for us.

Task 2: Variant 1

```sql
SELECT e1.EMPNO
FROM EMP e1
WHERE e1.EMPNO IN (
    SELECT e2.MGR
    FROM EMP e2
    WHERE e2.MGR IS NOT NULL
    GROUP BY e2.MGR
    HAVING COUNT(*) > 2
)
AND e1.MGR IS NOT NULL
ORDER BY e1.EMPNO;
```

Task 2: Variant 2

```sql
SELECT DISTINCT e1.EMPNO
FROM EMP e1
JOIN (
    SELECT e2.MGR, COUNT(*) AS NUM_EMPLOYEES
    FROM EMP e2
    WHERE e2.MGR IS NOT NULL
    GROUP BY e2.MGR
) mgr_counts ON e1.EMPNO = mgr_counts.MGR
WHERE e1.MGR IS NOT NULL
AND mgr_counts.NUM_EMPLOYEES > 2
ORDER BY e1.EMPNO;
```

Let us ask ChatGPT-4 which query is the fastest. In Task 3, we wanted two solutions to the query that returns the department with the highest average salary and the amount of that salary. Below are the queries that the AI has provided us with.
Task 3: Variant 1

```sql
SELECT d.DNAME AS branch_name, AVG(e.SAL) AS avg_salary
FROM DEPT d
JOIN EMP e ON d.DEPTNO = e.DEPTNO
GROUP BY d.DNAME
HAVING AVG(e.SAL) = (
    SELECT MAX(avg_salaries)
    FROM (
        SELECT AVG(SAL) AS avg_salaries
        FROM EMP
        GROUP BY DEPTNO
    )
)
ORDER BY avg_salary DESC
FETCH FIRST 1 ROW ONLY;
```

Task 3: Variant 2

```sql
SELECT d.DNAME AS branch_name, AVG(e.SAL) AS avg_salary
FROM DEPT d
JOIN EMP e ON d.DEPTNO = e.DEPTNO
WHERE d.DEPTNO IN (
    SELECT DEPTNO
    FROM (
        SELECT DEPTNO, AVG(SAL) AS avg_salary
        FROM EMP
        GROUP BY DEPTNO
        ORDER BY AVG(SAL) DESC
    )
    WHERE ROWNUM = 1
)
GROUP BY d.DNAME;
```

And again, let's ask ChatGPT-4 which query is likely to be executed faster. In Task 4, we asked for two variants of the query that returns the name of the branch's head, the location of the branch, and the total salary for the branch. The following is what we got.

Task 4: Variant 1

```sql
SELECT HEAD.ENAME AS HEAD_NAME,
       D.LOC AS DEPARTMENT_LOCATION,
       SUM(S.SAL) AS TOTAL_SALARY
FROM EMP HEAD,
     DEPT D,
     EMP S
WHERE HEAD.DEPTNO = D.DEPTNO
  AND S.MGR = HEAD.EMPNO
  AND HEAD.MGR = (SELECT EMPNO
                  FROM EMP
                  WHERE JOB = 'PRESIDENT')
GROUP BY HEAD.ENAME,
         D.LOC;
```

Task 4: Variant 2

```sql
SELECT E.ENAME AS HEAD_NAME,
       D.LOC AS DEPARTMENT_LOCATION,
       SUM(S.SAL) AS TOTAL_SALARY
FROM EMP E
JOIN DEPT D
  ON E.DEPTNO = D.DEPTNO
JOIN EMP S
  ON S.MGR = E.EMPNO
WHERE E.MGR = (SELECT EMPNO
               FROM EMP
               WHERE JOB = 'PRESIDENT')
GROUP BY E.ENAME,
         D.LOC;
```

Shall we ask the AI which one will perform better? As you can see, ChatGPT-4 can assist database specialists by providing information about query performance and optimization. However, why don't we check if it was correct in its predictions?
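Before turning to the profiler, the logical output of Task 1's Variant 1 is easy to sanity-check outside Oracle. The sketch below replays that query against the emp data from the conversion task in an in-memory SQLite database, keeping only the three columns the query touches (this checks correctness of the result set only, not Oracle performance):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT, mgr INTEGER)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?, ?)",
    [(7369, "SMITH", 7902), (7499, "ALLEN", 7698), (7521, "WARD", 7698),
     (7566, "JONES", 7839), (7654, "MARTIN", 7698), (7698, "BLAKE", 7839),
     (7782, "CLARK", 7839), (7788, "SCOTT", 7566), (7839, "KING", None),
     (7844, "TURNER", 7698), (7876, "ADAMS", 7788), (7900, "JAMES", 7698),
     (7902, "FORD", 7566), (7934, "MILLER", 7782)],
)

# Task 1, Variant 1: employees who report to someone (MGR IS NOT NULL)
# and are themselves someone's manager (EXISTS subquery).
middle_managers = [row[0] for row in conn.execute("""
    SELECT DISTINCT e1.ename
    FROM emp e1
    WHERE e1.mgr IS NOT NULL
      AND EXISTS (
        SELECT 1
        FROM emp e2
        WHERE e1.empno = e2.mgr
          AND e2.empno IS NOT NULL
      )
    ORDER BY e1.ename
""")]

print(middle_managers)  # ['BLAKE', 'CLARK', 'FORD', 'JONES', 'SCOTT']
```

KING manages several people but has no manager himself, so he is correctly excluded from the list.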
Check the AI-generated queries with Query Profiler

dbForge Studio for Oracle is equipped with a powerful [query profiler](https://www.devart.com/dbforge/oracle/studio/oracle-sql-profiler.html) that helps PL/SQL developers and data analysts optimize their SQL queries: with it, they can easily identify bottlenecks and improve query performance. We will now use it to verify the answers generated by ChatGPT-4.

Task 1: Return a list of middle managers

Let us run the two queries in dbForge Studio for Oracle in the Query Profiling mode and compare the results. As we can see from the results obtained with dbForge Studio, the first query performs slightly better, which contradicts what the AI suggested. JOIN and EXISTS are two techniques used to combine data from multiple tables in Oracle. In a JOIN operation, the database searches for matching rows in related tables and returns a result set with all selected columns; JOINs can be expensive, particularly with large datasets. An EXISTS query, by contrast, only needs to verify the presence of matching rows in a subquery. It is crucial to consider the specific use case and the data volume to make an informed decision on query performance. By analyzing JOIN and EXISTS row metrics in query profiling, you can determine which query is more costly and optimize it accordingly.

Task 2: Return the id of managers who have more than two employees

Let us use dbForge Studio to check the performance of the next two queries. And again, when we run the queries through dbForge Studio's query profiler, we find that the actual results differ from the AI's predictions. The profiler provides detailed performance metrics, such as execution time and resource utilization. You may notice that the use of certain indexes or table structures may have contributed to the differences in performance between the queries.
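Besides the visual profiler, the same comparison can be made in plain SQL with Oracle's EXPLAIN PLAN and DBMS_XPLAN. A minimal sketch (the statement ID and the simplified Task 1 query are illustrative assumptions, not output from the article):

```sql
-- Store the optimizer's plan for the EXISTS variant under a statement ID.
EXPLAIN PLAN SET STATEMENT_ID = 'task1_v1' FOR
SELECT DISTINCT e1.ENAME
FROM EMP e1
WHERE e1.MGR IS NOT NULL
  AND EXISTS (SELECT 1 FROM EMP e2 WHERE e1.EMPNO = e2.MGR);

-- Render the stored plan with estimated cost, rows, and bytes per step.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY('PLAN_TABLE', 'task1_v1'));
```

Comparing the Cost and Rows columns across the two variants' plans gives the same signal the profiler visualizes.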
Task 3: Return the department with the highest average salary

This time, the query profiling results align with the AI's predictions.

Task 4: Return the name of the branch's head, the location of the branch, and the total salary for the branch

And again, the results we obtained with dbForge Studio for Oracle matched what the AI told us. Query performance is heavily influenced by the amount of data, as well as by the use of foreign keys and indexes, table structure, etc. In this regard, query profiling is a reliable tool that provides accurate information about query execution times. And while AI predictions can be useful, they may not always be accurate due to a lack of context. Therefore, it is recommended to use [performance tuning](https://www.devart.com/dbforge/oracle/studio/performance-tuning.html) techniques and [Oracle Explain Plans](https://www.devart.com/dbforge/oracle/studio/oracle-explain-plan.html) . If you are interested in learning more about the application of AI in the database field, we recommend reading our articles on [How to Use ChatGPT to Write SQL JOIN Queries](https://blog.devart.com/how-to-use-chatgpt-to-write-sql-join-queries.html) and [How ChatGPT Can Help You Retrieve MySQL Data](https://blog.devart.com/power-up-your-mysql-queries-how-chatgpt-can-help-you-retrieve-mysql-data.html) . They will give you a deeper understanding of how AI can be leveraged to improve query performance and streamline query writing. Check them out to learn more!

Conclusion

In conclusion, ChatGPT-4 has proven to be a valuable asset in database development and management, particularly in generating SQL queries. However, it is essential to understand that AI should be used in conjunction with human operators and specialized database tools, such as query profilers, to achieve the most reliable and accurate results.
By leveraging AI's capabilities and utilizing the right tools, developers and analysts can optimize their queries and improve their overall performance, thus enhancing their productivity and delivering more efficient solutions. You can [download a free 30-day trial of dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html) at our website and see for yourself how it can help you improve the performance of your Oracle queries. Tags [AI](https://blog.devart.com/tag/ai) [ChatGPT](https://blog.devart.com/tag/chatgpt) [oracle tools](https://blog.devart.com/tag/oracle-tools) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.
"} {"url": "https://blog.devart.com/create-database-in-postgresql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) How to Create a New Database in PostgreSQL By [dbForge Team](https://blog.devart.com/author/dbforge) September 17, 2021 [0](https://blog.devart.com/create-database-in-postgresql.html#respond) 7420 In this article, we are going to describe different ways to create a PostgreSQL database from scratch. Here, you will find a tutorial for setting up a database using the CREATE DATABASE command in the command line, the pgAdmin platform, and [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) . We have also included a part dedicated to creating a user in PostgreSQL. PostgreSQL is an open-source object-relational database management system based on Postgres 4.2. It supports most of the SQL standards and offers a variety of modern features. The DBMS was developed at the Department of Computer Science, University of California, Berkeley. There are several ways to create a PostgreSQL database with an owner, and we decided to divide this article into parts. Each part corresponds to a separate tool and describes the algorithms of both database and user creation. Set Up PostgreSQL on Windows The first step in setting up PostgreSQL on Windows is downloading and installing it on your computer.
[Get the required version of the installer](https://www.enterprisedb.com/downloads/postgres-postgresql-downloads) and launch it once the download is over. Simply follow the instructions in the installation wizard, and make sure to include the additional components during the installation. Feel free to refer to [our blog article on starting a PostgreSQL server](https://blog.devart.com/download-install-postgresql-on-windows.html) for a more detailed illustrated guide.

Creating a User

During the installation of PostgreSQL, the default user postgres is created. However, that might not always be enough. Most tools for PostgreSQL allow creating both a database and a user, so let us describe the ways to create a user in PostgreSQL along with the instructions on how to create a database.

Set Up a Database Using the PSQL Command Line

Create a Database

1. Open a psql command-line tool. For example, SQL Shell (psql) comes by default with the PostgreSQL Server installer.
2. After that, log in as the superuser (postgres by default). Now you are all set to start the database creation. For this, enter the CREATE DATABASE command and specify the name of the database: CREATE DATABASE testdb;

Note: Don't forget to add a semicolon at the end of the command, since it will not be executed without one.

We have just described the way to create a default PostgreSQL database using the command line. If you simply enter the CREATE DATABASE command, the database will be created according to the template. However, you can also alter the following specifications: encoding, collation, and locale. The command will look something like this if you specify all the possible settings: CREATE DATABASE testdb WITH ENCODING 'UTF8' LC_COLLATE='English_United States' LC_CTYPE='English_United States';

Create a User

The process of creating a PostgreSQL user is akin to database creation. The only difference is in the command that you will need to use: CREATE USER .
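For example, here is a minimal sketch of creating a user with a password and the right to create databases; the username, password, and options are illustrative assumptions, not values from the installer:

```sql
-- Create a login role that can also create databases.
CREATE USER testuser WITH PASSWORD 'StrongPassword1' CREATEDB;

-- Optionally make the new user the owner of the database created earlier.
ALTER DATABASE testdb OWNER TO testuser;
```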
Make sure to specify the username after the command itself. If CREATE ROLE is returned, the command was completed successfully. It is also possible to specify the privileges for the user this way. The whole list of possible options for the CREATE USER command can be found on the [PostgreSQL website](https://www.postgresql.org/docs/12/app-createuser.html) .

Create a Database in PostgreSQL Using pgAdmin

pgAdmin is the default user-friendly PostgreSQL client. Basically, it comprises the same functionality as the command-line tool, but in the form of an intuitive graphical client.

How to Create a Database in pgAdmin

1. To begin with, launch pgAdmin.
2. Right-click Databases. Then, point to Create and click Database.
3. In the Create Database dialog box that opens, enter the name for the future PostgreSQL database and choose the owner.
4. Switch to the Definition tab. Here, you will be able to choose the encoding, database template, tablespace, collation, character type, and connection limit. By default, the connection limit is set to -1, which means no connection limit is applied to your database.
5. Once you have entered all the necessary configurations, click Save.
6. Now, you will see the newly created database in the menu on the left.

How to Create a User in pgAdmin

1. To create a user, right-click PostgreSQL 13. Point to Create and then click Login/Group Role.
2. Enter the username and go to the Definition tab.
3. Type the password for the new user, and set the expiration time and the limit for possible connections. Once done, go to the next tab.
4. Set the privileges for the account and click Save to finish the process.

Create a Database in PostgreSQL Using dbForge Studio for PostgreSQL

[dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is a GUI tool by Devart, designed to simplify your day-to-day database development and management routine.
It boasts rich functionality, including but not limited to creating tables and data reports, editing data, importing and exporting data, building pivot tables, and master-detail relations.

Create a Database

To begin with, open the IDE on your computer. In case you do not have it yet, feel free to [download a free 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) to test the full functionality of the solution.

1. In the upper left corner of the window, click New SQL.
2. What makes database creation in dbForge Studio more convenient is code auto-completion and syntax check. Start typing CREATE DATABASE, and the auto-completion will offer you possible commands. To choose one, use the up and down arrows on the keyboard and press Enter. Do not forget to specify the name of the database. The built-in syntax check will save you time by pointing out errors and typos.
3. After that, hit the Execute button in the upper left corner of the window. The tool will start the process and notify you once done.
4. Click the Refresh button located right above the connection name. Now, you will see the database you just created in the list.

Create a User

In dbForge Studio, you can create a PostgreSQL user as easily as a database. First of all, click New SQL and type the CREATE ROLE command into the console window. After that, execute the command by clicking the Execute button. As soon as the application finishes the process, you will see a notification about the successful execution.

Note: You can learn all about creating PostgreSQL indexes of different types in [How to create indexes in PostgreSQL](https://blog.devart.com/postgresql-indexes.html) .

Conclusion

To sum up, there is more than one way to create a database and a user in PostgreSQL.
In this article, we have focused on three of them:

- PSQL command line
- pgAdmin
- dbForge Studio for PostgreSQL

The PSQL command-line tool is more oriented toward experienced, tech-savvy users who are familiar with SQL coding. On the contrary, pgAdmin embraces the same features but in a user-friendly, intuitive design, which might help beginners join the club of database management. As for [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) , it comprises the best features of both: a convenient graphical interface and an improved console. You may also check our [feature-by-feature comparison of dbForge Studio and pgAdmin](https://www.devart.com/dbforge/postgresql/studio/pgadmin-alternatives.html) to learn all the differences in detail. Tags [create database](https://blog.devart.com/tag/create-database) [create user account](https://blog.devart.com/tag/create-user-account) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [postgresql tools](https://blog.devart.com/tag/postgresql-tools) [studio for postgresql](https://blog.devart.com/tag/studio-for-postgresql) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/create-database-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Create a Database in SQL Server By [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) January 10, 2025 [0](https://blog.devart.com/create-database-in-sql-server.html#respond) 2362 Microsoft SQL Server is recognized as a robust relational database management system (RDBMS) designed for the efficient storage and management of large volumes of data. The system provides many features and tools required to develop and administer databases. As a result, SQL Server stands as a popular solution among developers and enterprises for completing different tasks in the field of data. Creating a database in SQL Server is very important for effectively managing data in a modern business environment. Here are the key points that highlight its value:

- Data organization: SQL databases allow you to structure information into a logical system. This makes it easier to keep, update, and use data in the future.
- Effective access management: In SQL Server, you can restrict permissions and rights to sensitive database records.
- Backup and recovery: It's possible to save and retrieve copies of data in an emergency.
- Support for big data sets: SQL Server copes well with extensive information loads.
- Data integrity: You can set rules and limitations for data consistency. This strategy is crucial to avoid possible errors.

Contents

- Important information
  - Prerequisites
- Create a database in SQL Server
  - Method 1: Using T-SQL in SQL Server Management Studio
  - Method 2: Using SQL Server Management Studio (SSMS)
  - Method 3: Using dbForge Studio for SQL Server
- Post-creation steps
- Additional information
  - Tips and best practices
- FAQ
- Conclusion

Important information

Prerequisites

To be able to create a database in SQL Server with the three suggested methods, you need to install:

- [SQL Server](https://www.microsoft.com/en-us/sql-server/sql-server-downloads)
- [SQL Server Management Studio (SSMS)](https://learn.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver16)
- [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html)

Create a database in SQL Server

To create a database in SQL Server, you can use Transact-SQL (T-SQL) in SQL Server Management Studio (SSMS) or dbForge Studio for SQL Server. SQL Server Management Studio (SSMS) is a powerful tool that delivers advanced database administration features. With its help, you can create a database in SSMS, along with tables, indexes, triggers, and other objects. dbForge Studio for SQL Server is another robust instrument for developing databases in SQL Server. It provides extensive features for working with databases, including convenient visual database design, integrated tools for query optimization, support for version control systems, and much more. Transact-SQL (T-SQL) is commonly used to create a database in SQL Server. It's a programming language that performs various operations with data, such as generating tables, indexes, and stored procedures, managing access, and so on.
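As a taste of what T-SQL can express beyond the bare command, here is a hedged sketch of CREATE DATABASE with explicit data and log files; the logical names, paths, and sizes are illustrative assumptions, not values required by this article:

```sql
CREATE DATABASE development
ON PRIMARY (
    NAME = development_data,                  -- logical data file name (assumed)
    FILENAME = 'C:\SQLData\development.mdf',  -- physical path (assumed)
    SIZE = 64MB,
    FILEGROWTH = 64MB
)
LOG ON (
    NAME = development_log,                   -- logical log file name (assumed)
    FILENAME = 'C:\SQLData\development_log.ldf',
    SIZE = 16MB,
    FILEGROWTH = 16MB
);
```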
T-SQL CREATE DATABASE commands help developers build complex database structures and adjust them to business requirements. In the following sections, we'll demonstrate how to create the development database.

Method 1: Using T-SQL in SQL Server Management Studio

1. Open SSMS and connect to the instance where you want to create your database.
2. Click New Query and type the statement: CREATE DATABASE development;
3. To execute the query, click Execute or press F5.
4. To check the created database, refresh Object Explorer.

This process is straightforward and trouble-free, but sometimes issues can arise while creating databases. To avoid common errors like database name conflicts, we've prepared several tips:

- Review the database names already on the server to ensure you will not create a duplicate.
- Use a unique name for your database that helps identify its purpose or context.
- Consider schema ownership when you create multiple databases to avoid confusion or conflicts between them.
- Ensure you have the necessary permissions and rights to create a database on the required SQL Server instance.
- Choose a name that does not contain reserved words or special characters, as they might cause syntax errors.
- Test the creation process in a non-production environment to detect and solve issues beforehand.

Method 2: How to Create a Database in SQL Server Management Studio (SSMS)

You can also create a database with the New Database option in SSMS. To create a new database in SQL Server, proceed with the following steps:

1. In Object Explorer, right-click Databases and select New Database.
2. In the Database name field, enter the required name for your database.
3. To configure the database owner, click the three dots and type the name into the field. Alternatively, you can select the necessary name from the list of existing usernames on the instance by clicking Browse.
4. Specify the desired size for the database files.
If you need to add a data or transaction log file, click Add. Enter the file name, choose the file type and filegroup, and then set the initial size.

5. Customize the autogrowth settings for the files. Click the three dots, adjust the options to your needs, and click OK.
6. Choose the location where the database files will be stored. Click the three dots and specify the required folder.
7. Having configured everything you need, click OK. To check the created database, refresh Object Explorer.

Method 3: Using dbForge Studio for SQL Server

In the sphere of database management, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) stands as a powerful alternative to SQL Server Management Studio (SSMS). The tool provides an advanced suite of features to optimize work with databases. dbForge Studio is designed to meet the demanding requirements of developers, administrators, and database specialists who want a reliable platform for efficient development and administration. Let's see how easy it is to create the same development database in dbForge Studio for SQL Server.

1. Navigate to Database > New Database.
2. Enter the database name in Name.
3. To change the owner name, choose the necessary one from Owner.
4. Set the initial size for the database files.
5. To configure the autogrowth options, click the three dots next to the required file.
6. To change the path of the files, click the three dots.
7. On the Filegroups tab, you can add a new filegroup.
8. The Options tab contains different customizable database settings.
9. To create the database, click Apply Changes.

Post-creation steps

Currently, the newly created database is empty. You need to add objects, such as tables, views, and procedures, to it. In this example, we'll show you how to add the testdate table with two columns.

1. Right-click the database, point to New Object, and select Table.
2. Name the table and choose the necessary schema.
3. Set the table type.
4. Let's add some columns. Right-click in the Columns section and select New Column.
5. Type the column name and choose whether to allow nulls. To specify the column as a primary key, select the checkbox next to the column name.
6. Similarly, you can add as many columns as you need. To apply the changes, click Apply Changes.

The table with the columns will be visible in the database. Great! We have the database and a table with two columns. But this is not the end yet. We should populate these columns with some data. For this, proceed with these steps:

1. Right-click the table and navigate to Generate Script As > INSERT > To New SQL Window.
2. Enter the values and click Execute.
3. To check the successful result, right-click the table and select Retrieve Data. You'll see the added values.

Additional information

Tips and best practices

In this section of the article, we would like to highlight some common approaches that should be applied to each database in SQL. These methods are generally accepted norms for simplifying database management. Everyone who deals with database development knows that security measures are a must, so you should consider protecting confidential data from unauthorized access or malicious attacks. The following essential principles can help reduce the risk of unauthorized access and information leaks:

- Implement complex passwords, multi-factor authentication (MFA), and role-based access control (RBAC) to control access to your databases.
- Use monitoring and logging tools to check database activities and collect performance statistics. Configure alerts to be notified of unusual behavior.
- Establish Transparent Data Encryption (TDE) to encrypt data files and utilize SSL/TLS for secure communication between the database server and client applications.
- Develop a reliable backup and recovery strategy to ensure that you can restore critical data in case of any failures.
- Configure firewall rules to restrict access to the database server from unauthorized IP addresses.
- Ensure compliance with relevant data protection regulations (such as GDPR, HIPAA, or others, depending on the industry) to avoid legal issues and maintain data privacy standards.

Another aspect you should focus on is performance. Implementing effective performance-tuning strategies can enhance the database's ability to handle workloads, queries, and transactions. Here are essential steps for improving the efficiency of new databases:

- Design the database schema according to strategies that reduce redundancy and ensure data integrity.
- Create indexes on tables to enhance query performance.
- Check that the database server has enough resources, like CPU, memory, and storage, to handle workloads.
- Monitor and optimize SQL queries using query execution plans, indexes, and by rewriting inefficient statements.
- Use performance monitoring tools to track database performance metrics like CPU usage, memory consumption, and query execution times.
- Plan for future growth by considering scalability options such as partitioning, sharding, or scaling up/out resources.
- Follow best practices recommended by the database platform to optimize performance.

By implementing these steps and continuously monitoring and refining the database's performance, you can ensure that your new databases are optimized for efficiency, scalability, and responsiveness.

| Method | Steps Overview | Key Tips |
| --- | --- | --- |
| T-SQL in SSMS | Open SSMS and connect to the instance. Click New Query and type: CREATE DATABASE myDatabase; Execute the query and refresh Object Explorer. | Use unique database names to avoid conflicts. Test in a non-production environment first. |
| SSMS GUI | Right-click Databases in Object Explorer and select New Database. Enter the database name and configure owner, size, and file locations. Click OK to create the database. Add a table with 3 columns via the GUI. | Avoid reserved words in names. Ensure proper permissions before starting. |
| dbForge Studio | Navigate to Database > New Database. Enter the database name and configure owner, size, and file paths. Use the Filegroups and Options tabs for advanced settings. | Leverage the Options tab for custom settings. Use the Filegroups tab for better file organization. |

FAQ

How do I create a new database in SQL Server?

You can create a new database in SQL Server in several ways: using SQL Server Management Studio (SSMS) through the graphical interface or executing a T-SQL CREATE DATABASE statement. Alternatively, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) provides an intuitive interface that enables you to create databases using a visual editor or directly via SQL queries.

What is the SQL command to create a database?

The following SQL command is used in this case: CREATE DATABASE myDatabase; This command creates a new database with the designated name.

How to create a database step by step?

To create a database, open SQL Server Management Studio (SSMS) and connect to the target server. You can create the database by right-clicking the Databases folder in Object Explorer, selecting New Database, filling in the required properties, and clicking OK. Alternatively, you can execute a T-SQL CREATE DATABASE command in the New Query window to customize the configuration. In [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , connect to the server, open the Database menu, select New Database, configure your settings using the visual designer, and click Apply Changes to finalize the creation.

What is a database in SQL Server?

A database in SQL Server is an organized collection of data managed by the SQL Server engine. It is structured into tables and, optionally, schemas that serve to strengthen data security and improve its organization.
The primary function of such databases is to allow users to store, retrieve, and manipulate data using SQL queries in an efficient way. Conclusion As you can see, there are several ways to create a database in SQL, and it’s up to you which one to choose. While dbForge Studio for SQL Server and SQL Server Management Studio are crucial for SQL Server users, each offers unique features and benefits that satisfy different preferences and requirements. dbForge Studio impresses with its intuitive interface and a comprehensive environment for database development, management, and optimization. Its user-friendly interface and support for various database-related tasks make it a valuable tool for developers and administrators. Download [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and enjoy the possibilities of efficient database solutions! Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) When writing articles, Hanna Khyzhnia follows two main rules: any technical information should be presented in a way that even a child could understand it, and the language of the text must be as simple and accessible as possible for users. She aims to help readers dive into details, make decisions, and find answers without unnecessary confusion. 
"} {"url": "https://blog.devart.com/create-login-user-and-grant-permission-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Create Login, User, and Grant Permissions in SQL Server By [dbForge Team](https://blog.devart.com/author/dbforge) November 25, 2021 [0](https://blog.devart.com/create-login-user-and-grant-permission-sql-server.html#respond) 34783 In the article, we are going to examine how to create a new user account and grant/revoke permissions and roles on a database object, as well as how to check SQL Server user
permissions on the database using T-SQL, SQL Server Management Studio, and dbForge Studio for SQL Server. When it comes to protecting sensitive data and keeping the database environment secure, the management of database permissions and privileges takes center stage. For example, you may want a specific person to be able to modify and manipulate personal data, while others can only view this data. In this case, database administrators need to create a new user or role with specific permissions that can be assigned at the server, database, or schema level. To implement this task, they can use the T-SQL statements provided by SQL Server or third-party tools.

To move on, we are going to dig deeper into the following how-tos:

- Create a new login in SQL Server
  - CREATE LOGIN statement using Windows Authentication
  - Create a SQL Server Authentication login
  - Create a login from a certificate
  - Create a login from an asymmetric key
- Create a new user in SQL
  - Create a user using T-SQL
  - Create a user in SSMS
  - Create a user with dbForge Studio
- Grant permissions in SQL
  - Grant permissions using T-SQL
  - Grant permissions in SSMS
  - Grant permissions in dbForge Studio for SQL Server
  - Check user and login permissions in dbForge Studio for SQL Server
- Revoke privileges in SQL
  - Revoke all privileges using T-SQL
  - Revoke permissions in SSMS
  - Revoke permissions in dbForge Studio
- Deny permissions in SQL
  - Deny permissions using T-SQL
  - Deny permissions in SSMS
  - Deny permissions in dbForge Studio
- Assign roles in SQL
  - Assign roles using T-SQL
  - Assign roles in SSMS
  - Assign roles in dbForge Studio
- Why to choose dbForge Studio

Create a new login in SQL Server

Before creating a database user, you should create a new login based on Windows Authentication, SQL Server authentication, a certificate, or an asymmetric key. To add a new login, use the CREATE LOGIN statement. It creates a login [connected to a SQL Server instance](https://blog.devart.com/how-to-connect-to-sql-server.html).
The login will then be mapped to the specified user account. The syntax is as follows:

Windows Authentication

CREATE LOGIN login_name
FROM WINDOWS
[ WITH DEFAULT_DATABASE = database_name
| DEFAULT_LANGUAGE = language_name ];

SQL Server authentication

CREATE LOGIN login_name
WITH PASSWORD = { 'password' | hashed_password HASHED } [ MUST_CHANGE ]
[ , SID = sid_value
| DEFAULT_DATABASE = database_name
| DEFAULT_LANGUAGE = language_name
| CHECK_EXPIRATION = { ON | OFF }
| CHECK_POLICY = { ON | OFF }
| CREDENTIAL = credential_name ];

Certificate

CREATE LOGIN login_name
FROM CERTIFICATE certificate_name;

Asymmetric key

CREATE LOGIN login_name
FROM ASYMMETRIC KEY asym_key_name;

The arguments used in the CREATE LOGIN statements are as follows:

- login_name: Name of the login connected to the server.
- database_name: Name of the default database to which the login will be assigned.
- language_name: Default language for the login you create.
- password: Password for the login you create.
- hashed_password: Hashed value of the password for the login you create.
- MUST_CHANGE: Prompts to change the password upon the first connection.
- sid_value: Value used to recreate a login. It can be used only for logins with SQL Server authentication. If sid_value is not set, SQL Server will assign a new SID.
- CHECK_EXPIRATION: Defines whether the password expiration policy is applied. It must be set to ON if you use the MUST_CHANGE option.
- CHECK_POLICY: When set to ON, indicates that the Windows password policy of the computer on which SQL Server is running should be applied to the login as well.
- credential_name: Name of the credential to be assigned to the SQL Server login.
- certificate_name: Name of the certificate to be mapped to the SQL Server login.
- asym_key_name: Name of the asymmetric key to be mapped to the SQL Server login.
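As a hedged illustration of how several of these arguments combine in one statement (the login name, password, and default database here are invented for the example and do not appear elsewhere in the article):

```sql
-- Hypothetical example: SQL Server authentication login combining several options
CREATE LOGIN ReportsUser
WITH PASSWORD = 'P@ssw0rd!2021' MUST_CHANGE,
     DEFAULT_DATABASE = AdventureWorks2019,
     CHECK_EXPIRATION = ON,   -- required when MUST_CHANGE is used
     CHECK_POLICY = ON;       -- apply the Windows password policy
```

Note that MUST_CHANGE requires CHECK_EXPIRATION = ON, as described in the argument list above.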
CREATE LOGIN statement using Windows Authentication

To add a SQL Server login based on Windows authentication, run the CREATE LOGIN statement with the following arguments:

CREATE LOGIN JordanS
FROM WINDOWS;

This will create a new login JordanS on a SQL Server instance using Windows authentication.

Create a SQL Server Authentication login

For example, let's create a login JordanS with the password 'pass123' using SQL Server authentication. For this, run the following command:

CREATE LOGIN JordanS
WITH PASSWORD = 'pass123';

If you want the password to be changed at the first login, add the MUST_CHANGE argument with CHECK_EXPIRATION enabled to the CREATE LOGIN statement:

CREATE LOGIN JordanS
WITH PASSWORD = 'pass123' MUST_CHANGE,
CHECK_EXPIRATION = ON;

Create a login from a certificate

Now, let's see how to create a login JordanS using a SQL Server certificate certificate123:

CREATE LOGIN JordanS
FROM CERTIFICATE certificate123;

Create a login from an asymmetric key

Finally, add a new login, JordanS, using an asymmetric key named key_123 in SQL Server:

CREATE LOGIN JordanS
FROM ASYMMETRIC KEY key_123;

Create a new user in SQL

After creating the login, it is time to add a new user using one of the following methods:

- T-SQL
- SSMS
- dbForge Studio for SQL Server

Create a user using T-SQL

To create a new user account, use the CREATE USER statement:

CREATE USER username FOR LOGIN login_name;

where:

- username is the name of the user you want to create. Note that the username must be unique within the database.
- login_name is the existing login on the server with which you want to associate the user.

For example, create a new user Manager associated with the login JordanM. To do this, execute the following CREATE USER query:

CREATE USER Manager FOR LOGIN JordanM;

To check that the user has been created, execute a SELECT query on the sys.database_principals system view. The query retrieves a list of all users created in the SQL Server database.
SELECT *
FROM AdventureWorks2019.sys.database_principals;

The output should be as follows:

Create a user in SSMS

As an alternative, you can use SQL Server Management Studio (SSMS) to create a user. To begin, open SSMS and connect to the server on which you want to create a user. In Object Explorer, expand the Databases node and select the database where you want to create the user. Under the selected database, right-click the Security folder and select New > User. This will open the Database User - New dialog, where you need to specify the following user details:

- User name: Enter the name of the user you want to create.
- Login name: Click the ellipsis icon to choose a login that will be tied to the user.
- Optional: Default schema: Specify the default schema for the user.

Then, click OK to create the user.

Create a user with dbForge Studio

Now, we are going to see how convenient it is to manage users with Security Manager, an administration tool built into [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). It is a solid [alternative to SQL Server Management Studio](https://www.devart.com/dbforge/sql/studio/alternative-to-ssms.html) that you can use for most tasks. [Security Manager](https://www.devart.com/dbforge/sql/studio/sql-server-administration.html) is a reliable tool that ensures a secure and efficient way to control access to database objects and data, to create and manipulate users and roles, and to grant and revoke privileges and permissions.

To get started, open Security Manager in one of the following ways:

- On the Start Page, switch to the Administration tab and click Manage Server Security.
- On the Database menu, select Security Manager.

In Security Manager, select Create User from the list. On the General page, fill in the following details:

- Name: Enter the name of the new user.
- Login name: Select an existing login from the list.
- Optional: Default schema: Select the default schema for the user.

Click OK to save the changes. Note that some parameters may vary depending on the type of user. As you can see, the modern interface of Security Manager gives you a user-friendly experience when creating users in SQL. Next, we can proceed with assigning permissions and privileges.

Grant permissions in SQL

Permissions and privileges control access to SQL Server data and database objects. You can grant user privileges on different database objects in SQL Server. Privileges can be of two types:

- System privileges that allow users to create, alter, or drop database objects.
- Object privileges that allow users to execute, select, insert, update, or delete data on the database objects to which the privileges were assigned.

Note that only database administrators or owners of database objects can grant or revoke privileges. In this section, we'll consider how to grant permissions using T-SQL, SSMS, and dbForge Studio for SQL Server.

Grant permissions using T-SQL

The GRANT statement provides user access and permissions on database objects. The basic syntax is as follows:

GRANT privileges
ON database_name.object
TO {user_name | PUBLIC | role_name}
[WITH GRANT OPTION];

The arguments used in the GRANT statement are as follows:

- privileges: Permissions you want to grant, such as SELECT, INSERT, UPDATE, DELETE, REFERENCES, ALTER, or ALL.
- database_name: Name of the database to which the database object belongs.
- object: Database object on which the specified privileges will be granted.
- user_name: Name of the user to whom the privileges will be granted.
- PUBLIC: Used to grant the permissions to all users.
- role_name: Set of privileges grouped in one category.
- WITH GRANT OPTION: Allows the grantee to grant the permissions to other users.
For example, let's grant the SELECT, INSERT, and ALTER privileges on the HumanResources.Employee table to the user Manager that we created. To do this, execute the following query:

GRANT SELECT, INSERT, ALTER ON HumanResources.Employee TO Manager;

Now, check that the permissions on the HumanResources.Employee table have been assigned to the user Manager. To do this, use the system function fn_my_permissions and execute the following query:

EXECUTE AS USER = 'Manager';
GO
USE AdventureWorks2019
GO
SELECT * FROM fn_my_permissions('HumanResources.Employee', 'OBJECT')
GO

The output is as follows:

Grant permissions in SSMS

Granting permissions to a user in SSMS involves several steps. In Object Explorer, expand the Databases node and select the database where you want to grant permissions. Then, navigate to the Security > Users folder. Right-click the user to whom you want to grant permissions and select Properties. In the Database User dialog, select the Securables page from the left sidebar menu. On this page, you can grant permissions to the user for specific objects (e.g., tables, views, stored procedures). To add an object, click Search. This will open the Add Objects popup with three options:

- Specific objects: Select this option to grant permissions on specific database objects.
- All objects of the types: Select this option to grant permissions on a specific type of database objects, such as tables, stored procedures, views, asymmetric keys, databases, etc.
- All objects belonging to the schema: Select this option to grant permissions on all the database objects of the specified schema.

In our example, we select the last option and click OK. This will display all the database objects under this schema in the Securables section. Then, select the database object for which you want to assign permissions. Under Permissions for, select the checkboxes next to the required permissions, and click OK to apply and save the changes.
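Returning to the T-SQL syntax for a moment: the WITH GRANT OPTION clause mentioned above lets the grantee pass a permission on to other principals. A minimal hedged sketch, reusing the article's Manager user (the Analyst user is hypothetical and not created in this article):

```sql
-- Hedged sketch: allow Manager to re-grant SELECT on the table
GRANT SELECT ON HumanResources.Employee
TO Manager
WITH GRANT OPTION;

-- Manager could then run, for example:
-- GRANT SELECT ON HumanResources.Employee TO Analyst;
```

Use this option sparingly, since it delegates part of the permission management to the grantee.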
Grant permissions in dbForge Studio for SQL Server

You can grant permissions to the user in Security Manager. On the left, select the required user and switch to the Object Permissions tab on the right. In the Objects section, select the database object on which you want to grant privileges. In the Available Privileges section, select the checkboxes next to the permissions to assign and click Save.

Check user and login permissions in dbForge Studio for SQL Server

To get a list of object-level permissions assigned to the user, select the user in Security Manager and switch to the Object Permissions tab. Then, select the object and view the available permissions. In addition, you can check the list of permissions granted to the login at the server level. For this, in Security Manager, select the login for which you want to view the permissions and switch to the Server Permissions tab. The checkboxes next to the assigned permissions will be selected in the Granted column.

Revoke privileges in SQL

In this section, we'll explore how to revoke privileges using T-SQL, SSMS, and dbForge Studio.

Revoke all privileges using T-SQL

If you want to remove the privileges on a database object from the user, use the REVOKE statement:

REVOKE privileges
ON object
FROM {user_name | PUBLIC | role_name};

Replace privileges with SELECT, INSERT, UPDATE, DELETE, REFERENCES, ALTER, or ALL. Let us remove the INSERT and ALTER permissions on the HumanResources.Employee table assigned to the user Manager by running the following query:

REVOKE INSERT, ALTER ON HumanResources.Employee FROM Manager;

To verify the permissions granted, execute the following query using the fn_my_permissions function.
It returns a list of the permissions for the current or specified user on a specific securable, such as a database, table, or view:

EXECUTE AS USER = 'Manager';
GO
USE AdventureWorks2019
GO
SELECT * FROM fn_my_permissions('HumanResources.Employee', 'OBJECT')
GO

As you can see, the INSERT and ALTER permissions were removed, leaving only the SELECT permission still assigned.

Revoke permissions in SSMS

We have covered how to revoke permissions using T-SQL. Now, it is time to see how to do the same operation using SSMS. Permissions can be revoked on the Securables page of the Database User dialog. In the Database User dialog, select the Securables page from the left-hand menu, which shows a list of objects and the permissions the user currently has. Then, select the required object and clear the checkboxes next to the permissions you want to revoke in the Grant column of the grid. To save the changes, click OK.

Revoke permissions in dbForge Studio

In contrast to SSMS, dbForge Studio allows you to revoke either specific or all permissions in Security Manager. To revoke specific permissions, select the privilege, click Revoke Selected Privilege, and click Save to save the changes. To remove all the granted privileges, click Revoke All Privileges.

Deny permissions in SQL

The DENY statement explicitly blocks access to a database object and overrides any permissions that may have been granted previously. For example, if an object has both GRANT and DENY permissions, the DENY permissions prevail over the GRANT ones.

Deny permissions using T-SQL

In SQL, permissions can be denied using the DENY statement. The basic syntax is as follows:

DENY [permission] ON [database_object] TO [user_or_role];

where:

- permission is the specific permission or privilege you want to deny, such as SELECT, INSERT, UPDATE, DELETE, etc.
- database_object is the database object on which the permission should be denied. This could be a table, view, schema, or even the entire database.
- user_or_role is the user or role to whom you want to deny the permission.

For example, execute the following query:

DENY INSERT ON HumanResources.Employee TO Manager;

This command explicitly forbids the Manager user from inserting data into the Employee table, regardless of any other permissions that may be granted to Manager.

Deny permissions in SSMS

In SSMS, you can deny permissions in the Database User dialog on the Securables page. Select the database object, select the checkboxes next to the permissions you want to deny in the Deny column, and click OK to save the changes.

Deny permissions in dbForge Studio

With dbForge Studio, you can easily manage permissions in Security Manager. To deny a permission, start by selecting the required user from the left-side menu. Then, in the Objects section, choose the specific database object on which you want to deny permissions. In the bottom grid, select the checkbox next to the corresponding privilege in the Deny column. Finally, click Save on the toolbar to apply the changes.

Assign roles in SQL

What if you need to apply privileges to a group of users rather than to a single user? In this case, it would be better to define a role, a named set of privileges and permissions. A user who is assigned to the role can access and manipulate the database objects with the same permissions as the role has. Note that for role manipulation, you should have the ALTER permission on the role, the ALTER ANY ROLE permission at the database level, or membership in the db_securityadmin fixed database role.

Assign roles using T-SQL

First, create the role by executing the CREATE ROLE statement:

CREATE ROLE role_name;

where role_name is the name of the role you want to create. For example, we want to create the role Managers:

CREATE ROLE managers;

Given that the role currently has no assigned privileges, the next step is to add privileges to the role using the GRANT command. It can assign privileges to roles on databases and database objects.
For example, apply the SELECT, UPDATE, ALTER, INSERT, and DELETE privileges on the Person.Address table to the role Managers:

GRANT SELECT, UPDATE, ALTER, INSERT, DELETE ON Person.Address TO managers;

After that, add users to the role by running the ALTER ROLE statement. It adds or removes users to or from a database role and can also be used to change the name of the role. The syntax is as follows:

ALTER ROLE role_name
ADD MEMBER user_name;

where:

- role_name is the name of the role you want to modify.
- user_name is the name of the existing user you want to add to the role.

If you want to remove the user from the role, use the ALTER ROLE statement with the DROP MEMBER clause:

ALTER ROLE role_name DROP MEMBER user_name;

Note: user_name cannot be a fixed database role or a server principal.

To change the name of the role, run the following statement:

ALTER ROLE role_name WITH NAME = new_name;

where new_name is the new name of the role. Keep in mind that the name of a fixed database role cannot be modified.

Assign roles in SSMS

Assigning roles to users in SSMS involves granting predefined or custom roles to a user, which then determines the permissions that user has within a database or the entire SQL Server instance. You can assign either a server-level or a database-level role.

To grant a server-level role, in Object Explorer, expand the server tree and then expand the Security folder > Logins. Then, right-click the required login and select Properties. In the Login Properties dialog that opens, select the Server Roles page on the left. Then, select the checkboxes next to the server roles you want to assign to the login, for example, sysadmin, dbcreator, securityadmin, etc.

To assign a database-level role, in Object Explorer, expand the Databases folder and select the database where you want to assign the role. Then, expand the Security folder > Users. In the Users folder, right-click the user to whom you want to assign a role and select Properties.
In the Database User dialog that appears, navigate to the Membership page and select the checkboxes next to the roles you want to assign. Click OK to save the changes.

Assign roles in dbForge Studio

With dbForge Studio, you can assign roles visually using Security Manager. Select the user account to which you want to assign roles, navigate to the Role Membership tab on the right, and select the checkboxes next to the required roles. Click Save to save the changes.

Why to choose dbForge Studio

We have considered several methods for creating and managing users, privileges, and roles, including T-SQL, SSMS, and dbForge Studio for SQL Server. Which one you choose will largely depend on your specific needs, workflow preferences, and the complexity of the tasks you are handling. However, we would like to draw your attention to several reasons why many users find dbForge Studio for SQL Server to be the best tool compared to SSMS and direct T-SQL scripting:

- An intuitive and visual interface for managing users and roles.
- A single, easy-to-navigate view that handles all user, role, and permission management tasks without the need to switch between different windows.
- Security Manager displays all available permissions at a glance and lets you assign them in a few clicks without manually writing T-SQL GRANT, DENY, or REVOKE statements, which require precise syntax and can be error-prone.
- Scripting of changes, so you can create and reuse role and privilege templates.

Conclusion

In the article, we have described how to create a SQL Server login and user account, and how to grant and revoke privileges on database objects using T-SQL, SSMS, and dbForge Studio for SQL Server. As you can see, with Security Manager available in dbForge Studio for SQL Server, you can perform these tasks in a few clicks, thus saving your time and increasing productivity.
To evaluate the other excellent features and capabilities dbForge Studio for SQL Server provides, [download](https://www.devart.com/dbforge/sql/studio/download.html) a 30-day free trial version of the tool. We have no doubt that after it expires, you will feel like purchasing the full version of dbForge Studio for SQL Server!
Oracle Index: CREATE, DROP, RENAME – Guide with Examples

By [dbForge Team](https://blog.devart.com/author/dbforge), October 6, 2022

In [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/), an index is a database object that creates and stores records for all values in specific columns or clusters. With the help of indexes, users can access the necessary data portions much faster and more easily.

Contents

- Types of indexes in Oracle
- Oracle CREATE INDEX statement
- How to create a Normal Index
- How to create a Function-Based Index
- Rename an Index in Oracle
- Delete an Index in Oracle
- Manage Indexes in Oracle with GUI tools
- Conclusion

Indexes are among the most popular means of [Oracle database performance tuning](https://www.devart.com/dbforge/oracle/studio/performance-tuning.html). But they are not a "silver bullet" for all cases. They shine in SELECT statements, where the usage of indexes can improve the overall performance significantly. As for the INSERT, UPDATE, and DELETE commands, the effect is the opposite: indexes slow these operations down. That's why it is important to understand indexes in Oracle to apply them correctly.

Types of indexes in Oracle

Oracle defines two main types of indexes: the B-Tree (Balanced Tree) Index and the Bitmap Index.
The B-Tree Index is the default Oracle index created whenever we use the CREATE INDEX command. It compiles a list of values divided into ranges and associates a key with a single row or a range of rows. This structure works efficiently in the majority of scenarios, including both exact-match and range searches. In turn, a B-Tree Index is divided into:

- Normal Index. The most common type, created if a user does not specify any additional parameters. In particular, Oracle creates it automatically for the primary key column whenever you create a new table with a primary key. Note: Oracle won't create an index for columns with foreign keys.
- Function-Based Index. An index that stores the result of a function involving one or more table columns (an arithmetic expression, an SQL function, a PL/SQL function, or a package function). It is convenient when you use queries with the same expression multiple times. Normally, the database must calculate that expression each time, but a Function-Based Index with the same expression lets you avoid those computations.

The Bitmap Index is an index type used in scenarios with repetitive values. For instance, a traditional B-Tree index would be too expensive for data warehouses, but Bitmap indexes will save space. In addition, Bitmap indexes work best with complicated queries containing WHERE clauses with multiple conditions, reducing the response time.

Besides, Oracle differentiates unique and non-unique indexes:

- Unique Index. Key column(s) can't have duplicate values. The simplest example is the staff database of any organization: two employees can't have the same ID. The row ID is specific to each data value in this index.
- Non-unique Index. Indexed column(s) can have duplicate values. For instance, several employees can have the same first name, so the respective column may contain duplicates. The row ID is included in the key in sorted order.
Non-unique indexes are sorted by the index key and the row ID. Both unique and non-unique indexes are variants of the B-Tree index structure. By default, a B-Tree index is non-unique. To create a unique index in Oracle, you need to use the UNIQUE keyword in the CREATE INDEX statement. And now, let's proceed to the process of creating different indexes in Oracle.

Oracle CREATE INDEX statement and how to apply it

Before starting to look for the best way to create an index on a table in Oracle, you should answer one question: do you need to create that index at all? As we already know, indexes help to speed up access to data, but they are not universal. Besides, indexes consume physical storage. Oracle uses and maintains indexes automatically and stores them separately. Indexes are independent of the tables: you can create or drop them whenever you need, and it won't affect the tables or other indexes. On the other hand, the more indexes you have and the larger they are, the more storage they will take. When you alter a table by inserting or deleting rows, all indexes related to that table must be updated. Any change in an indexed column also requires updating the index. The data maintenance costs only grow.

There are conditions to check when you want to create an index on a table in Oracle:

- Do you query the particular column often?
- Is there a UNIQUE key integrity constraint existing on the column?
- Is there an integrity constraint existing on the column?

If the answer is "no," creating an index won't improve performance. In any case, you should analyze the possible benefits you get from applying indexes and compare them with the possible costs of data and index updating.

Create a Normal Index in Oracle for one or several columns

As we have already defined, the default index created in Oracle is a non-unique B-Tree index.
To create a new one, we need to apply the CREATE INDEX command with the following syntax:

CREATE INDEX index_name
ON table_name (column1, column2, ...columnN);

This statement is the simplest form of the syntax. The parameters are as follows:

- index_name: the name of the index you'll create
- table_name: the name of the table for which you are creating that index
- column1, column2, ...columnN: the table columns to include in that index

If you want Oracle to create a unique index on a table, modify that basic command syntax in the following way:

CREATE [UNIQUE] INDEX index_name
ON table_name (column1, column2, ...columnN);

In this statement, the keyword UNIQUE specifies that the indexed columns must contain unique combinations of values.

One more parameter used in the CREATE INDEX statement is COMPUTE STATISTICS. It is an optional parameter telling Oracle to collect statistics during index creation. These statistics would then serve when choosing the execution plan of SQL statements. However, since Oracle 10g, statistics are collected by default, so you don't have to add this parameter.

Now, let us have a look at some practical examples. We want to create a non-unique index on one column. Our sample database belongs to an organization that offers tickets to the Olympic Games and shares some other related information. We want to check the ticket prices. There will surely be the same prices for tickets from specific categories. The command to create a non-unique index in Oracle is as follows:

CREATE INDEX OLYMPIC_GAMES.UK_TICKET_PRICE
ON OLYMPIC_GAMES.TICKET (PRICE);

To illustrate the execution of commands, we'll use dbForge Studio for Oracle. Part of this IDE is a multi-functional [PL/SQL Code editor](https://www.devart.com/dbforge/oracle/studio/oracle-sql-editor.html) that simplifies all coding tasks.
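To confirm that the index was created, you can query the data dictionary. The sketch below is a hedged illustration, assuming the OLYMPIC_GAMES schema and TICKET table from the example above:

```sql
-- Hedged sketch: list indexes defined on the TICKET table
SELECT index_name, index_type, uniqueness, table_name
FROM all_indexes
WHERE owner = 'OLYMPIC_GAMES'
  AND table_name = 'TICKET';
```

The UNIQUENESS column reports UNIQUE or NONUNIQUE, which is a quick way to verify which kind of index you ended up with.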
Pay attention to the colorizing of the statement parts, as it greatly improves code readability, whether you deal with a simple or a sophisticated query. The next time you retrieve information about tickets and refer to the prices, it will be faster and easier thanks to having an index on that column.

If you want to create a unique index, use the modified command:

CREATE UNIQUE INDEX OLYMPIC_GAMES.UK_TICKET_TID
ON OLYMPIC_GAMES.TICKET (SID);

Indexes can be created on several columns. You can include as many columns in the index as necessary. In our example, we want a new index that will include both ticket prices and ticket categories:

CREATE INDEX OLYMPIC_GAMES.UK_TICKET_PRICE_CAT
ON OLYMPIC_GAMES.TICKET (PRICE, CAT);

With the help of that index, you can retrieve the ticket information according to several criteria at once, thus obtaining the results faster and causing less load.

Create a Function-Based Index in Oracle

We use Function-Based indexes to improve the performance of queries that contain functions in the WHERE clauses. To create a function-based index on your table, use the following syntax:

CREATE [UNIQUE] INDEX index_name
ON table_name (function1, function2, ... functionN);

- UNIQUE is an optional keyword specifying that we want a unique index created on the column.
- index_name is the name of the index we will create.
- table_name is the name of the table for which we will create this index.
- function1, function2, ... functionN are the functions (expressions) we will use in the index.

Let's get back to our ticket sales database. It has a column with the countries. Assume we want to convert the country names to lowercase before we retrieve them. In this case, having a Function-Based index on that column will help us do that more efficiently.
The SQL statement would be as follows:

```sql
CREATE INDEX OLYMPIC_GAMES.UK_COUNTRY_CNAME
ON OLYMPIC_GAMES.COUNTRY (LOWER(CNAME));
```

Whenever we want the list of lowercase names, Oracle can use this index and return the data much faster.

How to rename an index in Oracle

Quite often, we need to change the name of an index; this operation is common in Oracle. If the user has the ALTER privilege on the index (if not, check with your administrators), the task is straightforward. The Oracle ALTER INDEX … RENAME command is as follows:

```sql
ALTER INDEX current_index_name
RENAME TO new_index_name;
```

In this statement:

- current_index_name specifies the name of the existing index we want to rename
- new_index_name specifies the new name for that index

For example, let's rename the OLYMPIC_GAMES.UK_COUNTRY_CNAME index:

```sql
ALTER INDEX OLYMPIC_GAMES.UK_COUNTRY_CNAME
RENAME TO INDCNAME;
```

That is how you rename an index in Oracle – a simple operation. Note that renaming an index does not affect the underlying columns and tables in any way.

How to delete an index in Oracle

The DROP INDEX command in Oracle allows users to delete an existing index from the current database schema. It doesn't affect the table physically, because indexes are independent objects stored separately. Still, check whether any of your queries rely on that index: queries that used the dropped index will take longer afterward. The statement to drop an Oracle index is the following:

```sql
DROP INDEX [schema_name.]index_name;
```

- index_name specifies the name of the index you want to delete
- schema_name is an optional parameter. If it is absent, Oracle assumes you want to delete an index from your current schema.

In the previous examples, we created both unique and non-unique indexes. To drop a unique index in Oracle, you don't need to specify the index type.
The index name is enough for the command:

```sql
DROP INDEX OLYMPIC_GAMES.UK_COUNTRY;
```

One thing to note is that you can't drop a non-existing index – Oracle will raise an error. Thus, a DROP INDEX IF EXISTS variant would be helpful; unfortunately, Oracle does not support the IF EXISTS option. The recommended practice is to check for the index you want to delete (for example, in the USER_INDEXES data dictionary view) before executing the drop command. Note that dropping a table automatically deletes all of its triggers and indexes.

Manage Indexes in Oracle with GUI tools

Each database has lots of indexes, and managing them takes plenty of time if the work is not automated. Powerful modern GUI tools, such as [dbForge Studio for Oracle](https://www.devart.com/dbforge/Oracle/studio/), let users create and edit indexes in a visual mode (among many other database-related tasks). dbForge Studio for Oracle speeds up such tasks and helps ensure code quality. The [PL/SQL Formatter](https://www.devart.com/dbforge/oracle/studio/plsql-formatter.html) detects syntax errors and beautifies the code. All kinds of database-related jobs become much more straightforward with this and many other features of this powerful GUI solution.

Conclusion

Indexes are standard schema objects in all popular relational database management systems because they make queries more efficient and save server resources. Direct access to the necessary portion of data, without scanning the entire table, is helpful in virtually every workload. We used dbForge Studio for Oracle to illustrate this article. It lets you work with Oracle indexes in a visual mode, but that is not its only benefit – it helps with any task related to Oracle databases. A fully functional free trial of the software is available for you to test all its capabilities.
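Since Oracle lacks DROP INDEX IF EXISTS, the check-before-drop advice above has to be coded explicitly (in Oracle you would query USER_INDEXES, typically inside a PL/SQL block with EXECUTE IMMEDIATE). A portable sketch of the same check-then-drop pattern, using Python's sqlite3 and SQLite's catalog table; the country table and uk_country index names are invented for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE country (cname TEXT)")
conn.execute("CREATE INDEX uk_country ON country (cname)")

def drop_index_if_exists(conn, name):
    # Check the catalog first, mirroring the check-then-drop advice
    # for Oracle (there you would query USER_INDEXES instead).
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'index' AND name = ?",
        (name,),
    ).fetchone()
    if row is not None:
        conn.execute(f'DROP INDEX "{name}"')
        return True
    return False

first = drop_index_if_exists(conn, "uk_country")
second = drop_index_if_exists(conn, "uk_country")
print(first, second)  # True False: dropped once, then nothing left to drop
```

The second call returns False instead of raising the "index does not exist" error a bare DROP INDEX would produce.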
[How To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) Create Table in PostgreSQL: A Guide With Examples By [dbForge Team](https://blog.devart.com/author/dbforge) August 2, 2021
The article presents a comprehensive walkthrough of the existing methods to create a table in PostgreSQL.

PostgreSQL is one of the most efficient and advanced open-source relational database management systems. As its name emphasizes, the system is compliant with the SQL standard, making it vastly popular among companies that carry out complex and massive data operations. The system uses multi-version concurrency control (MVCC), which allows several users to perform multiple tasks efficiently at the same time.

How to create tables in PostgreSQL

Creating a table in PostgreSQL is a basic operation that can be performed with the Postgres CREATE TABLE statement or with various PostgreSQL GUI tools. In this article, we are going to explore a number of ways to create a table in PostgreSQL.

Contents

1. Creating a table using the PostgreSQL CREATE TABLE statement
1.1 Using the LIKE option
1.2 Creating a temporary table
2. Creating a new table from the command line
3. Creating a PostgreSQL table using dbForge Studio for PostgreSQL
4. Postgres table constraints
5. How to use the PostgreSQL CREATE TABLE AS statement
6. The OR REPLACE option on the CREATE TABLE statement

The need for a reliable and efficient DBMS becomes extremely acute with the increasing amount of data that companies need to keep and process today. Being a relational database, PostgreSQL stores data in tables that hold structured, related data – lists of products, their prices, quantities, etc. – and enables database users to change that data easily. If you are new to PostgreSQL and looking for more insights, you may as well start with learning [how to download and install PostgreSQL on Windows](https://blog.devart.com/download-install-postgresql-on-windows.html).

What is a table in Postgres?

A table in PostgreSQL is a database object that organizes and stores data in a structured format: in rows and columns. PostgreSQL tables allow information to be quickly accessed and retrieved.
Creating a table using the PostgreSQL CREATE TABLE statement

The basic syntax of the PostgreSQL CREATE TABLE statement is as follows:

```sql
CREATE TABLE [IF NOT EXISTS] table_name (
   column1 datatype(length) column_constraint,
   column2 datatype(length) column_constraint,
   column3 datatype(length) column_constraint,
   table_constraints
);
```

In this syntax:

- Use the IF NOT EXISTS clause to make sure a table with the same name doesn't already exist in the database. If there is one, PostgreSQL will skip the command instead of raising an error.
- Enter column names separated by commas, and specify the data type, length, and constraints for each column.
- Indicate the table constraints, such as [PRIMARY KEY](https://blog.devart.com/postgresql-primary-key.html) and FOREIGN KEY.

Let's create a table called accounts:

```sql
CREATE TABLE accounts (
   user_id serial PRIMARY KEY,
   username VARCHAR ( 50 ) UNIQUE NOT NULL,
   password VARCHAR ( 50 ) NOT NULL,
   email VARCHAR ( 255 ) UNIQUE NOT NULL,
   created_on TIMESTAMP NOT NULL,
   last_login TIMESTAMP
);
```

Where:

- the NOT NULL constraint enforces that a column does not accept NULL values
- the UNIQUE constraint ensures the column doesn't contain repeated values

Using the LIKE option

PostgreSQL allows creating an empty table based on the definition of another table, including the column attributes and [indexes](https://blog.devart.com/postgresql-indexes.html) defined in the original table. To copy the table structure, use the PostgreSQL LIKE clause:

```sql
CREATE TABLE new_table_name (LIKE old_table_name INCLUDING ALL);
```

Creating a temporary table

PostgreSQL allows you to create temporary tables as well. A PostgreSQL temp table is an impermanent table that can be accessed only until the end of the database session. After that, the table is automatically dropped.
Use the CREATE TEMPORARY TABLE statement to create a PostgreSQL temp table:

```sql
CREATE TEMPORARY TABLE temp_table_name(
   column_list
);
```

Let's create the city_temp table:

```sql
CREATE TEMPORARY TABLE city_temp (
   city VARCHAR(80),
   street VARCHAR(80)
)
ON COMMIT DELETE ROWS;
```

Here, by adding ON COMMIT DELETE ROWS, we specify that the data should be removed from the temporary table at the end of each transaction.

See also: [How to duplicate a table in PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/postgresql-copy-table.html)

Creating a new table from the command line

SQL Shell (psql) is a command-line frontend to PostgreSQL. It allows entering, editing, and executing queries and statements, as well as viewing their results. To create a Postgres table from the command line, first launch SQL Shell. Next, connect to the database and execute the CREATE TABLE statement:

```sql
CREATE TABLE table_name (column_1 datatype, column_2 datatype);
```

If we execute the same CREATE TABLE statement one more time, we get an error message stating that the table already exists. Let's try to get the list of all tables in the database. For this, execute the following command:

```
\d
```

As you can see, the tutorials table has been successfully created.

The above methods to create a new table in PostgreSQL are quite straightforward. If you have basic knowledge of SQL, you can master them quickly. However, database developers and DBAs have to perform hundreds of similar tasks every day, and those tasks need to be done quickly and without errors. That's where professional database development tools become extremely handy. Let's look at one of the most convenient ones – dbForge Studio for PostgreSQL. With its help, PostgreSQL create-table jobs can be completed with a couple of clicks and minimal manual coding.
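The "table already exists" error seen above is exactly what the IF NOT EXISTS clause prevents. A quick way to confirm both behaviors, sketched with Python's built-in sqlite3 (which accepts the same clause); the tutorials table name echoes the psql example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# IF NOT EXISTS makes table creation idempotent: running the same
# statement twice does not raise "table already exists".
ddl = "CREATE TABLE IF NOT EXISTS tutorials (id INTEGER PRIMARY KEY, title TEXT)"
conn.execute(ddl)
conn.execute(ddl)  # no error the second time

# Without the clause, a second attempt fails.
try:
    conn.execute("CREATE TABLE tutorials (id INTEGER)")
    second_create_failed = False
except sqlite3.OperationalError:
    second_create_failed = True

print(second_create_failed)  # True
```

Note one difference from PostgreSQL: Postgres additionally emits a notice ("relation already exists, skipping") when it skips the statement.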
Creating a PostgreSQL table using dbForge Studio for PostgreSQL

dbForge Studio for PostgreSQL is an advanced solution designed to offer all the necessary tools for PostgreSQL database development and administration in a single IDE. The Studio boasts a user-friendly interface that allows people without a database background to cope with database tasks effectively. With dbForge Studio for PostgreSQL, you can easily create a table in SQL Editor and benefit from automatic syntax checking, context-sensitive code completion, and execution notifications along the way.

To create a table using dbForge Studio for PostgreSQL:

1. Launch the Studio and connect to the server.
2. In Database Explorer, right-click the database you want to create a table in and click New SQL.
3. In the SQL Editor that opens, type the CREATE TABLE statement.

In the process, dbForge Studio for PostgreSQL offers context-sensitive autocompletion so that you don't have to type all the code manually – just click to insert a suggestion into your statement. It also checks your code and highlights typos. To get quick information about objects in the script, simply hover the mouse over them.

Postgres table constraints

Constraints are special rules or restrictions for data in a table. PostgreSQL supports both table and column constraints: table constraints specify restrictions that apply to the whole table, while a column constraint affects only one specific column. PostgreSQL supports the following constraints: PRIMARY KEY, NOT NULL, UNIQUE, CHECK, and FOREIGN KEY. Let's consider some of them in more detail; we have already covered the UNIQUE and NOT NULL constraints above.

The PRIMARY KEY constraint indicates that the column is used to uniquely identify a record within a table. Thus, when creating a PostgreSQL table, it is important to remember that a PRIMARY KEY column can contain only unique (non-duplicate) values and cannot have NULLs.
Let's consider an example Postgres CREATE TABLE statement with a PRIMARY KEY:

```sql
CREATE TABLE orders (
   order_id integer NOT NULL,
   order_date date,
   quantity integer,
   notes varchar(200),
   CONSTRAINT orders_pk PRIMARY KEY (order_id)
);
```

Note that you cannot create a table with multiple primary keys, as that contradicts the very idea of a primary key. Instead, you can have a primary key that spans multiple columns (a composite primary key):

```sql
CREATE TABLE product_tags
( product_id INTEGER NOT NULL,
  tag_id SERIAL NOT NULL,
  production_date VARCHAR(20),
  tag_peni VARCHAR(20),
  item_number VARCHAR(20),
  PRIMARY KEY(product_id, tag_id)
);
```

FOREIGN KEY constraints are used to relate tables in a database. A foreign key comprises a column or a group of columns in one table that references the primary key column or columns in another table. In other words, the FOREIGN KEY constraint specifies that the values in a column must match values in another table. In this simple way, database referential integrity is maintained.

CHECK constraints are used to make sure that values in a column meet a specific requirement. CHECK constraints use a Boolean expression to evaluate values before they are inserted into a table. If a value doesn't pass the check, PostgreSQL won't insert it and will issue a constraint violation error:

```sql
CREATE TABLE prices
( id serial PRIMARY KEY,
  product_name VARCHAR (50),
  product_description VARCHAR (50),
  price numeric CHECK(price > 0)
);
```

Here we make sure that the price value must be greater than zero.

How to use the PostgreSQL CREATE TABLE AS statement

To create a new PostgreSQL table based on the results of a query, you can use the CREATE TABLE AS statement. In other words, the Postgres CREATE TABLE AS statement creates a new table and populates it with the data returned by a query.
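The CHECK behavior just described can be verified end to end. The sketch below uses Python's built-in sqlite3, which enforces the same CHECK syntax; the table mirrors the prices example above, minus the Postgres-specific serial type:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE prices (
        id INTEGER PRIMARY KEY,
        product_name VARCHAR(50),
        price NUMERIC CHECK (price > 0)
    )
    """
)

# A positive price passes the check and is stored.
conn.execute("INSERT INTO prices (product_name, price) VALUES ('book', 9.99)")

# A non-positive price fails the CHECK and is rejected.
try:
    conn.execute("INSERT INTO prices (product_name, price) VALUES ('flawed', -1)")
    check_rejected = False
except sqlite3.IntegrityError:
    check_rejected = True

row_count = conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]
print(check_rejected, row_count)  # True 1
```

Only the valid row survives: the failed INSERT leaves the table untouched, which is exactly the guarantee a CHECK constraint provides.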
See the basic syntax for CREATE TABLE AS:

```sql
CREATE TABLE new_table_name
AS query;
```

If you want to make sure that the table doesn't already exist, you can add the IF NOT EXISTS clause. In this case, the syntax is as follows:

```sql
CREATE TABLE IF NOT EXISTS new_table_name
AS query;
```

Let's now look at a worked example:

```sql
CREATE TABLE thrillers AS
SELECT
   movie_id,
   movie_title,
   production_year,
   imdb_rating
FROM
   movies
INNER JOIN movie_category USING (movie_id)
WHERE
   category_id = 5;
```

The OR REPLACE option for the CREATE TABLE statement

The OR REPLACE option on the CREATE TABLE statement is used in MariaDB databases to change the definition of an existing table; it replaces the old table with the newly defined one. In plain English, if you use CREATE OR REPLACE TABLE and the table already exists, no error is issued – the old table is dropped and the new one is created.

However, PostgreSQL doesn't support the OR REPLACE option on CREATE TABLE statements. In Postgres, OR REPLACE works with CREATE VIEW and CREATE FUNCTION, but not with CREATE TABLE. Instead, you may use the following method. Suppose you have a table:

```sql
CREATE TABLE t_table (
   pk INT PRIMARY KEY,
   txt VARCHAR(255)
);
```

Now, we want to replace it with another table. For this, we create a new table with the same structure, insert values into it, swap the names of the old and new tables, and then drop the old table:

```sql
CREATE TABLE "table_new" AS TABLE t_table;
INSERT INTO "table_new" (pk, txt) VALUES (1,'1');
ALTER TABLE t_table RENAME TO "table_old";
ALTER TABLE "table_new" RENAME TO t_table;
DROP TABLE "table_old";
```

Conclusion

In this article, we have explored the popular methods to create a new table in a PostgreSQL database and found that dbForge Studio for PostgreSQL offers advanced functionality for performing the task in the most convenient way.
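As a closing note, the create-populate-rename-drop workaround for the missing CREATE OR REPLACE TABLE can be run end to end. This is a portable sketch using Python's built-in sqlite3; SQLite lacks the Postgres-only `AS TABLE` shorthand, so an equivalent `AS SELECT ... WHERE 0` produces the empty structural copy, and column constraints are not carried over by this form:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_table (pk INTEGER PRIMARY KEY, txt TEXT)")
conn.execute("INSERT INTO t_table VALUES (99, 'old')")

# Build the replacement, populate it, then swap the names and
# drop the original -- the same sequence as in the article.
conn.execute("CREATE TABLE table_new AS SELECT * FROM t_table WHERE 0")  # empty copy
conn.execute("INSERT INTO table_new (pk, txt) VALUES (1, '1')")
conn.execute("ALTER TABLE t_table RENAME TO table_old")
conn.execute("ALTER TABLE table_new RENAME TO t_table")
conn.execute("DROP TABLE table_old")

rows = conn.execute("SELECT pk, txt FROM t_table").fetchall()
print(rows)  # [(1, '1')]
```

After the swap, queries against t_table see only the replacement data; in production you would wrap the whole sequence in a transaction so readers never observe the intermediate state.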
[Download a 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) of dbForge Studio for PostgreSQL and check it yourself.

[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Use CREATE TABLE in SQL Server By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) October 24, 2024

Creating a table is at the core of database design. Data is stored in tables, and the table structure with its internal relations allows us to organize that data effectively. It is impossible to work with databases without creating and configuring tables, so this is a fundamental skill for all database professionals. There are standard methods of creating tables, and tips that help us do it faster and more accurately. This article reviews these methods and tips for Microsoft SQL Server – one of the world's most popular database management systems.

Contents

- The basics of creating database tables
- The CREATE TABLE statement
- CREATE TABLE with a primary key
- CREATE TABLE with a foreign key
- CREATE TABLE from another table
- CREATE TABLE if it does not exist
- CREATE a temp table
- The advantages of using GUI tools for creating tables
- Conclusion

The basics of creating database tables

A database table is a structure that contains data organized in rows and columns. Tables have descriptive names, and so do their columns. Besides, each column is assigned a data type that defines which values the column can store. SQL Server provides the following options for creating tables:

- The CREATE TABLE command: the standard method of creating a SQL Server table. Here we can specify columns and data types, set constraints, and define other table properties. It also allows developers to save the script and reuse it whenever needed, even automatically.
- The SELECT…INTO command: this method creates a new table from an existing one based on the result set of a SELECT query.
The resulting table inherits the structure of the source table, whether or not it contains any records. This method provides a convenient way to generate a new table with the same structure as the original one.

GUI-based software tools (SSMS or third-party solutions): graphical user interfaces are favored by both database experts and regular users, as they streamline processes and eliminate errors caused by manual coding. SQL Server Management Studio (SSMS) is the default solution provided by Microsoft.

This article demonstrates how to create new tables in SQL Server with dedicated scripts. However, we'll also use GUI tools to illustrate our work – namely [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), a more powerful and robust alternative to SSMS that lets us design database tables in several clicks.

The CREATE TABLE statement

Syntax

The basic syntax for creating a new table in SQL Server is:

```sql
CREATE TABLE [database_name.][schema_name.]table_name (
   column_name1 data_type [NULL | NOT NULL],
   column_name2 data_type [NULL | NOT NULL],
   column_name3 data_type [NULL | NOT NULL],
   ...,
);
```

Note the following parameters:

- database_name and schema_name – optional parameters that define the names of the database and the database schema where you are creating the new table. If they aren't specified explicitly, the query is executed against the current database and its default schema.
- table_name – the name of the table you are creating. The maximum length of a table name is 128 characters (except for local temporary tables – we'll review them later in this article). It is recommended to use descriptive names to make tables easier to manage.
- column_name – the name of a column in the table. Most tables contain multiple columns, and we separate column names in the CREATE TABLE script with commas.
- data_type – the data type for each column, indicating which values that particular column will store.
- NOT NULL – an optional parameter specifying that the column cannot contain NULL values. If it is not set, the column allows NULL values.

The CREATE TABLE statement can be significantly more intricate and incorporate a wider array of parameters; this syntax is the simplest variant. But for now, let us see how the basic syntax works.

Have you ever considered using a [table designer tool](https://www.devart.com/dbforge/sql/studio/table-designer.html) to create new SQL tables or edit table data visually in the grid? If not, it's about time to give it a try!

Example

Assume we want to create a table in a shop database with information about regular customers:

```sql
CREATE TABLE sales.customers (
   customer_id INT IDENTITY (1, 1) PRIMARY KEY,
   first_name VARCHAR (255) NOT NULL,
   last_name VARCHAR (255) NOT NULL,
   phone VARCHAR (25),
   email VARCHAR (255) NOT NULL,
   street VARCHAR (255),
   city VARCHAR (50),
   state VARCHAR (25),
   zip_code VARCHAR (5)
);
```

This is what it looks like in SSMS and dbForge Studio for SQL Server, respectively. Check the list of the [best free SQL database software](https://blog.devart.com/best-free-sql-database-software.html) to try in 2025.

That's how we create a new table in an existing SQL Server database.

CREATE TABLE with a primary key

The primary key is a constraint that uniquely identifies each record in a table. It is not mandatory, but it is present in most tables – most likely, we'll need it too.
Syntax

A primary key has the following characteristics:

- Contains unique values only
- Only one primary key is allowed per table
- Can't contain NULL values
- Consists of one or several columns

Thus, the basic syntax for this case is:

```sql
CREATE TABLE [database_name.][schema_name.]table_name (
   column_name1 data_type NOT NULL PRIMARY KEY,
   column_name2 data_type [NULL | NOT NULL],
   column_name3 data_type [NULL | NOT NULL],
   ...,
);
```

Example #1

To create a table in SQL Server with a primary key, we add the PRIMARY KEY keyword to the respective column after its name and data type:

```sql
CREATE TABLE production.categories (
   category_id INT IDENTITY (1, 1) PRIMARY KEY,
   category_name VARCHAR (255) NOT NULL
);
```

Example #2

A primary key can be set on any column or combination of columns:

```sql
CREATE TABLE sales.customers (
   first_name VARCHAR (255) NOT NULL,
   last_name VARCHAR (255) NOT NULL,
   phone VARCHAR (25) NOT NULL,
   email VARCHAR (255) NOT NULL,
   street VARCHAR (255),
   city VARCHAR (50),
   state VARCHAR (25),
   zip_code VARCHAR (5),
   CONSTRAINT PK_Customer PRIMARY KEY (first_name, last_name, phone, email)
);
```

In the above example, we create a table with a primary key that spans four columns: the first name, the last name, the phone number, and the email address. This combination is used to identify each record in the table.

CREATE TABLE with a foreign key

The foreign key constraint is an essential element of relational databases – it creates a relation between tables by referring to a primary key defined on another table. As a result, the two tables are linked together. The table with the primary key is called the parent table, and the table with the foreign key is called the child table. The values used by the foreign key in the child table must exist in the parent table. It is common practice to create a table in SQL Server with a foreign key right away, to relate it to another table and keep the schema organized.
Syntax

The basic syntax for this case is:

```sql
CREATE TABLE [database_name.][schema_name.]table_name (
   column_name1 data_type NOT NULL PRIMARY KEY,
   column_name2 data_type [NULL | NOT NULL],
   column_name3 data_type [NULL | NOT NULL],
   ...,
   FOREIGN KEY
      ( column_name [ ,... n ] )
      REFERENCES referenced_table_name [ ( ref_column [ ,... n ] ) ]
);
```

Here, we take the general syntax from the previous case, add a foreign key constraint, and indicate the referenced table and column.

Example

Assume we want to create a table with information about products. It will be a child table of the production.categories table, so we'll define a foreign key on it:

```sql
CREATE TABLE production.products (
   product_id INT IDENTITY (1, 1) PRIMARY KEY,
   product_name VARCHAR (255) NOT NULL,
   brand_id INT NOT NULL,
   category_id INT NOT NULL,
   model_year SMALLINT NOT NULL,
   list_price DECIMAL (10, 2) NOT NULL,
   FOREIGN KEY (category_id) REFERENCES production.categories (category_id)
);
```

This way, we create a table with a foreign key in SQL Server and relate two tables (production.products and production.categories). The product_id column is the primary key of the production.products table, and the category_id column is the foreign key referencing the category_id column in the parent production.categories table.

CREATE TABLE from another table

Syntax

Creating a new table from an existing table is common. We use the SELECT…INTO statement for that: it fetches columns from an existing table and inserts them into a new table.

```sql
SELECT column1, column2, column3
INTO [external_db.][schema_name.]new_table
FROM [database_name.][schema_name.]old_table
WHERE condition;
```

Note the WHERE clause, which can be used to specify which data you want to retrieve and save in the new table.

Example #1

Now let's see how it works. For example, let's create a comprehensive list of customers' addresses:
```sql
SELECT street, city, state, zip_code
INTO address_dictionary
FROM BikeStores.sales.customers;
```

Example #2

One scenario where SELECT INTO comes in handy is creating empty tables with a specific structure. For that, we take our basic syntax and add the WHERE clause with the condition 1 = 0:

```sql
SELECT column1, column2, column3, ...
INTO new_table
FROM old_table
WHERE 1 = 0;
```

This condition ensures that the query won't copy any data from the source table. It creates an empty table with the same structure as the original one, and you can then populate its columns with your own data:

```sql
SELECT *
INTO audit_orders
FROM sales.orders o WHERE 1 = 0;
```

However, indexes, constraints, and triggers aren't transferred by SELECT INTO. If you need them in the new table, you have to add them separately.

CREATE TABLE if it does not exist

Before creating a new table in a database, it is helpful to check whether such a table already exists. And here is the issue: Microsoft SQL Server does not support the IF NOT EXISTS clause in CREATE TABLE queries. Should the database contain a table with the same name, the command to create a new one will fail.

Syntax

Is there an alternative to CREATE TABLE IF NOT EXISTS in SQL Server? The recommended solution is the OBJECT_ID() function:

```sql
IF OBJECT_ID(N'table_name', N'U') IS NULL
CREATE TABLE table_name (
   column_name1 data_type [NULL | NOT NULL],
   column_name2 data_type [NULL | NOT NULL],
   column_name3 data_type [NULL | NOT NULL],
   ...,
);
```

Example

Here, N'U' specifies the object type – a user-defined table. If the object does not exist in the database, the function returns NULL, which is the condition for creating the new table. Assume we want a new table called sales.stores. Let's check whether it exists before executing the query to create it.
```sql
IF OBJECT_ID(N'sales.stores', N'U') IS NULL
CREATE TABLE sales.stores (
   store_id INT IDENTITY (11, 1) PRIMARY KEY,
   store_name VARCHAR (255) NOT NULL,
   phone VARCHAR (25),
   email VARCHAR (255),
   street VARCHAR (255),
   city VARCHAR (255),
   state VARCHAR (10),
   zip_code VARCHAR (5)
);
```

The CREATE TABLE command succeeds, and we have a new sales.stores table in our database.

CREATE a temp table

In SQL Server, a temporary (temp) table holds a portion of data extracted from a regular table. It can be used and reused during a particular session, but it is deleted when that session ends or the database connection is terminated. Temp tables are convenient when we regularly work with a subset of the records kept in the database: we can retrieve that data, process it as needed, and store it in a temporary table. Temp tables live in the tempdb system database, and we can operate on them the same way as on regular tables. They are also significantly faster to load.

Syntax

To create a temp table in SQL Server, we can use the SELECT…INTO command – it is the simplest approach:

```sql
SELECT column1, column2, column3, ...
INTO #new_table
FROM old_table
WHERE condition;
```

Important: a temp table name always starts with the hash symbol (#), and the maximum name length is 116 characters.

Example #1

Let's create a temp table based on this syntax, listing orders placed by customers whose state is NY:

```sql
SELECT
   o.order_id,
   o.customer_id,
   o.order_status,
   o.order_date,
   o.required_date,
   o.shipped_date,
   o.store_id,
   o.staff_id
INTO #temp_sales_orders
FROM sales.orders o
JOIN sales.customers c ON o.customer_id = c.customer_id
WHERE c.state = 'NY';
```

Example #2

Another way to create a temp table in SQL Server is the CREATE TABLE statement. It works the same way as in the earlier examples of creating regular tables.
You only need to begin the table name with the hash symbol (#).

```sql
CREATE TABLE #temp_sales_orders (
    order_id INT IDENTITY (1, 1) PRIMARY KEY,
    customer_id INT,
    order_status TINYINT NOT NULL,
    order_date DATE NOT NULL,
    required_date DATE NOT NULL,
    shipped_date DATE,
    store_id INT NOT NULL,
    staff_id INT NOT NULL
);
```

Then, we insert records into this table and work with it as required. When the session is over, the table is automatically deleted.

Example #3

In some work scenarios, we need to create a temporary table in SQL Server and make it accessible to other users. The solution is a global temporary table, visible to all users and their sessions. To create one, we use the CREATE TABLE command and mark the table name with two hash symbols: ##table_name.

```sql
CREATE TABLE ##temp_2024_sales_orders (
    order_id INT IDENTITY (1, 1) PRIMARY KEY,
    customer_id INT,
    order_status TINYINT NOT NULL,
    order_date DATE NOT NULL,
    required_date DATE NOT NULL,
    shipped_date DATE,
    store_id INT NOT NULL,
    staff_id INT NOT NULL
);
```

Global temporary tables are also stored in the tempdb system database. They remain there until all users who refer to the particular temp table complete their sessions or close their connections to the database.

The advantages of using GUI tools for creating tables

Table design is one of the most common database tasks, and modern GUI tools help database specialists handle it quickly and efficiently. To support this article, we used dbForge Studio, one of the most popular and powerful IDEs for database-related work in SQL Server. It simplifies table design significantly by moving the work into a visual interface, while also providing a powerful SQL Editor to write SQL queries and execute them against the database directly.
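The patterns covered above (the structure-only copy with WHERE 1 = 0, the existence check before CREATE TABLE, and a session-scoped temp table) are easy to verify programmatically. The sketch below uses Python’s built-in sqlite3 module rather than SQL Server, so it runs anywhere: CREATE TABLE ... AS SELECT stands in for SELECT INTO, a sqlite_master lookup stands in for OBJECT_ID(), and the TEMP keyword replaces the # prefix. All table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway database for the demo
cur = conn.cursor()

# A stand-in for the sales.orders source table (illustrative columns).
cur.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
    "customer_id INTEGER, order_status INTEGER)"
)
cur.execute("INSERT INTO orders VALUES (1, 10, 4), (2, 11, 1)")

# Pattern 1: structure-only copy. WHERE 1 = 0 matches no rows, so the new
# table gets the source's columns but none of its data
# (T-SQL: SELECT * INTO audit_orders FROM sales.orders WHERE 1 = 0).
cur.execute("CREATE TABLE audit_orders AS SELECT * FROM orders WHERE 1 = 0")
print(cur.execute("SELECT COUNT(*) FROM audit_orders").fetchone()[0])  # 0

# Pattern 2: create only if the table does not exist yet. This catalog
# query plays the role of OBJECT_ID(N'...', N'U') IS NULL in SQL Server.
exists = cur.execute(
    "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?",
    ("audit_orders",),
).fetchone()
if exists is None:
    cur.execute("CREATE TABLE audit_orders (order_id INTEGER)")
print(exists is not None)  # True: the table exists, so no CREATE runs

# Pattern 3: a temp table scoped to this connection (SQLite uses the
# TEMP keyword instead of the T-SQL # prefix).
cur.execute(
    "CREATE TEMP TABLE temp_sales_orders AS "
    "SELECT * FROM orders WHERE order_status = 4"
)
print(cur.execute("SELECT COUNT(*) FROM temp_sales_orders").fetchone()[0])  # 1
```

As the article notes for SELECT INTO, the structure-only copy here also carries over columns but not constraints such as the PRIMARY KEY, so those would need to be added separately.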
For instance, you can use any standard commands like [ALTER TABLE to add multiple columns](https://blog.devart.com/create-table-in-sql-server.html) to an existing table and modify tables in other ways, or visit the website to learn [how to list all tables in SQL](https://www.devart.com/dbforge/sql/studio/show-tables-in-sql-server-database.html). [Table Designer](https://www.devart.com/dbforge/sql/studio/table-designer.html) allows you to create and modify tables visually. It has all the options to define table columns, data types, constraints, relationships, and other properties. Instead of writing complex SQL scripts manually, you can [complete the task with several clicks](https://www.youtube.com/watch?v=gZ2l0OC5JLA). Once the table structure is defined visually, Table Designer can generate a SQL script that you can execute against the database to create the table, or save for further reference. dbForge Studio for SQL Server is available for a [30-day free trial](https://www.devart.com/dbforge/sql/studio/download.html), so you can explore the entire range of its capabilities and see whether it’s the best fit for your routine database development and management needs.

Conclusion

Tables are a fundamental component of any relational database: they make it possible to both store and organize data. That’s why understanding and mastering the approaches to creating tables is critical for database developers and admins. Professional expertise and the right tools help them raise their effectiveness and productivity, and this combination can’t be beaten.
Tags [create table in SQL Server](https://blog.devart.com/tag/create-table-in-sql-server) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms. 
"} {"url": "https://blog.devart.com/create-upload-and-store-images-in-sql-server.html", "product_name": "Unknown", "content_type": 
"Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Create Images with Microsoft Designer and Import Them into SQL Server Databases Using dbForge Studio By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) October 13, 2023 It’s high time we explored one of Microsoft’s most curious offerings— Microsoft Designer , a web application that applies artificial intelligence to create visuals based on your suggested text. In other words, you describe what kind of image you would like to get, and Microsoft Designer effortlessly generates a number of images to choose from. Additionally, we’ll show you the easiest way of importing images into a SQL Server database. Contents Introduction to graphics AI What is Microsoft Designer? Introduction to dbForge Studio for SQL Server Importing images into a SQL Server database using dbForge Studio Introduction to graphics AI Now, let’s get it straight—no graphics AI can replace a human designer, and in the near future, it’s not likely that it will. This is especially true when it comes to more complex areas such as Web design, motion design, industrial design, you name it. However, if there is one branch where AI can achieve impressive results with minimum human intervention, it’s graphic design. Sure, you need to articulate your ideas well to help neural networks deliver what you want. But the chances are that, if your graphics AI gets it right, the output will not require all that many corrections and adjustments. What is Microsoft Designer? 
This leads us to [Microsoft Designer](https://designer.microsoft.com/) , a web-based, AI-powered graphic tool that helps create unique, variegated visuals for whatever purpose, be it presentations, infographics, banners for posts on your blog or social media, newsletters, or other marketing materials such as ads, posters, brochures, and postcards.

The key features of Microsoft Designer:

- Predesigned yet customizable design templates
- A wide variety of design elements, including text, icons, shapes, and images
- Basic image editing tools
- Collaboration with other users

Now let’s take a look at some of these features in action.

How to generate images with Microsoft Designer

1. Your first step is a text box where you need to enter the description of an image you would like to get. The main principle is fairly similar to that of ChatGPT—if you provide the AI with detailed input, you are more likely to get exactly what you need. Additionally, you can use the Try a prompt section with a selection of customizable templates. So, once your description is ready, click Generate .
2. In a while, you get a few designs to choose from. Once you make up your mind—or try a different description—select the design you like and either click Download , or, if you want to tweak it a bit, click Customize design .
3. We’ll go with the latter option and proceed to customization.
4. Now let’s see what we have here. First, if the generated design comes with text, you can edit it the way you want and pick the preferred font style. You can also copy and paste custom images into your newly created design—for instance, you can add your logo to a banner. If you want to do that, click My media . 
You also get your basic set of image editing features that help you do the following:

- Reposition and crop images
- Adjust the opacity, brightness, contrast, saturation, temperature, and sharpness of images
- Remove, replace, or blur the background
- Add effects

Finally, you can add suggested Visuals to your design. This may also come in handy. In our case, we’ll remove everything and go with a plain image of a bike against a white background.

5. Once you finish editing, you can click Download , select the file type, and get your image in the preferred way. Note that you can use AI to generate captions or hashtags to publish with the image.

Now that we’ve got the image at hand, let’s see how we can upload it to a SQL Server database.

Introduction to dbForge Studio for SQL Server

Arguably, the easiest way to manage databases is to adopt an integrated development environment, a single toolkit with virtually everything you might need feature-wise and a clean user interface to make it complete. We’ve got one compelling example that we’ll use today — [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) . The list of its key features and capabilities is quite long (that’s why, if you want to have a look, you’ll find it at the end of this article), so we’ll focus on Data Editor, a built-in tool that will help us import the image we have generated. Data Editor is your best bet when it comes to convenient management of table data. This includes viewing, sorting, grouping, filtering, and editing data directly in the grid. And, as we said, it helps import images. Let’s see how it’s done.

Importing images into a SQL Server database using dbForge Studio

1. In our case, we’ll import our image into the AdventureWorks2022 sample database. And since we’re dealing with a bike, the production.ProductPhoto table will make a perfect destination. This is what it looks like, with one of the default bike images shown in Data Viewer. 
And since you will need the grid to insert your own images, you can just as well use a SELECT query on the required table to get it.

2. Now make sure you switch from the default read-only mode to editing. To do that, click the corresponding button on the toolbar and switch from Read Only to the table you are working with.
3. Find the cell that you want to insert your image into. In our case, we’ll simply add a new record with NULL values by clicking Append (the green plus icon under the grid).
4. To insert your image, go to the required cell, invoke the dropdown menu, and click Load Data .
5. Select All images in the Files of type field, find the downloaded image, and click Open .
6. The image will be uploaded to the cell. Double-click it, and you’ll see your image in Data Viewer.

That’s it! Easier than ever. Download dbForge Studio for a free 30-day trial today! Naturally, the capabilities of the Studio go far beyond image import, so now, just as we promised, we’ll give you a list of its key features that can make your daily work with databases most effective. 
- Visualization of database structures on ER diagrams
- Visual table design
- Enhanced SQL coding: context-aware completion, syntax validation, formatting, refactoring, and debugging
- Profiling-based query optimization
- Visual query building that does not require coding
- Identification of differences in database schemas and table data
- Quick synchronization of source and target databases
- Data analysis: data aggregation in visual pivot tables, observation of related data in Master-Detail Browser, and generation of detailed data reports
- Generation of realistic test data
- Generation of full database documentation
- Administration: database and server monitoring, backup/recovery, user and session management, and more

That said, we gladly invite you to get started with dbForge Studio for SQL Server today—just [download it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and give it a go to see the full power of its capabilities. We bet they won’t leave you indifferent. Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/creating-a-new-database-in-mysql-tutorial-with-examples.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Creating a New Database in MySQL: Tutorial with Examples By [dbForge 
Team](https://blog.devart.com/author/dbforge) May 18, 2021 The article provides a detailed overview of how to create a database in MySQL using different methods and tools (including the Command Line, Workbench, and dbForge Studio for MySQL). MySQL is a relational database management system based on SQL. It is developed, distributed, and supported by the Oracle Corporation. MySQL is free, open-source software that keeps gaining popularity due to its reliability, compatibility, cost-effectiveness, and comprehensive support. MySQL has a fork, MariaDB, made by the original developers of MySQL. MariaDB has the same database structure and indexes, which allows it to serve as a drop-in replacement for MySQL. MySQL and MariaDB both support a number of popular operating systems, including but not limited to the following: Linux, [Ubuntu](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-ubuntu/) , Microsoft Windows, and [macOS](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-macos/) . Also, it’s easy to [install MySQL Server on Debian](https://www.devart.com/dbforge/mysql/install-mysql-on-debian/) using standard tools. Before you can start creating a new MySQL database, you need to [download MySQL Server](https://dev.mysql.com/downloads/mysql/) . In this article, we study the different ways to create a MySQL database. Contents 1. CREATE DATABASE: MySQL syntax example 2. Create a database from the Command Line Client 3. Create a database using MySQL Workbench 4. Create a database using dbForge Studio for MySQL CREATE DATABASE: MySQL syntax example The first way to create a database in MySQL is the CREATE DATABASE statement. This statement creates a database with the specified name. Please remember that to use it, you need the CREATE privilege for the database. 
```sql
CREATE DATABASE mydatabase;
```

Note: You will get an error if you run the CREATE DATABASE statement without specifying IF NOT EXISTS and the database already exists. So it’s better to use the IF NOT EXISTS clause to prevent errors.

```sql
CREATE DATABASE IF NOT EXISTS mydatabase;
```

After you execute the CREATE DATABASE statement, MySQL will return a message notifying you whether the database has been created successfully.

Create a database from the Command Line Client

MySQL Command Line Client usually comes with the MySQL Server installation pack. It is installed in two versions – with support for UTF-8 and without it. You can run the console client right from the Start menu. To create a new MySQL database via MySQL Command Line Client:

1. Run the client.
2. Enter your password.
3. Execute the create database command.

You can learn more about working with MySQL Command-Line Client in our [How to Connect to MySQL Server](https://blog.devart.com/how-to-connect-to-mysql-server.html) article.

How to create a database in MySQL Workbench

MySQL Workbench is a popular visual database tool for designing, developing, and administering MySQL databases. How to use MySQL Workbench to create a database:

1. Launch MySQL Workbench and click the + button to open the Setup New Connection wizard.
2. Enter the name for the connection and the username, then click Test Connection . Enter the password in the dialog that asks for it. In our case, we enter localhost and root .
3. Click the required connection in the MySQL Connections section of the Workbench start page.
4. In the MySQL Workbench window that opens, click the Create a new schema in the connected server button on the main toolbar. Then enter the schema name, change the character set and collation if necessary, and click Apply .
5. In the Apply SQL Script to Database window that opens, click Apply . Then click Finish .
6. Check that the database has appeared in the Navigator. 
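The error-avoidance behavior of the IF NOT EXISTS clause described above is easy to see from code. The sketch below uses Python’s built-in sqlite3 module instead of a MySQL connection, so it runs without a server; SQLite supports the same IF NOT EXISTS clause (demonstrated here at the table level, since SQLite has no CREATE DATABASE statement), and the table name is illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # with a file path, connect() creates the database file
cur = conn.cursor()

cur.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
# Running the same statement again is a silent no-op instead of an error,
# which is exactly what the IF NOT EXISTS clause buys you in MySQL.
cur.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")

# Without the clause, a duplicate CREATE fails with an error.
try:
    cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
except sqlite3.OperationalError as e:
    print(e)  # table customers already exists
```

The same pattern holds for MySQL’s CREATE DATABASE: without IF NOT EXISTS, a duplicate name raises an error; with it, the statement completes with a warning instead.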
Workbench today is one of the most popular professional tools for MySQL database development. However, [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a worthy rival that, in many aspects, boasts superior functionality. Let’s look at how you can create a new MySQL database with dbForge Studio for MySQL.

How to create a database using dbForge Studio for MySQL

dbForge Studio for MySQL offers a simple and intuitive way to create a new database in MySQL. You don’t need to be a professional developer or DBA to [get started with dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/getting-started.html) .

1. First, you need to create the required connection. Click the New Connection button on the Database Explorer toolbar. Alternatively, go to the Database menu in the main toolbar and click New Connection .
2. In the Database Connection Properties window that opens, provide all the necessary credentials for your connection.
3. The new connection will appear in the Database Explorer. Right-click the connection name and select New Database . Alternatively, go to the Database menu in the main toolbar and click New Database .
4. In the New Database tab that opens, enter the name for your new database and select the charset and collation. You can check the script for the database in the lower part of the window. Click Apply Changes once you’ve configured everything as required.
5. Check that your newly created database has appeared on your MySQL server. To do this, right-click the connection name in the Database Explorer and click Refresh . 
After Database Creation

When you have multiple databases on your MySQL server, use the following statement to start working with the database you’ve created:

```sql
USE database_name;
```

To create a new table in the database, use the following syntax:

```sql
CREATE TABLE [IF NOT EXISTS] table_name(
    column_1_definition,
    column_2_definition,
    ...,
    table_constraints
) ENGINE=storage_engine;
```

In the following picture, you can see how you can create a MySQL table in dbForge Studio for MySQL. The Studio has an automatic code completion feature, so you won’t have to type all the code. To learn more about creating tables in MySQL, please refer to our [How to Create Tables using MySQL CREATE TABLE Statement and via GUI Tool for MySQL](https://blog.devart.com/mysql-create-table-query.html) article. Along with our deep dive into MySQL databases, we’ve incorporated a helpful guide on [how to show or list tables in a MySQL database](https://www.devart.com/dbforge/mysql/studio/show-tables-list-in-mysql.html) , equipping you with a crucial skill in database management.

To delete a database, use the following statement:

```sql
DROP DATABASE database_name;
```

dbForge Studio for MySQL also allows deleting a database visually without having to write code. Just right-click the required database in the Database Explorer, then click Delete . Learn how to master [creating views in MySQL](https://blog.devart.com/how-to-create-a-view-in-mysql.html) with best practices and tips in this article.

Conclusion

In this article, we have walked through the ways to create a MySQL database and shown that dbForge Studio for MySQL is the [best alternative to MySQL Workbench](https://www.devart.com/dbforge/mysql/studio/alternative-to-mysql-workbench.html) . As you can see, dbForge Studio for MySQL allows creating a new database quickly and easily in a convenient and modern interface. And the dbForge Support Team is always ready to help you in case there are any difficulties. 
Moreover, we provide a fully functional 30-day free trial of our products. [Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) and test all its features to their full capacity. Also, you can watch this video tutorial: Tags [create database in MySQL syntax](https://blog.devart.com/tag/create-database-in-mysql-syntax) [create database MySQL](https://blog.devart.com/tag/create-database-mysql) [create database MySQL Workbench](https://blog.devart.com/tag/create-database-mysql-workbench) [dbForge Team](https://blog.devart.com/author/dbforge) 
"} {"url": "https://blog.devart.com/creating-a-tfvc-repository-in-azure-devops-and-linking-it-to-source-control.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Creating a TFVC Repository in Azure DevOps and Linking It to Source Control By [dbForge Team](https://blog.devart.com/author/dbforge) April 29, 2021 During software development, it is critical to maintain a structured and traceable history of changes. This can be easily achieved with version control systems that allow you not only to version-control data but also commit, track or roll back changes, resolve conflicts, switch between branches or manage merge operations. In the article, we’ll guide you through the step-by-step process of establishing a TFVC repository in Azure DevOps and linking your database to it using the dbForge Source Control tool. It helps you version control your database and keep a history of changes. In addition, we’ll cover some basics of Azure repositories, delve into TFVC as a version control system, and outline the importance of using them from the beginning of your project development. Contents Why do we need Azure Repos? What is TFVC? Introduce version control comparison Early adoption emphasis Getting started with Azure DevOps Linking a database to source control Why do we need Azure Repos? Azure DevOps provides end-to-end DevOps tools that assist teams in planning, working together on code development, and deploying applications. 
The platform incorporates various features to help you boost productivity and facilitate your database development and deployment. In this article, we want to focus on a specific service integrated into Azure DevOps that provides free cloud-hosted repositories. Azure Repos allows you to create your repository using Git or Team Foundation Version Control (TFVC). Using a version control system contributes significantly to the success of your project. With its help, you can stay abreast of the changes made to the code, check the development history, and roll back the changes to any version if needed. Apart from that, a version control system establishes flawless coordination across the teams working on the same project.

What is TFVC?

Team Foundation Version Control (TFVC) is a centralized version control system that delivers all the required features to support database development. As a rule, every team member gets only one version of each file on the dev machine. The history of changes is stored only on the server, and branches are path-based and created on the server. Using TFVC, you can implement granular permissions and limit access down to the file level. And since your team checks all their work into your Team Foundation server, you can quickly examine changes, find out which user introduced a changeset, and determine the exact changes the user made.

Introduce version control comparison

Given that Git is the most widely used version control system, let’s outline the main differences between TFVC and Git: TFVC adheres to a centralized version control model featuring a client-server architecture, while Git uses a distributed approach. This difference gives Git greater flexibility, allowing you to seamlessly work with a local copy of the remote repository and quickly execute tasks such as branch switching or merging. TFVC stores changes on a per-file basis tracked through changesets. 
In contrast, Git captures a committed file as a snapshot, facilitating straightforward revert or undo operations. For a more detailed comparison of TFVC and Git, see the [Microsoft documentation](https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/comparison-git-tfvc?view=azure-devops) . Which version control system to choose for database versioning depends on the development team’s preferences and the specific requirements of the project. While TFVC excels in structured branching, Git’s flexibility and comprehensive tooling make it a preferred choice for modern development workflows, especially when seamless database versioning integration is crucial.

Early adoption emphasis

Let’s now discuss the early adoption of version control, especially TFVC for database versioning. It is worth saying that this is not merely a best practice but a strategic investment in the success and sustainability of software development projects. Here are key points highlighting the importance of early adoption:

- Change tracking and history: TFVC allows developers to track changes made to the database over time. Every modification is recorded, providing a comprehensive history of alterations to the database schema. In the long run, this history helps with understanding database development and troubleshooting issues.
- Collaboration and teamwork: Early adoption of TFVC allows developers to work concurrently on different aspects of the database without the risk of conflicting changes. Version control helps merge these changes seamlessly, promoting a collaborative and efficient development environment.
- Risk mitigation: Version control allows developers to roll back changes in case of errors or unexpected issues, thus reducing the risk associated with database modifications and providing a reliable mechanism for recovering from unforeseen challenges. 
- Consistency across multiple environments: With version control, developers can ensure consistency between database versions across different environments, such as development, testing, and production. This capability ensures that everyone works with the same database schema baseline, minimizing the likelihood of deployment issues.
- Code reviews and auditing: TFVC facilitates code reviews by providing a clear view of the changes made to the database. This enhances code quality by allowing team members to review modifications, provide feedback, and ensure that coding standards are adhered to. Additionally, version control logs can serve as an audit trail for compliance and regulatory purposes.
- Automated build and deployment: Version control is a fundamental element for implementing automated build and deployment pipelines. By integrating TFVC into these processes, developers can automate the testing and deployment of database changes, reducing manual errors and accelerating the release cycle.
- Adaptation to change: Early adoption of TFVC instills a version control mindset within the development team. This mindset is essential as projects evolve, teams grow, and requirements change. Developers accustomed to version control are better equipped to handle the dynamic nature of software development.
- Continuous Integration and Continuous Delivery (CI/CD): TFVC is integral to implementing CI/CD practices, allowing for the continuous integration and delivery of database changes. Early adoption ensures that the development process aligns with modern DevOps practices, leading to faster and more reliable software releases.

As projects grow in complexity, having a robust version control system like TFVC becomes increasingly essential. Early adoption establishes a solid foundation for scaling development efforts and accommodating the evolving needs of the project. 
Getting Started With Azure DevOps

To begin, create a new Azure DevOps project by going to your [Azure DevOps account](https://azure.microsoft.com/en-us/products/devops/) and clicking Start free to sign up. Next, specify the country/region where you reside and click Continue.

After that, enter the name of your Azure DevOps organization, select where you would like your projects to be hosted, type the verification characters, and click Continue.

Done! The organization has been created, so let's now build your project. In your organization, https://dev.azure.com/YourOrganizationName, click + New project in the upper-right corner of the page. In the Create new project dialog that opens, specify the project details:

- In the Project name field, enter the name of your project.
- Optional: In the Description field, add detailed information describing your project.
- Under Visibility, select Public to make your project visible to anyone on the Internet, or Private to make it visible only to the users to whom you grant access.
- Expand the Advanced options, select Team Foundation Version Control as the version control system, and choose a [process template](https://docs.microsoft.com/en-us/azure/devops/boards/work-items/guidance/choose-process?view=azure-devops&tabs=basic-process) based on your needs. The available templates are Basic, Agile, CMMI, and Scrum.
- Click Create to create the project.

Congrats! You have created a project and a new repository in Azure DevOps. Note that the name of the repository corresponds to the name of the project. If the project is private, add the users who should have access to your project repository. This step is important for further linking your database to source control.

Linking a database to source control

Now that you are all set, you can use a reliable tool to manage code changes in your database project.
We'll use [dbForge Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/), which integrates into SQL Server Management Studio (SSMS) as an add-in. The tool helps version-control database schemas and static table data while preserving database integrity. To follow the tutorial, [install the product](https://www.devart.com/dbforge/sql/source-control/download.html) and open SQL Server Management Studio. Additionally, you can watch [this video](https://youtu.be/reU4ALv2ctg) to discover how dbForge Source Control fits into the DevOps process.

[To link your database to a TFVC repository](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-tfs.html), in Object Explorer, right-click the database you want to link to source control and select Link Database to Source Control. In the Link Database to Source Control dialog that opens, click the plus icon in the Source control repository field to establish a connection to your repository.

In the Source Control Repository Properties dialog that opens, select Team Foundation Version Control (TFVC) from the Source control system dropdown list and specify the Server URL, for example, https://dev.azure.com/Organization_Name. After that, choose the authentication type. Since Base authentication is not recommended due to its low security level, you should opt for either Token or Web authentication. For Token authentication, you will need to obtain an authorization token in the Azure DevOps user settings. If you opt for Web authentication, enter the Username and Password for your web account.

Next, specify the Database folder name, which is the name of your Azure DevOps repository. Note that the name of your repository is generated automatically from the project name at the project creation stage.
You can enter the name manually in the Database folder field or click the ellipsis and choose the name in the window that appears. It is recommended that you create a new folder within your Azure DevOps project. Click Test to verify that the database can be successfully connected to source control. To proceed, click OK. Next, select a [database development model](https://docs.devart.com/source-control/linking-to-source-control/dedicated-model.html) and click Link.

Upon the successful connection of the database to source control, the database gets a source-controlled icon in Object Explorer. You can now start introducing changes to your database, committing them, and keeping track of all modifications within the tool's handy interface. Besides, you can reap the benefits of both tools and automate your database development with the DevOps approach.

Conclusion

In this article, we have briefly explored the importance of Azure DevOps and TFVC, emphasizing the advantages of adopting a version control system from the project's start. We have also provided a step-by-step guide on linking your database to a TFVC repository using the robust SSMS add-in, dbForge Source Control for SQL Server.

In addition, dbForge Source Control extends its support beyond TFVC to a range of popular version control systems, including Apache Subversion (SVN), Git, Mercurial (Hg), Perforce (P4), and SourceGear Vault. The tool can version-control database schemas and static table data, commit and revert changes, view and resolve conflicts, and track the history of changes. [Download](https://www.devart.com/dbforge/sql/source-control/download.html) a trial version of dbForge Source Control as a part of the SQL Tools pack and enjoy its advanced features for free within 30 days.
Creating TFS Custom Check-in Policy

By [ALM Team](https://blog.devart.com/author/alm), April 23, 2014

This article explains the implementation of a TFS custom check-in policy for pre-commit code review. We developed this policy for [Review Assistant](https://www.devart.com/review-assistant/), our code review tool.

Idea Behind the Implementation

A while ago, a user posted an idea on the [Review Assistant forum](https://devart.uservoice.com/forums/145340-review-assistant) on UserVoice:

“I set up build management on a TFS server and added a build check-in policy to TFS, and now every check-in has to have a successful build, otherwise you cannot check in. It’d be great if we could set up a custom policy in TFS so that every check-in needs to pass a code review. So, instead of developers having to shelve their changes manually and assign the shelveset to a code review, if this tool could do that automatically in the check-in process, that’d be super.”

Other users supported this idea, so we decided to implement it. The rest of the article explains what a check-in policy is and how to implement one.

What is TFS Check-in Policy?

A check-in policy enforces constraints every time files are checked into source control. Team Foundation Server provides a number of out-of-the-box check-in policies, including policies that check that static code analysis has been performed and policies that check that work items are associated with check-ins.

How to Add Check-in Policy

To add a check-in policy in TFS, you must have the Edit project-level information permission set to Allow. For more information, see [Team Foundation Server Permissions](https://docs.microsoft.com/en-us/azure/devops/organizations/security/permissions?view=vsts).
Open Team Explorer, left-click the window header, click Settings, and then click Source Control under the Team Project section. The Source Control Settings dialog box appears. Click the Check-in Policy tab and then click Add. The Add Check-in Policy dialog box appears. In the Check-in Policy list, select the policy type you want, and then click OK. When you are satisfied with the settings for the check-in policies, click OK; the new check-in policy will now be applied to future check-ins. For more information, see [Add Check-in Policies](https://docs.microsoft.com/en-us/azure/devops/repos/tfvc/add-check-policies?view=vsts) on MSDN.

How Does Check-in Policy Work?

Before you check in, TFS evaluates the pending changes in your working directory; this is when TFS evaluates the check-in policy for the first time. If the policy evaluation fails, you get a warning. You can override the policy warning if necessary to force the check-in, but you need to specify the reason for the override.

Check-in Policy Implementation Procedure

The implementation procedure for a custom check-in policy is pretty straightforward:

1. Create and build a custom policy class.
2. Register the custom policy in the Windows registry.

Then you can add your policy to a TFS project as described earlier in this article.

Custom Policy Class Skeleton

All you need is to create a class that derives from PolicyBase and place it into a class library. It may look like the following.
```csharp
using System;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace Devart.ReviewAssistant.TeamFoundation
{
    /// <summary>
    /// Custom check-in policy that requires pre-commit code review.
    /// </summary>
    [Serializable]
    public class CodeMustBeReviewedPolicy : PolicyBase
    {
        /// <summary>
        /// Gets the description of this policy.
        /// </summary>
        public override string Description
        {
            get { return "The policy forces users to review code using Review Assistant..."; }
        }

        /// <summary>
        /// This is a string that is stored with the policy definition on the source
        /// control server. If a user does not have the policy plug-in installed,
        /// this string is displayed. You can use this to explain to the user
        /// how they should install the policy plug-in.
        /// </summary>
        public override string InstallationInstructions
        {
            get { return "none"; }
        }

        /// <summary>
        /// This string identifies the type of policy. It is displayed in the
        /// policy list when you add a new policy to a Team Project.
        /// </summary>
        public override string Type
        {
            get { return "Pre-commit code review"; }
        }

        /// <summary>
        /// This string is a description of the type of policy. It is displayed
        /// when you select the policy in the Add Check-in Policy dialog box.
        /// </summary>
        public override string TypeDescription
        {
            get { return "This policy requires that files pass code ..."; }
        }

        /// <summary>
        /// This method is called if the user double-clicks on
        /// a policy failure in the UI.
        /// </summary>
        /// <param name="failure">The policy failure that causes this event.</param>
        public override void Activate(PolicyFailure failure)
        {
            // create a review
        }

        /// <summary>
        /// This method is called if the user presses F1 when a policy failure
        /// is active in the UI.
        /// </summary>
        /// <param name="failure">The policy failure that causes this event.</param>
        public override void DisplayHelp(PolicyFailure failure)
        {
            // open help page
        }

        /// <summary>
        /// This method is called by the policy framework when you create
        /// a new check-in policy or edit an existing check-in policy.
        /// You can use this to display a UI specific to this policy type
        /// allowing the user to change the parameters of the policy.
        /// </summary>
        /// <param name="policyEditArgs">Arguments for the configuration dialog box.</param>
        /// <returns>True if the dialog box opened; otherwise, false.</returns>
        public override bool Edit(IPolicyEditArgs policyEditArgs)
        {
            // Do not need any custom configuration
            return true;
        }

        /// <summary>
        /// This method performs the actual policy evaluation.
        /// It is called by the policy framework at various points in time
        /// when policy should be evaluated.
        /// </summary>
        /// <returns>An array of PolicyFailures that results from the evaluation.</returns>
        public override PolicyFailure[] Evaluate()
        {
            return new PolicyFailure[0]; // never fail
        }
    }
}
```

Registering Custom Policy

Next, you need to add an entry to the Windows registry so that your policy appears in the Add Check-in Policy dialog box. Here is an example of the .reg file that registers the policy.
```
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\11.0\TeamFoundation\SourceControl\Checkin Policies]
"Devart.ReviewAssistant.TeamFoundation"="c:\\Program Files\\Devart\\ReviewAssistant\\Devart.ReviewAssistant.TeamFoundation.dll"
```

Note that you must install the policy assembly on every computer that needs to reference it. Make sure that you provide valid installation instructions for your policy, as this will help fellow developers install it.

Let's dive into the details of our implementation of the check-in policy for code review. We shall discuss:

- The algorithm of our Pre-Commit Code Review policy;
- The main problems of implementing the algorithm;
- The restrictions of the policy implementation.

How Does the Pre-Commit Code Review Policy Work?

Here is the policy algorithm in a nutshell:

1. Once a user invokes the Check In Pending Changes command or simply opens the Pending Changes view in Team Explorer, the policy is evaluated.
2. The policy loads the Review Assistant package and attempts to connect to a review server using previously stored credentials.
3. The policy searches for open reviews for the pending changes. There are three possible outcomes:
   - No review is found: the policy is not satisfied.
   - An open review is found: the policy is still not satisfied.
   - A closed review is found: the policy is satisfied (no warning).
4. A user activates the policy (when it is not satisfied). Here we have two scenarios:
   - There are files that require a code review. A shelveset with the pending changes is created, Code Review Board is opened, and a new review for that shelveset is created. The user specifies a reviewer and publishes the review.
   - The files in the pending check-in are already under review. Code Review Board is opened, and the review that causes the policy to fail is displayed. The user can check the status of the review and push things forward, if necessary.
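Returning to the registration step for a moment: when you target several Visual Studio versions, it can be convenient to generate the .reg file programmatically rather than maintain one copy per version. A minimal sketch, assuming the key layout from the example above (the helper function and sample paths are ours, not part of Review Assistant):

```python
def checkin_policy_reg(vs_version: str, assembly_path: str) -> str:
    """Build the .reg text registering a check-in policy assembly for a
    given Visual Studio version (e.g. "11.0" for Visual Studio 2012)."""
    key = ("HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\VisualStudio\\"
           + vs_version
           + "\\TeamFoundation\\SourceControl\\Checkin Policies")
    escaped_path = assembly_path.replace("\\", "\\\\")  # .reg files double backslashes
    return ("Windows Registry Editor Version 5.00\n"
            "[" + key + "]\n"
            '"Devart.ReviewAssistant.TeamFoundation"="' + escaped_path + '"\n')

reg_text = checkin_policy_reg(
    "11.0",
    "c:\\Program Files\\Devart\\ReviewAssistant\\Devart.ReviewAssistant.TeamFoundation.dll",
)
```

On 64-bit Windows with a 32-bit Visual Studio, the key may need to live under Wow6432Node (or under HKEY_CURRENT_USER), so treat the key path here as a starting point.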
Note: When the policy fails to evaluate correctly, it returns no failure. In other words, if it fails to connect to a review server, the policy is still satisfied.

Tough Point of the Implementation

The main problem of the [pre-commit code review](https://www.devart.com/review-assistant/learnmore/pre-commit-vs-post-commit.html) process lies in creeping changes. Developers constantly change the code, even while a review is in progress. Thus, we faced a dilemma: how to identify which code was reviewed and which wasn't. It is not enough to identify the files that were changed; we want to make sure that the file content remains stable throughout the review. The solution we chose was to calculate an MD5 hash of each changed file during the policy evaluation. Fortunately, the policy evaluation runs in a background thread and doesn't block the Visual Studio UI. These hash values are stored locally on the user's computer. We also maintain a sort of cache for the reviews created by the user, which saves time when searching for reviews for the pending changes.

Policy Restrictions

Finally, we'll cover the restrictions of our Pre-Commit Code Review policy:

- The policy does not filter files during the evaluation. It will fail even if you change only supplementary files that don't need a review. In this case, you should override the policy warning.
- If you add files to your check-in after the review is created, the policy will fail. In this case, you should remove the additional files from the check-in to satisfy the policy.
- Reviews created by the policy are cached for a limited time, but this won't be an issue unless you delay the check-in for a month.

Conclusion

We've covered some details of the implementation of the Pre-Commit Code Review policy in Review Assistant. We hope it helps you in using our product. [Download Review Assistant](https://www.devart.com/review-assistant/download.html) and start reviewing code today.
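The hash-based staleness check described in the "Tough Point" section above can be sketched in a few lines (a simplified illustration, not Review Assistant's actual code; the function names are ours):

```python
import hashlib

def file_hash(content: bytes) -> str:
    """MD5 hex digest of a file's content, as recorded at review creation."""
    return hashlib.md5(content).hexdigest()

def review_is_stale(reviewed_hashes, pending_files):
    """True if any pending file's content no longer matches the hash
    stored when the review was created (i.e. the code crept on)."""
    return any(
        reviewed_hashes.get(path) != file_hash(content)
        for path, content in pending_files.items()
    )

# Hashes recorded when the review was published
reviewed = {"Program.cs": file_hash(b"class Program { }")}

unchanged = review_is_stale(reviewed, {"Program.cs": b"class Program { }"})      # False
edited = review_is_stale(reviewed, {"Program.cs": b"class Program { /*x*/ }"})   # True
```

Comparing content hashes rather than file names is what lets the policy tell "this file was reviewed" apart from "this exact content was reviewed."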
Comments

Unable to see the policy in "Add Check-in Policy" dialog box (October 27, 2015): I have followed all the steps as mentioned, but I am still unable to see the custom policy in the "Add Check-in Policy" dialog box. I am using Visual Studio Ultimate 2012 (version 11.0.61030.00).

Sjoerd van Loon (March 14, 2016): If your OS is 64-bit and your Visual Studio is x86, you should add the key under Wow6432Node, or register it under HKEY_CURRENT_USER instead of HKEY_LOCAL_MACHINE: [https://stackoverflow.com/questions/14571063/tfs-custom-check-in-policy-is-not-in-the-add-box](https://stackoverflow.com/questions/14571063/tfs-custom-check-in-policy-is-not-in-the-add-box)

Cross-Compiling in Lazarus on Linux Using UniDAC as an Example

By [DAC Team](https://blog.devart.com/author/dac), December 30, 2021

The topic of cross-compiling projects in Lazarus on Linux systems has been repeatedly raised by our users. Therefore, in this article, we want to examine the features of cross-compilation in Lazarus in detail and show how to create a simple application using [UniDAC](https://www.devart.com/unidac/) and compile it for the Windows platform in a Linux environment.

Contents:
- Setting up the working environment
- Peculiarities of UniDAC cross-compilation
- Creating a project and setting cross-compilation parameters

Setting up the working environment

First, let's take a quick look at setting up the working environment. To create our project, we used Lazarus 2.0.12 + FPC 3.2.0 installed on a workstation with a 64-bit Ubuntu 18.04 LTS operating system.
After installing Lazarus on a Linux system, you need to perform the initial configuration of the FPC compiler, during which the module packages necessary for compiling a Windows application are generated. To do this, run the following sequence of commands.

Note: The configuration script is executed with root privileges; therefore, at the end of the process, be sure to exit root mode (by closing the terminal window or by pressing Ctrl + D).

```shell
sudo -i
export FPCVER="3.2.0"
cd /usr/share/fpcsrc/"$FPCVER"/
make clean all OS_TARGET=win64 CPU_TARGET=x86_64
make clean all OS_TARGET=win32 CPU_TARGET=i386
make crossinstall OS_TARGET=win64 CPU_TARGET=x86_64 INSTALL_PREFIX=/usr
make crossinstall OS_TARGET=win32 CPU_TARGET=i386 INSTALL_PREFIX=/usr
ln -sf /usr/lib/fpc/"$FPCVER"/ppcrossx64 /usr/bin/ppcrossx64
ln -sf /usr/lib/fpc/"$FPCVER"/ppcross386 /usr/bin/ppcross386
```

The FPCVER variable set at the beginning of the script must match the version of FPC you are using; as noted above, our environment uses FPC 3.2.0. You can check your FPC version by opening the About window in Lazarus.

Peculiarities of UniDAC Cross-Compilation

We will not describe the installation of the UniDAC components in Lazarus in detail here, since the process is straightforward and covered in the documentation: [https://docs.devart.com/unidac/installation_unidac.htm](https://docs.devart.com/unidac/installation_unidac.htm). In the demo project, we used UniDAC Professional Edition with Source Code, version 9.1. In our environment, the UniDAC components are installed in the folder ~/Documents/UniDAC/.

Let us dwell on a few points which, on the one hand, are UniDAC "internals" that should not particularly concern the user, but which, on the other hand, are important for understanding the compilation and cross-compilation of a project with UniDAC.
Firstly, UniDAC has a built-in SQLite engine, which is used to work with SQLite databases in Direct mode, without a client library, and also provides built-in database encryption functionality. Within the UniDAC modules, this engine is represented by precompiled units for various platforms located in the folder ~/Documents/UniDAC/Source/sqlite3/.

Note: Keep in mind the path to this folder. We'll need it later.

When the project is built, the compiler statically links the precompiled unit for the required target platform into the application file. These precompiled SQLite modules use many standard operating system functions (I/O, etc.). They would be taken directly from the operating system libraries if you compiled a Windows application in a Windows environment; but when building a Windows application on Linux, the compiler may need some additional libraries. These libraries for the Windows 32-bit and 64-bit platforms are supplied with UniDAC and located in the folders ~/Documents/UniDAC/Lib/Lazarus1/i386-win32/ and ~/Documents/UniDAC/Lib/Lazarus1/x86_64-win64/, respectively. We will need these folders for the next steps as well.

Creating a Project and Setting Cross-Compilation Parameters

We will not dwell on the application development process with UniDAC itself; instead, we'll focus on setting up the project and preparing it for compilation for the Windows platform. Let's assume that we have a project called project1, which in our case is saved in the folder ~/Documents/test/. First, let's configure the necessary general parameters of the project. To do this, open the project settings (menu Project > Project Options, or use the keyboard shortcut Ctrl + Shift + F11) and go to the section Compiler Options > Paths. In the previous section, we talked about the precompiled SQLite engine modules included in UniDAC and located in the folder ~/Documents/UniDAC/Source/sqlite3/.
The path to the folder with these modules must be added to the Other unit files field. These steps are already enough for the project to be successfully compiled for the Linux platform. Let's verify the build by selecting the Run > Build menu item or pressing the Shift + F9 key combination.

Let's move on to the next stage: setting up the project for cross-compilation. First, create separate build modes for the Windows 32-bit and Windows 64-bit platforms. To do this, open the build modes window by clicking the ellipsis button at the top of the project settings window. In the dialog box that appears, add two modes: Win64 and Win32, for Windows 64-bit and Windows 32-bit, respectively. Make the Win64 mode active and close the dialog.

After that, in the Paths section of the project settings, in the Libraries field, specify the path to the additional Windows libraries mentioned in the previous section. For the Win64 mode, this is the path to the folder ~/Documents/UniDAC/Lib/Lazarus1/x86_64-win64/.

The next step is to set the target platform. In the Config and Target section of the project settings, specify Win64 in the Target OS field and the x86_64 processor family in the Target CPU family field. Pay attention to the warning shown at the bottom of the window: it says that the default LCL framework, gtk2, cannot be used to compile an application for the Windows platform. Go to the Additions and Overrides settings section and select the win32 value in the Set "LCLWidgetType" drop-down list. In the added line LCLWidgetType := win32, set the checkboxes for both build modes, Win64 and Win32, to avoid returning to this step when setting up the project for the 32-bit Windows platform.

This completes the project setup for compilation for the Windows 64-bit platform. Verify the settings and compile the project (menu Run > Build or the keyboard shortcut Shift + F9).
Configure the project for compilation for the Windows 32-bit platform in the same way as described above: set the active build mode to Win32; in the Paths section of the project settings, in the Libraries field, specify the path to the additional Windows libraries (~/Documents/UniDAC/Lib/Lazarus1/i386-win32/); and in the Config and Target section, specify Win32 in the Target OS field and the i386 processor family in the Target CPU family field. There is no need to set up LCLWidgetType, as we did that earlier for both modes. Verify the settings and build.

As a result, we can develop applications using UniDAC in Lazarus for Linux and compile them for both the Linux and Windows platforms in a single development environment, without additional tools. Hopefully, this article answers many of the previously asked questions related to cross-compilation in Lazarus.
Customized Hash Function Python Generator

By [dbForge Team](https://blog.devart.com/author/dbforge), September 30, 2021

In this article, we will show how you can take more control over the data in your database by using a custom hash function Python generator in the dbForge Data Generator tool. We will provide an example Python script, which you can later use as a starting point for creating your own scripts.

Data-driven applications, which operate on diverse sets of data, are appearing and spreading across the economy at tremendous speed; our world has certainly become data-driven itself. In data-driven apps, the behavior of an application is governed by the data it processes, and the input data set can significantly affect the flow of the application. This leads us to the question: how should such apps be tested? It is axiomatic that real production data cannot be used for this purpose. Thus, there is an overwhelming need for precise, interrelated, and meaningful test data.
The other problem is that test data needs to be generated quickly and inserted into databases accurately. There are dozens of data generator solutions on the market today; however, most of them have trouble generating interrelated, realistic-looking data. dbForge Data Generator for SQL Server stands out from the other tools as it delivers an advanced Python generator that allows you to define the generated data using IronPython scripts. What is more, you can use dbForge Data Generator not only for populating databases with high-quality fake data but also for managing data in your production databases. Let's consider an example of using a custom hash function Python generator to create unique IDs based on a combination of columns with different data types.

Worked Example

Suppose we have the actor_sample table in a database. Here is the script for it:

```sql
CREATE TABLE actor_sample (
  actor_id SMALLINT NOT NULL PRIMARY KEY
  ,first_name VARCHAR(45) NOT NULL
  ,last_name VARCHAR(45) NOT NULL
  ,last_update DATETIME NOT NULL DEFAULT (CURRENT_TIMESTAMP)
  ,Unique_ID VARCHAR(255) DEFAULT NULL
);
```

Problem

We want to generate unique IDs into the Unique_ID column. Unlike auto-increment primary keys, the generated IDs need to depend on the data in the table and be unique for different combinations of values in the three columns that contain meaningful data (first_name, last_name, and last_update). And when repopulating the Unique_ID column, it should always generate the same number it generated before if that combination has occurred earlier.

Solution

The task can be easily accomplished with the help of a custom hash function Python generator. Python's hash() is a built-in function that accepts input expressions of arbitrary types and returns the hash values of objects, also called hash codes, digests, or simply hashes. These values are fixed-size integers that identify particular values.
Hashes are very useful, as they allow quick look-up of values in large sets. In simple words, hashing is a method used to turn data into a fixed-length number that may serve as a digital "fingerprint" of that data. Those "fingerprints", or hash values, can then be used for compression, data comparison, cryptology, and so on. Hash values are widely used in data indexing, since they have a fixed length regardless of the size of the values they were generated from; a hash thus occupies minimal space in a database compared to other values of varying lengths.

Python script to be used

```python
import hashlib

# IronPython 2 accepts a str here; Python 3 would require bytes
t = "|%s|%s|%s|" % (first_name, last_name, str(last_update))
t1 = hashlib.sha1(t)
t1.hexdigest()
```

Where:

1. The Python hashlib module is used.
2. Non-string values are converted into strings and concatenated.
3. The SHA-1 hash function is used to calculate an alphanumeric string.

How to use the Python Generator of the dbForge Data Generator tool

dbForge Data Generator for SQL Server comes with an advanced Python Generator that allows generating data using IronPython scripts. To generate data using the Python Generator:

1. Open dbForge Data Generator for SQL Server and click New Data Generation on the ribbon.
2. In the Data Generator Project Properties window that opens, specify the required connection and database. Then click Next.
3. On the Options tab, make the necessary configurations for your Data Generator project. Once done, click Open.
4. In the Data Generator project that opens, select the table or tables you want to generate data for.
5. In the navigation tree, click the column you want to assign the Python Generator to.
6. Select Python from the Generator drop-down list on the right.
7. Enter the required Python script in the field and preview the data to be generated in the grid at the bottom of the window.
8. Click the green arrow to populate the table with the generated data.
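The generator script above runs inside dbForge under IronPython; the same logic can be tried standalone in ordinary Python 3, where the string must first be encoded to bytes before hashing. In this sketch the column values are hypothetical sample data, not rows from any real table:

```python
import hashlib

def unique_id(first_name, last_name, last_update):
    """Deterministic ID: the same column values always hash to the same digest."""
    t = "|%s|%s|%s|" % (first_name, last_name, str(last_update))
    # Python 3's hashlib requires bytes, so encode the concatenated string
    return hashlib.sha1(t.encode("utf-8")).hexdigest()

row_id = unique_id("Penelope", "Guiness", "2021-09-30 12:00:00")
print(row_id)  # a 40-character hexadecimal digest

# Repopulating with the same combination reproduces the same ID,
# which is exactly the requirement stated in the Problem section
assert row_id == unique_id("Penelope", "Guiness", "2021-09-30 12:00:00")
```

Because the digest depends only on the three meaningful columns, regenerating the Unique_ID column is repeatable, while any change to a value produces a different ID.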
As you can see, the script works perfectly, and the unique IDs have been successfully generated.

Potential for use

So, when might you need to generate hash keys like those in our worked example? One reason is to prevent hackers from using sequential auto-incremented keys to take a stroll through your database. Another is to simplify data comparison between two databases: with unique IDs generated by the hash function, you no longer need to compare all the columns in the two tables, only the ones containing these IDs.

Conclusion

In this article, we considered a simple and elegant method of generating unique, deterministic IDs that depend on the data in the table. We provided a sample Python script and a detailed step-by-step guide on how to use it in the dbForge Data Generator tool. Give dbForge Data Generator for SQL Server a try for 30 days absolutely free! [Download](https://www.devart.com/dbforge/sql/data-generator/download.html) the full-featured trial version of the product and test-drive it on your projects.
Customizing HTTP 400 and 404 Errors in ASP.NET Core
By [Max Remskyi](https://blog.devart.com/author/max-remskyi) May 31, 2023

Introduction

To store and retrieve data from the database, we'll use [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/download.html), a high-performance, enhanced data provider for PostgreSQL that is built on top of ADO.NET and can work in both connected and disconnected modes.

Pre-requisites

You'll need the following tools to follow the code examples:

- Visual Studio 2022 Community Edition
- PostgreSQL
- dotConnect for PostgreSQL

You can download PostgreSQL here: [https://www.postgresql.org/download/](https://www.postgresql.org/download/). You can download a trial version of dotConnect for PostgreSQL here: [https://www.devart.com/dotconnect/postgresql/download.html](https://www.devart.com/dotconnect/postgresql/download.html).

What Are We Building Here?

In this article, we'll build a simple application that demonstrates how to customize HTTP 400 and 404 error responses in ASP.NET 6. Here are the steps we'll follow:

- Gain an understanding of middleware in ASP.NET Core
- Create an ASP.NET 6 Core Web API project in Visual Studio 2022
- Add the Devart.Data.PostgreSql NuGet package to the API project
- Write a simple controller and its associated classes
- Create a custom middleware to handle HTTP 400 error responses
- Create another custom middleware to handle HTTP 404 error responses
- Run the application

What Is a Middleware?

In ASP.NET Core, a middleware is a component that sits between a web server and an application's request/response pipeline, processing incoming requests and outgoing responses.
In addition to inspecting and manipulating HTTP requests and responses, middleware components can handle authentication, logging, caching, compression, routing, and more. The following code example shows a custom middleware:

```csharp
public class MyCustomMiddleware
{
    private readonly RequestDelegate _next;

    public MyCustomMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        // Headers must be set before the response starts,
        // i.e., before the rest of the pipeline runs
        context.Response.Headers.Add("X-Custom-Header", "Hello World!");
        await _next(context);
    }
}
```

Create a New ASP.NET 6 Core Web API Project

In this section, we'll create a new ASP.NET 6 Core Web API project in Visual Studio 2022. Follow the steps outlined below:

1. Open Visual Studio 2022.
2. Click Create a new project.
3. Select ASP.NET Core Web API and click Next.
4. Specify the project name and the location where the project should be stored. Optionally, check the Place solution and project in the same directory checkbox. Click Next.
5. In the Additional Information window, select .NET 6.0 (Long-term support) as the project version.
6. Uncheck the Configure for HTTPS and Enable Docker Support options.
7. Since we won't use authentication in this example, set the Authentication type to None.
8. Since we won't use Open API in this example, uncheck the Enable OpenAPI support checkbox.
9. Since we won't use minimal APIs in this example, make sure the Use controllers (uncheck to use minimal APIs) checkbox is checked.
10. Leave the Do not use top-level statements checkbox unchecked.
11. Click Create to finish the process.

We'll use this project throughout the article.

Install NuGet Package(s) into the API Project

In the API project you just created, install the dotConnect for PostgreSQL package.
dotConnect for PostgreSQL is a high-performance data provider for PostgreSQL built on ADO.NET technology that provides a comprehensive solution for building PostgreSQL-based database applications. You can install this package either from the NuGet Package Manager tool inside Visual Studio or from the NuGet Package Manager Console using the following command:

```
PM> Install-Package Devart.Data.PostgreSql
```

Create the Database

You can create a database using the pgAdmin tool. To do so, follow the steps given below:

1. Launch the pgAdmin tool.
2. Expand the Servers section.
3. Select Databases.
4. Right-click and select Create -> Database…
5. Specify the name of the database and leave the other options at their default values.
6. Click Save to complete the process.

Create a database table:

1. Select and expand the database you just created.
2. Select Schemas -> Tables.
3. Right-click on Tables and select Create -> Table…

The table script is given below for your reference:

```sql
CREATE TABLE supplier (
	id serial PRIMARY KEY,
	firstname VARCHAR ( 50 ) NOT NULL,
	lastname VARCHAR ( 50 ) NOT NULL,
	address VARCHAR ( 255 ) NOT NULL
);
```

We'll use this database in the subsequent sections of this article to demonstrate how to customize HTTP 400 and 404 error responses in ASP.NET Core using dotConnect for PostgreSQL.

Add a few records to the supplier table

Now, run the following script to insert a few records into the supplier table:

```sql
INSERT INTO supplier(firstname, lastname, address)
VALUES ('James', 'Payne', 'London, UK');

INSERT INTO supplier(firstname, lastname, address)
VALUES ('Steve', 'Jones', 'Chicago, USA');

INSERT INTO supplier(firstname, lastname, address)
VALUES ('Samuel', 'Anderson', 'Dallas, USA');
```

Figure 1 below illustrates the pgAdmin editor where you can write and execute your scripts:

Figure 1: Records inserted into the supplier table

Create the Model Class

Create a solution folder in the Solution Explorer window and name it Models.
Next, create a .cs file called Supplier.cs with the following code:

```csharp
public class Supplier
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Address { get; set; }
}
```

Create the SupplierRepository Class

The ISupplierRepository interface looks like this:

```csharp
public interface ISupplierRepository
{
    public List<Supplier> GetSuppliers();
}
```

Note that, for the sake of simplicity, there is only one method in the ISupplierRepository interface. The SupplierRepository class implements the GetSuppliers method of the ISupplierRepository interface and encapsulates all database operations:

```csharp
public class SupplierRepository : ISupplierRepository
{
    public List<Supplier> GetSuppliers()
    {
        List<Supplier> suppliers = new List<Supplier>();
        using (PgSqlConnection pgSqlConnection =
            new PgSqlConnection("User Id=postgres;Password=sa123#;" +
                "host=localhost;database=postgres;"))
        {
            using (PgSqlCommand pgSqlCommand = new PgSqlCommand())
            {
                pgSqlCommand.CommandText = "Select * From public.supplier";
                pgSqlCommand.Connection = pgSqlConnection;
                if (pgSqlConnection.State != System.Data.ConnectionState.Open)
                    pgSqlConnection.Open();
                using (PgSqlDataReader pgSqlReader = pgSqlCommand.ExecuteReader())
                {
                    while (pgSqlReader.Read())
                    {
                        Supplier supplier = new Supplier();
                        supplier.Id = int.Parse(pgSqlReader.GetValue(0).ToString());
                        supplier.FirstName = pgSqlReader.GetValue(1).ToString();
                        supplier.LastName = pgSqlReader.GetValue(2).ToString();
                        supplier.Address = pgSqlReader.GetValue(3).ToString();
                        suppliers.Add(supplier);
                    }
                }
            }
        }
        return suppliers;
    }
}
```

Create the SupplierController Class

Next, right-click the Controllers solution folder and create a new controller class called SupplierController with the following code:

```csharp
[Route("api/[controller]")]
[ApiController]
public class SupplierController : ControllerBase
{
    private readonly ISupplierRepository _supplierRepository;

    public SupplierController(ISupplierRepository supplierRepository)
    {
        _supplierRepository = supplierRepository;
    }

    [HttpGet]
    public List<Supplier> Get()
    {
        return _supplierRepository.GetSuppliers();
    }
}
```

Note how an instance of type ISupplierRepository is injected into the constructor of the SupplierController class. Remember that you must register ISupplierRepository with the services container using the following piece of code in the Program.cs file:

```csharp
builder.Services.AddScoped<ISupplierRepository, SupplierRepository>();
```

Executing the Application

When you execute the application and invoke the GetSuppliers endpoint, all records of the supplier table are displayed in your web browser. However, if you specify an endpoint that does not exist, you will get an HTTP 404 error; if the request is malformed, you will get an HTTP 400 error. In the sections that follow, we'll examine how to customize these errors, i.e., intercept the requests and customize the HTTP 400 and 404 responses.

Customizing HTTP 400 Responses

When working with ASP.NET Core, you can customize HTTP 400 and 404 responses to handle errors in your application more gracefully. To do this, you create a middleware component that intercepts the request and returns a custom response:

```csharp
public class CustomBadRequestMiddleware
{
    private readonly RequestDelegate _next;

    public CustomBadRequestMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        await _next(context);
        if (context.Response.StatusCode == 400)
        {
            context.Response.ContentType = "application/json";
            string text = "Bad request...";
            var json = JsonSerializer.Serialize(text);
            await context.Response.WriteAsync(json);
        }
    }
}
```

In the preceding code example, the middleware checks whether the response status code is 400.
If so, the middleware sets the content type of the response to JSON and returns a custom error message. To use this middleware, add it to the request-processing pipeline in the Program.cs file, as shown in the code snippet below:

```csharp
app.UseMiddleware<CustomBadRequestMiddleware>();
```

Customizing HTTP 404 Responses

Similarly, to customize an HTTP 404 error, you can take advantage of another middleware component that intercepts the request and returns a custom response, as shown below:

```csharp
public class CustomNotFoundMiddleware
{
    private readonly RequestDelegate _next;

    public CustomNotFoundMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        await _next(context);
        if (context.Response.StatusCode == 404)
        {
            context.Response.ContentType = "application/json";
            string text = "The requested resource is not found...";
            var json = JsonSerializer.Serialize(text);
            await context.Response.WriteAsync(json);
        }
    }
}
```

As in the previous example, the custom middleware checks whether the response status code is 404. If so, the middleware component intercepts the request and sends a custom text message as the response. To add this middleware to the pipeline, use the following code in the Program.cs file of your project:

```csharp
app.UseMiddleware<CustomNotFoundMiddleware>();
```

License Key Validation Error

When you execute the application, you might run into license validation errors if no valid license key is available. If license key validation fails, you will encounter a Devart.Common.LicenseException. To resolve this error, you must either already be a licensed user with a license key or run the installation file, which installs a trial key into the system.

Summary

You can provide customized, more meaningful messages by customizing the HTTP 400 and 404 error responses in your application. Note that we've used two different custom middleware classes to handle the HTTP 400 and 404 error responses in this example.
You could handle both in a single custom middleware class, but handling multiple HTTP error responses in one class not only breaks the single-responsibility principle but also makes your code difficult to manage and maintain over time.
Devart Supports 64-bit Android App Development in DAC for Delphi
By [DAC Team](https://blog.devart.com/author/dac) November 27, 2019

Devart, the leading provider of database management software and developer tools, announced the latest version of its data access components for Delphi. The key feature in this update is support for 64-bit Android app development, following the release of RAD Studio 10.3.3, which includes a new Delphi compiler for the ARM platform and allows developers to build 64-bit applications from a single Delphi codebase. It should be noted that as of August 1, 2019, apps published on Google Play need to support Android devices with 64-bit architectures; all publishers are required to provide 64-bit versions in addition to 32-bit versions.

The release also adds support for new RDBMS versions: PostgreSQL 12 in PgDAC, Oracle 19c in ODAC, and InterBase 2020 in IBDAC. The six-field limitation in the trial version of the data access components for macOS and Linux has been removed, so you can now run test code against a large database to make sure everything works as intended. All DAC products now support Lazarus 2.0.6, the latest version of the free IDE for rapid application development with the Free Pascal compiler. The MySQL and PostgreSQL data providers now support OpenSSL 1.1, a library that implements the TLS and SSL protocols to secure communications between a client and a server. TLS 1.2 is now supported in Direct mode in SDAC; it is a security protocol that provides privacy and data integrity for client-server applications that exchange data over a network, fixes some issues in TLS 1.1, and is recommended for organizations that consider cyber security critical to their operations.
Check the revision history page of DAC products for a complete list of updates and improvements in this release. You are welcome to download a trial version of Delphi Data Access components to give it a test run. [UniDAC 8.1](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ] [ODAC 11.1](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ] [SDAC 9.1](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ] [MyDAC 10.1](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ] [IBDAC 7.1](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ] [PgDAC 6.1](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ] [LiteDAC 4.1](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ] [VirtualDAC 11.1](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ] You are welcome to join our [forum](https://forums.devart.com/viewforum.php?f=42) to discuss database application development. 
Dapper vs.
Entity Framework Core: Which One Is Right for Your .NET Project?
By [Dereck Mushingairi](https://blog.devart.com/author/dereckm) March 28, 2025

"Dapper is pure speed—EF Core is bloated," or "Dapper is a nightmare—EF Core keeps things scalable." We've all heard it. Both sides make a point, but neither tells the whole story. The real question? Which one will save you from a world of pain six months from now? Dapper gives you raw SQL execution and total control, but you're on your own when managing transactions, relationships, and migrations. EF Core, on the other hand, automates the heavy lifting, but that abstraction comes at a cost—query overhead, unexpected performance hits, and sometimes a fight with LINQ-generated SQL. So, how do you make the right call? And, more importantly, how do you avoid regrets when your app scales? While [dotConnect](https://www.devart.com/dotconnect/) enhances both Dapper and EF Core by optimizing query execution and simplifying database workflows, understanding the core differences between these ORMs remains crucial for choosing the right approach. Let's dive in!

Table of contents

- What is Dapper?
- What is Entity Framework Core?
- Pros and cons of EF Core
- Performance comparison: Dapper vs. EF Core
- How to improve workflow efficiency with dotConnect
- Conclusion

What is Dapper?

[Dapper](https://www.devart.com/dotconnect/what-is-dapper-orm.html) is a micro-ORM for .NET, originally developed by Stack Overflow to handle high-performance database queries with minimal overhead. It executes raw SQL directly and maps the results to .NET objects, giving developers complete control over query execution—unlike Entity Framework Core, which abstracts database interactions through LINQ and automatic change tracking.
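Dapper itself is .NET-only, but the micro-ORM pattern it embodies (hand-written SQL whose results are mapped straight onto typed objects) is easy to sketch with Python's standard library. The table, fields, and values below are purely illustrative:

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Product:
    id: int
    name: str
    price: float

def query(conn, sql, params=()):
    """Micro-ORM style: run raw SQL, map each row onto a typed object."""
    cur = conn.execute(sql, params)
    return [Product(*row) for row in cur.fetchall()]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.executemany("INSERT INTO product VALUES (?, ?, ?)",
                 [(1, "Widget", 9.99), (2, "Gadget", 19.99)])

cheap = query(conn, "SELECT id, name, price FROM product WHERE price < ?", (15,))
print(cheap)  # [Product(id=1, name='Widget', price=9.99)]
```

The developer keeps full control of the SQL, which is exactly the trade-off described above: maximum transparency and speed, at the cost of writing and maintaining the queries by hand.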
When to use Dapper

Dapper is best suited for:

- High-performance applications – critical systems like financial transactions and real-time APIs, where slight query delays matter.
- Read-heavy workloads – dashboards and analytics tools that retrieve large datasets quickly.
- Microservices & scalable APIs – lightweight database calls with minimal processing overhead.
- Custom SQL-heavy applications – projects requiring specific indexing strategies, execution plans, or stored procedures that EF Core may not support efficiently.

Dapper is not a full ORM—it won't manage migrations, schema changes, or entity state tracking. But if your project demands uncompromised performance and complete control over SQL execution, Dapper is one of the fastest ways to interact with relational databases in .NET.

Pros and cons of Dapper

Pros:

| Feature | Benefit |
| --- | --- |
| Minimal overhead, maximum speed | Dapper processes raw SQL directly, avoiding ORM-generated query bloat for faster execution. |
| Full SQL control | Developers have complete control over queries, indexing, query hints, and execution plans, optimizing performance for high-traffic apps. |
| Broad database support | Works natively with SQL Server, PostgreSQL, MySQL, SQLite, and Oracle, making it easy to migrate databases or support multi-cloud architectures. |
| Lightweight and simple | Requires just a single NuGet package and minimal setup, reducing ORM-related complexity. |

Cons:

| Limitation | Challenge & workarounds |
| --- | --- |
| No built-in change tracking | Developers must handle updates and state management manually, but the Unit of Work pattern or third-party tools can help. |
| More manual SQL management | Writing raw SQL gives control but requires ongoing maintenance as schemas and queries evolve. |
| Scaling complex applications | Large data models require manual handling of joins, transactions, and caching, so hybrid ORM strategies may be needed. |

What Is Entity Framework Core?
[Entity Framework Core (EF Core)](https://www.devart.com/dotconnect/what-is-entity-framework-core.html) is Microsoft's ORM for .NET, which simplifies database interactions by using LINQ instead of raw SQL. It automates relationship management, schema migrations, and entity tracking, making it ideal for enterprise applications that prioritize maintainability and scalability. While EF Core reduces the need for manual SQL, improper DbContext management can lead to performance issues; here's [a guide](https://blog.devart.com/best-practices-in-using-the-dbcontext-in-ef-core.html) on best practices to keep it efficient.

Comparing Entity Framework vs. Entity Framework Core

Think of Entity Framework Core (EF Core) as the leaner, faster, and more flexible upgrade to the original Entity Framework. While both help .NET applications interact with databases, EF Core is optimized for modern development—it's cross-platform, performs better, and even supports cloud and NoSQL databases. However, choosing the right ORM depends on your database needs. If you're working with PostgreSQL in .NET, explore this [guide to the best ORM](https://blog.devart.com/best-postgresql-orm-in-dotnet.html) options.

When to use EF Core

EF Core is the preferred choice for applications that require advanced data modeling, automatic schema evolution, and structured workflows. Use it for:

- Enterprise-level applications – large-scale systems where data integrity, consistency, and relationship management are key.
- Rapid development & maintainability – teams prioritizing code readability, migrations, and collaboration-friendly ORM workflows.
- Complex data models & transactions – applications that involve multi-table relationships, cascading deletes, and structured data mapping. Check out [this article](https://blog.devart.com/how-to-work-with-inheritance-in-entity-framework-core.html) for a deeper look at handling inheritance strategies in EF Core.
- Microsoft ecosystem integration – projects built on ASP.NET Core, Azure, and other .NET services for smooth compatibility and efficient development.

While EF Core simplifies data access, its ORM abstraction adds overhead. Dapper may be the better choice if an application demands maximum performance and raw SQL control.

Pros and cons of EF Core

Pros:

| Feature | Benefit |
| --- | --- |
| Simplified database management | Automates query execution, schema migrations, and relationship handling, reducing manual SQL work. |
| Change tracking & migrations | Automates entity state tracking and schema updates, reducing development effort and errors. |
| Smooth Microsoft integration | Works natively with ASP.NET Core, Azure, and SQL Server, simplifying cloud and enterprise deployments. |

Cons:

| Limitation | Challenge & workarounds |
| --- | --- |
| Slower than Dapper | EF Core's abstraction adds query execution overhead, making it less suitable for high-performance applications. |
| Complex generated queries | LINQ-generated queries can be inefficient, sometimes requiring custom SQL tuning for optimization. |
| Limited raw SQL control | While EF Core supports raw SQL execution, it isn't optimized for manual query fine-tuning like Dapper. |

Performance comparison: Dapper vs. EF Core

Performance is key when choosing between [Dapper](https://www.devart.com/dotconnect/what-is-dapper-orm.html) and [Entity Framework Core (EF Core)](https://www.devart.com/dotconnect/what-is-entity-framework-core.html). Each tool approaches database interactions differently—Dapper focuses on raw SQL execution with minimal overhead, while EF Core provides a high-level abstraction with built-in data management features. The right choice depends on the trade-off between control and convenience. Below is a side-by-side comparison based on benchmark tests and real-world performance metrics.

EF Core vs.
Dapper: Feature-by-feature breakdown Feature Dapper  (Micro-ORM) Entity Framework Core  (Full-ORM) Performance Faster execution due to raw SQL queries (ideal for high-performance apps) Slower due to query translation, change tracking, and abstraction Ease of use Requires writing SQL manually (more control but increases code complexity) Uses LINQ, making queries simpler and more maintainable Query execution Direct SQL execution, highly optimized for read-heavy operations EF Core abstracts SQL execution but may require manual tuning for some LINQ-generated queries. Database operations Ideal for CRUD operations where high performance is needed Best for complex data models with relationships Change tracking No built-in change tracking (requires manual implementation) Automatic change tracking Migrations & schema management No built-in support (handled manually) Built-in support with migrations Transaction support Supported, but requires manual handling Supported with built-in transaction management Bulk operations Must be manually optimized (e.g., using SqlBulkCopy for SQL Server) Built-in support for batch inserts & updates Flexibility Highly flexible—developers have full SQL control Less flexible—SQL abstraction can make custom queries harder Complex queries Best for custom SQL queries, including stored procedures LINQ-generated queries can be hard to optimize manually Learning curve Fast for SQL-experienced developers Easier for .NET developers unfamiliar with SQL Debugging & optimization Easier to debug (queries are explicit) Harder to optimize due to hidden SQL generation Best use cases – High-performance applications (e.g., reporting dashboards, analytics, microservices) – Simple CRUD operations with raw SQL – Read-heavy workloads with minimal ORM overhead – Enterprise applications with complex relationships – Large projects requiring structured data access & migrations – Teams preferring ORM abstraction over SQL optimization Compatible databases Supports SQL 
Server, PostgreSQL, MySQL, SQLite, and more Supports SQL Server, PostgreSQL, MySQL, SQLite, Oracle, IBM Db2, and more Integration with dotConnect Optimized connectivity for .NET applications using direct SQL Supports optimized EF Core database access & LINQ performance improvements Query execution speed Dapper: Its minimal abstraction and direct SQL execution allow it to process queries up to 5–10x faster than EF Core. By bypassing ORM layers like entity tracking and LINQ translation, it provides low-latency performance, making it the top choice for real-time analytics, dashboards, and performance-critical applications. EF Core: While optimized for maintainability and complex data interactions, introduces query overhead due to automatic change tracking, LINQ processing, and dynamic SQL generation. This added abstraction makes it less suited for high-frequency, low-latency database queries where raw SQL control is essential. Database read & write operations Dapper: Excels in large-scale read operations, such as analytics dashboards, reporting systems, and real-time data retrieval, where fast query execution and minimal overhead are critical. EF Core: Is best suited for transactional applications requiring data integrity, automatic relationship management, and structured workflows. It is particularly effective for enterprise applications like ERP systems, where complex data modeling and relational integrity are key priorities. Bulk insert & data modifications Dapper: Bulk operations require manual implementation since Dapper does not provide built-in support for bulk inserts, updates, or deletes. This gives developers more control over performance optimizations but also requires writing additional SQL code. EF Core: This includes built-in support for batch inserts, updates, and deletes through native ORM tools and community extensions. However, these features introduce additional processing time due to entity tracking and state management. 
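To make the trade-off concrete, here is a brief C# sketch of the same read-only query written with each tool. It is illustrative only: the `Product` entity, `AppDbContext`, and connection setup are hypothetical, and the snippets assume the Dapper and Microsoft.EntityFrameworkCore packages are referenced.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Threading.Tasks;
using Dapper;                            // micro-ORM: maps rows straight to objects
using Microsoft.EntityFrameworkCore;     // full ORM: LINQ-to-SQL translation

public static class ProductQueries
{
    // Dapper: you write the SQL yourself; rows are mapped directly to
    // Product objects with no change tracking or SQL generation in between.
    public static IEnumerable<Product> GetCheapProducts(IDbConnection connection) =>
        connection.Query<Product>(
            "SELECT Id, Name, Price FROM Products WHERE Price < @Max",
            new { Max = 100m });

    // EF Core: you write LINQ; the provider translates it to SQL. AsNoTracking()
    // skips change tracking, narrowing the performance gap for read-only queries.
    public static Task<List<Product>> GetCheapProductsAsync(AppDbContext db) =>
        db.Products
          .Where(p => p.Price < 100m)
          .AsNoTracking()
          .ToListAsync();
}
```

Dapper executes exactly the SQL shown; EF Core generates equivalent SQL from the LINQ expression, which is what the table above calls hidden SQL generation.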
To maximize performance, developers often mix raw SQL queries into EF Core to handle heavy bulk operations more efficiently.

How to improve workflow efficiency with dotConnect

For applications handling high query volumes, optimizing database connectivity is crucial. dotConnect enhances ADO.NET performance for both Dapper and EF Core by improving connection pooling, reducing redundant queries, and optimizing SQL execution. This delivers faster queries, improved resource management, and optimized data access.

Overview of dotConnect for .NET

dotConnect is a high-performance ADO.NET provider that optimizes database interactions beyond default .NET connections. It enhances both EF Core's ORM capabilities and Dapper's raw SQL performance with:

- Intelligent connection pooling to minimize redundant database calls
- Optimized SQL execution for lower query latency
- Reliable multi-database support for [Oracle](https://www.devart.com/dotconnect/oracle/ef-core-oracle-using.html), [MySQL](https://www.devart.com/dotconnect/mysql/ef-core-mysql-using.html), [PostgreSQL](https://www.devart.com/dotconnect/postgresql/ef-core-postgresql-using.html), [SQLite](https://www.devart.com/dotconnect/sqlite/ef-core-sqlite-using.html), and more

Whether you're using EF Core's high-level abstraction or Dapper's fine-tuned SQL control, dotConnect acts as a performance accelerator, reducing query overhead while keeping database interactions efficient.

dotConnect and Entity Framework

For developers working with EF Core, dotConnect fine-tunes query execution, ensuring that database interactions remain fast and scalable. Key optimizations include:

- Reducing query latency by optimizing SQL command execution
- Minimizing redundant database calls to improve overall throughput
- Enhancing migrations and relationship handling for complex data models

Because dotConnect supports multiple database platforms, teams can future-proof applications and reduce vendor lock-in without sacrificing performance.
Check out the [official documentation here](https://www.devart.com/dotconnect/entityframework.html) to see how dotConnect enhances EF Core's ORM capabilities.

How dotConnect helps developers

Many projects benefit from a hybrid Dapper and Entity Framework approach, which uses EF Core for ORM tasks and Dapper for performance-critical queries. Here's how dotConnect supports this:

- Faster query execution – Optimized SQL handling cuts query latency, ensuring faster response times in high-traffic applications.
- Built-in profiling & monitoring – Identifies slow queries, optimizes execution plans, and prevents bottlenecks with integrated profiling tools.
- Unified ORM & SQL performance – Balances EF Core's abstraction and Dapper's direct execution, making it ideal for complex .NET architectures blending microservices, legacy systems, and modern data models.
- Design ORM models visually – With [Entity Developer](https://www.devart.com/entitydeveloper/), you can build and customize ORM models through an intuitive visual designer, streamlining development and improving maintainability.

Whether you choose Dapper or EF Core, dotConnect acts as a unifying layer that ensures consistent performance across your data-access strategies. This flexibility is especially beneficial for enterprise projects that blend microservices, legacy data systems, and modern .NET application architectures.

Conclusion

Choosing between Dapper and Entity Framework Core (EF Core) isn't about declaring a winner; it's about selecting the right tool for your specific needs. Dapper delivers raw SQL power and speed, making it ideal for performance-critical applications that demand efficiency. EF Core, with its structured ORM approach, is the best choice for enterprise systems where maintainability and scalability matter. For those looking to improve performance further, dotConnect optimizes both, enhancing Dapper's efficiency and fine-tuning EF Core's ORM workflows.
[Download dotConnect](https://www.devart.com/dotconnect/) to optimize your database interactions today!

FAQ

Which one is better, Entity Framework or Dapper?

It depends on your needs. Entity Framework Core (EF Core) is better for applications requiring structured ORM workflows, automatic relationship management, and long-term maintainability. Dapper is the better choice for performance-critical applications where raw SQL control and execution speed matter most.

What are the disadvantages of Dapper?

While Dapper is incredibly fast, it comes with trade-offs:

- No built-in change tracking – developers must manually handle entity state.
- No automatic migrations – schema updates require manual SQL scripts.
- More SQL maintenance – writing raw queries means ongoing upkeep as schemas evolve.

EF Core may be a better fit for applications that require schema evolution, automated relationships, and ease of maintenance.

Is Entity Framework still relevant?

Absolutely. Entity Framework Core remains a top choice for enterprise applications and projects where structured ORM workflows simplify development. While raw SQL tools like Dapper offer greater control and speed, EF Core provides a high-level abstraction that makes managing complex data models easier, especially in large-scale .NET applications.

Tags [Dapper](https://blog.devart.com/tag/dapper) [ef core](https://blog.devart.com/tag/ef-core) [entity framework](https://blog.devart.com/tag/entity-framework) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [orm](https://blog.devart.com/tag/orm)

By [Dereck Mushingairi](https://blog.devart.com/author/dereckm): I’m a technical content writer who loves turning complex topics—think SQL, connectors, and backend chaos—into content that actually makes sense (and maybe even makes you smile). I write for devs, data folks, and curious minds who want less fluff and more clarity.
When I’m not wrangling words, you’ll find me dancing salsa, or hopping between cities.
Dare Compare Your SQL Data Much Faster! By [dbForge Team](https://blog.devart.com/author/dbforge) May 11, 2010

We compared some best-of-breed [SQL data comparison tools](https://www.devart.com/dbforge/sql/datacompare/) with dbForge Data Compare for SQL Server to find out which one is faster at comparing and synchronizing data in SQL Server databases. The participant tools were tested with default settings on live databases on SQL Server 2008, which was installed with default settings on a desktop with a dual-core processor and 2 GB of RAM. Both the competitors' tools and the [database tools](https://www.devart.com/dbforge/) by Devart were installed on the same desktop.

Live databases: two databases with a total size of 1 GB and 115 tables, one table containing 3 million records of numeric and string data types, another containing medium-sized BLOB data.

Task №1: We measured the speed of each tool while comparing the data, generating an update script, and finally synchronizing the databases.

Data Comparison/Synchronization Performance

Result №1: The test results show that dbForge Data Compare for SQL Server v2.00 has left the competitors far behind.
Task №2: We decided to compare the performance of the selected participant tools against that of the main industry-leading competitor. Besides, we took the product price into account and tried to calculate efficiency as the best performance for the least money.

Performance/Price

Result №2: This graph shows that not all the tools can be proud of delivering the expected combination of best performance and best price. dbForge Data Compare handles this task well.

Check Shot

To fully complete our research, we decided to go the whole hog and use the aforementioned tools to compare large databases of 120 GB (some tables in these databases have 2.4 billion records, some contain 1.5 GB of LOB data, and the FILESTREAM table holds 5.3 GB of records). dbForge Data Compare handled the comparison well, while none of the competing tools could compare tables with LOB data or records in the FILESTREAM table. When we excluded such tables, only one tool coped with 2.4 billion records. The speed of dbForge Data Compare is several times higher than that of other popular competing tools.

[Try dbForge Data Compare](https://www.devart.com/dbforge/sql/datacompare/download.html) on your SQL Server database. Delivering quality data comparison and synchronization is what we are known for. We guarantee [quick support](https://www.devart.com/dbforge/sql/datacompare/support.html) and product improvement if dbForge Data Compare fails to compare and synchronize your database. We may even give you the product license for free. Moreover, you can take full advantage of the data comparison and synchronization functionality in [dbForge Compare Bundle](https://www.devart.com/dbforge/sql/compare-bundle/), which includes both dbForge Data Compare for SQL Server and dbForge Schema Compare for SQL Server and allows you to save money compared to buying the tools separately.
Tags [data compare](https://blog.devart.com/tag/data-compare) [performance](https://blog.devart.com/tag/performance) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)
Data Access Components are now available for macOS 64-bit By [DAC Team](https://blog.devart.com/author/dac) July 23, 2019

Great news for all macOS users: the team behind Delphi DAC has added support for the 64-bit macOS operating system to all Data Access Components. You are no longer limited to 32-bit applications: from now on, you can build 32-bit or 64-bit versions of database applications for macOS, depending on your specific needs. This release will be of particular interest to RAD Studio users who wish to upgrade their software to the latest version 10.3.2 of RAD Studio, which comes with macOS 64-bit application support for Delphi. The new versions of DACs have been fully tested for compatibility with RAD Studio 10.3.2 on macOS 64-bit. Please note that this DAC release requires Release 2 of Embarcadero RAD Studio 10.3. A full list of updates in this release is available on the [DAC](https://www.devart.com/dac.html) page.
You are welcome to visit the DAC product pages to download and try the new versions of Delphi Data Access Components for free:

[UniDAC 8.0](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
[ODAC 11.0](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
[SDAC 9.0](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
[MyDAC 10.0](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
[IBDAC 7.0](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
[PgDAC 6.0](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
[LiteDAC 4.0](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
[VirtualDAC 11.0](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]

Your feedback is highly appreciated: you can reach out to us via any of the means listed on the Support page of DAC products. If you'd like to discuss anything related to database application development, you may also join our [forum](https://forums.devart.com/viewforum.php?f=42).
Tags [delphi](https://blog.devart.com/tag/delphi) [what's new delphi dac](https://blog.devart.com/tag/whats-new-delphi-dac) [DAC Team](https://blog.devart.com/author/dac)
Data Comparison Methods and Techniques By [dbForge Team](https://blog.devart.com/author/dbforge) March 9, 2010

Data comparison is a difficult and resource-intensive process. In this article, we are going to explore the algorithms, advantages, and disadvantages of specific data comparison methods, and describe how to compare data between two MySQL databases with dbForge Studio for MySQL. For convenience, this process can be divided into several steps. First, you compare tables from a database on one server with those from a database on another server. You choose the columns for data comparison, and also a column that will serve as the comparison key. The next step is to select all the data or specific parts of it from these tables. The third and most important step is the comparison of the two tables by the selected comparison key. During this process, the status of each record is set to "only in source", "only in target", "different", or "equal". The final steps of the data comparison process are including records into the synchronization and then the synchronization itself. During these steps, the records needed for synchronization are chosen, an update script is created, and then the script is executed. You can read a detailed description of the comparison process [here](https://blog.devart.com/how-to-automatically-synchronize-schema-changes-in-two-sql-server-databases-on-a-schedule.html) or [here](https://blog.devart.com/how-to-compare-multiple-databases-from-the-command-line.html). Now, let's look at the third step (data comparison) thoroughly.
There are several ways to compare data, and they differ only by the side on which the comparison is performed: either on the server or on the client PC.

Server-side data comparison is performed using the resources of the server.

The comparison algorithm is the following:
1. For each record of each of the two tables, a checksum is calculated.
2. The checksum of every record from one table is compared to the checksum of the corresponding record from the other table, and a conclusion is made about whether the records are equal or different.
3. The comparison result is stored in a temporary table on the server.

Performance indicators:
1. The speed of data comparison directly depends on the server capacity and occupancy.
2. The maximum size of the database for comparison is limited by the resources of the server itself.

Advantages:
1. There is no need to transfer large amounts of data to the client PC through the network, which saves network traffic.
2. The speed of comparison does not depend on the client PC resources.
3. The ability to compare BLOB data of any size.

Disadvantages:
1. Because of the record checksum algorithm, different data can in some cases produce equal checksums, so the "equal" status is reported instead of the expected "different" status.
2. There is no flexibility in using the synchronization and comparison options.
3. There is no possibility to view record differences and manually exclude specific records from the synchronization.
4. During the synchronization script creation, data still has to be transferred from the server to the client side.
5. Calculating checksums for a large number of records heavily loads the server.
6. Extra space must be provided on the server for the comparison results to be stored in a temporary table.

As we can see, this method has more disadvantages than advantages, which is why it is rarely used.
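The server-side scheme can be sketched in Python (an illustrative sketch, not any vendor's implementation; real tools compute the checksums in SQL on the server). Note how the checksum collision risk described above corresponds to two different rows hashing to the same value:

```python
import hashlib

def row_checksum(row):
    """Checksum over all compared columns. Collisions are possible,
    which is the false-'equal' risk noted in disadvantage 1."""
    joined = "\x1f".join("" if v is None else str(v) for v in row.values())
    return hashlib.md5(joined.encode()).hexdigest()

def compare_by_checksum(source, target, key):
    """source/target: lists of dict rows; key: the comparison-key column."""
    src = {r[key]: row_checksum(r) for r in source}
    tgt = {r[key]: row_checksum(r) for r in target}
    statuses = {}
    for k in sorted(src.keys() | tgt.keys()):
        if k not in tgt:
            statuses[k] = "only in source"
        elif k not in src:
            statuses[k] = "only in target"
        else:
            statuses[k] = "equal" if src[k] == tgt[k] else "different"
    return statuses

source = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
target = [{"id": 2, "name": "Rob"}, {"id": 3, "name": "Cat"}]
print(compare_by_checksum(source, target, "id"))
# {1: 'only in source', 2: 'different', 3: 'only in target'}
```

On a real server this step would run as a single SQL query, with the result set landing in the temporary table mentioned above.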
Data comparison on the client PC is performed using client machine resources; the server only provides the data for comparison. In turn, this comparison method can be divided further into several variants depending on how the comparison information is stored.

Comparing data on a local PC when the comparison result is stored in RAM

The comparison algorithm is the following:
1. The server passes all data from both tables to the local PC.
2. Every record of every table is loaded into RAM and compared without checksum calculation.
3. If a record gets the "only in source", "only in target", or "equal" status, only its comparison key is stored in RAM. Records with the "different" status are stored in RAM completely.

Performance indicators:
1. The speed of data comparison directly depends on the client PC resources and on the speed of data transfer through the network.
2. The maximum size of the database for comparison depends on the amount of RAM on the client PC and on how different the compared databases are: the smaller the number of different records, the larger the databases that can be compared.

Advantages:
1. Minimal server occupancy: the server only performs a simple data selection.
2. The simplest data comparison algorithm (records are sorted on the client side).
3. Flexibility in using the comparison options.
4. Minimal size of the comparison data store.
5. The status of every record is always correct for any data.

Disadvantages:
1. To view records with the "only in source", "only in target", or "equal" status, an extra data selection is needed.
2. An extra data selection is needed to create a synchronization script.
3. An OutOfMemory exception may be thrown when there are many data differences between the databases.
4. Only BLOB data that fits into free RAM can be compared.

In older versions of our data comparison tools, this comparison method was used.
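The core of the client-side methods is classifying records by comparison key. A minimal Python sketch (hypothetical, simplified) over two row streams sorted by key: this streaming merge is what lets the full tables stay out of memory, since only the current record of each stream is held at a time:

```python
def classify(source_iter, target_iter, key):
    """Merge two row streams sorted by the comparison key,
    yielding (key, status) pairs one record at a time."""
    sentinel = object()

    def nxt(it):
        return next(it, sentinel)

    s, t = nxt(source_iter), nxt(target_iter)
    while s is not sentinel or t is not sentinel:
        if t is sentinel or (s is not sentinel and s[key] < t[key]):
            yield s[key], "only in source"
            s = nxt(source_iter)
        elif s is sentinel or t[key] < s[key]:
            yield t[key], "only in target"
            t = nxt(target_iter)
        else:
            # Same key on both sides: compare the full records.
            yield s[key], "equal" if s == t else "different"
            s, t = nxt(source_iter), nxt(target_iter)

source = iter([{"id": 1, "n": "Ann"}, {"id": 2, "n": "Bob"}])
target = iter([{"id": 2, "n": "Rob"}, {"id": 3, "n": "Cat"}])
print(list(classify(source, target, "id")))
# [(1, 'only in source'), (2, 'different'), (3, 'only in target')]
```

In the RAM variant above, the results of this classification are kept in memory (full rows only for "different" records); the disk-cached variant described next writes them to a file instead.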
However, after evaluating its pros and cons, we decided to move to a different algorithm in newer versions.

Comparing data on a local PC when the comparison result is stored as a cached file on disk

The comparison algorithm is the following: the server passes all data from both tables, sorted by the comparison key, to the local PC. This data is then read byte by byte, compared without checksum calculations, and written to a file on disk.

Performance indicators:
1. The speed of data comparison directly depends on the client PC resources and on the speed of data transfer through the network.
2. The maximum size of a database to compare is limited by free disk space and does not depend on the degree of data difference between the databases.

Advantages:
1. Medium server occupancy: the server performs data sorting and selection.
2. No extra requests to the server are needed to view records or create a synchronization script.
3. The status of every record is always correct for any data.
4. BLOB data up to the amount of free disk space can be compared.

Disadvantages:
1. A complex data comparison algorithm for records whose comparison key is of a string data type.
2. A complex disk caching algorithm for storing temporary information.

We can see that in this case, the only real disadvantage is the implementation difficulty.

How to compare data from two different databases

dbForge Studio for MySQL includes [Data Compare](https://www.devart.com/dbforge/mysql/studio/database-synchronization.html), a tool that allows you to compare and analyze table data in MySQL databases. Besides, it allows synchronizing data differences to get identical databases. To start comparing data, on the Comparison menu, click New Data Comparison. In the wizard that opens, select the source and target databases you want to compare. On the Options tab, you can customize the default comparison with additional options.
On the Mapping tab, the Data Compare tool automatically selects columns with a primary or unique key as comparison keys. If needed, you can set comparison keys manually; however, in such cases key repetition may occur, which can cause conflicting records after comparison. Such records are not compared. Once done, click Compare.

After you have compared the data, you can view the comparison result in a Data Comparison document displaying top and bottom grids. In the top grid, you can view the number of rows that are different, identical, existing only in Target, or only in Source for each compared table or view. In the bottom grid, you can view the corresponding data records of the database object selected in the top grid. For example, if a comparison key for a compared pair of records is found in the source database but is absent in the target one, the Data Compare tool considers such records Only in Source. If the key is found only in the target database, the records get the Only in Target status. If a comparison key is found in both databases, the tool compares the columns without primary or unique keys and, based on the result, considers the records either Identical or Different. You can review the data differences in every compared table and decide which records should be synchronized and which of them should be excluded from the synchronization.

After comparing data, you can synchronize it to the target database. To start synchronizing, click Synchronize data to the target database. In the Data Synchronization wizard that opens, you can generate a synchronization script to review how the target database will be changed after synchronization. For example, you can open the script in the editor, save it for later use, or immediately execute it against the target database. For records with the Different status (which have comparison keys in both the source and target databases, but different data), the tool generates the script with UPDATE statements. For records with the Only in Source status (which have a comparison key only in the source database), it generates INSERT statements. For the Only in Target records, the Data Compare tool generates the script with DELETE statements. Identical records and ones with repeated comparison keys are not included in the script.

Conclusion

In this article, we reviewed three different data comparison algorithms:
- Server-side data comparison using the server's resources
- Local data comparison with comparison results stored in RAM
- Local data comparison with comparison results stored as a cached file on disk

All these methods have different strengths and weaknesses, but we can still assess their usefulness by comparing their respective advantage/disadvantage ratios. As you can see, the last algorithm has the greatest ratio. That's why we implemented it in all the latest versions of our comparison tools:
[dbForge Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/)
[dbForge Data Compare for MySQL](https://www.devart.com/dbforge/mysql/datacompare/)
[dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/)
[dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/)

In addition, we demonstrated how to compare data and detect differences between two MySQL databases with [dbForge Data Compare for MySQL](https://www.devart.com/dbforge/mysql/datacompare/) built into dbForge Studio for MySQL.
Tags [data compare](https://blog.devart.com/tag/data-compare) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [SQL Server](https://blog.devart.com/tag/sql-server) [synchronize database](https://blog.devart.com/tag/synchronize-database) [dbForge Team](https://blog.devart.com/author/dbforge)

Data Export from Amazon RDS MySQL
Instance By [dbForge Team](https://blog.devart.com/author/dbforge) July 16, 2020 [0](https://blog.devart.com/data-export-from-amazon-rds-mysql-instance.html#respond) 12507 The necessity to export data from Amazon RDS to the on-premises MySQL database may be spawned by various reasons such as a user request, an upgrade to a system, data consolidation, need for backup, etc. The fact that data are usually required yesterday and export tasks need to be repeated again and again only compounds the difficulty. This article presents a walkthrough on how to export data quickly and easily with the help of dbForge Studio for MySQL. Data Export and Import Tools for MySQL [Data export and import utilities](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) of dbForge Studio for MySQL are long-time favorites of database administrators as they allow getting export/import jobs done with a little as a few clicks of the mouse. And the cherry on the cake is a comprehensive and easy-to-follow user interface letting non-expert users successfully perform data migrations. Export MySQL data from AWS RDS In this worked example, we will export data from the Customer table on Amazon RDS to an MS Excel file. Please note, that with dbForge Studio for MySQL it is also possible to export the result of the query. To export the query result, call the Data Export wizard by right-clicking the result grid and follow the instructions. Step 1: Establish a connection To start working with the Export tool of dbForge Studio for MySQL, we need to [establish a connection to the database on Amazon RDS](https://www.devart.com/dbforge/mysql/studio/amazon-rds.html) . To add a new connection navigate Database -> New connection. In the Database Connection Properties window, make all the necessary connection settings. Step 2: Open Data Export wizard To export data from Amazon RDS to a file, first, you need to call the Data Export wizard. This can be done from the Data Pump tab on the Start Page. 
Alternatively, in Database Explorer, right-click the table you want to export data from and select the Export Data command from the context menu that appears.

Step 3: Select a file format

In the Data Export wizard, select an export format and click Next. In this tutorial, we will export MySQL data from Amazon RDS to one of the most common formats: MS Excel 2007.

Step 4: Select a source table

On the Source tab of the Data Export wizard, select the table you want to export data from and click Next.

Step 5: Set output options for the exported data

The Output settings tab lets you set export options for the output file.

Step 6: Set table grid options for the exported data

On the Options tab of the Data Export wizard, you can configure the table grid layout for the exported data:
– Border style and color
– Header text color, font, and background
– Text color, font, and background for even and odd rows separately
You can preview the result and change the settings if required.

Step 7: Select columns for export and adjust data formats

On the Data formats tab, you can select columns for export and adjust their data formats. It is possible to exclude certain columns from export and to change column names and data types.

Step 8: Set page print options

On the Page print options tab, you can select the page size, specify the orientation and margins, and add header and footer text.

Step 9: Select the rows to be exported

On the Exported rows tab of the Data Export wizard, you can select the range of rows to be exported:
– Export all rows
– Export selected rows
– Export a specified range of rows

Step 10: Specify error handling and logging options

Finally, on the Errors handling tab, you can configure how the application behaves when it encounters an error. Here you can also tune the logging options.
Step 11: Finish the export and save the template

Having completed all the steps in the wizard, click the Export button. You will see a notification about the result of your export job. An undoubted advantage of dbForge Studio for MySQL is the ability to save the export configuration for recurring export tasks. Click the Save button to save the settings either as an export template or as a batch file for further command-line operations.

Step 12: Check the result

Click the Open result file button to check the result.

Note: You can then use the file to transfer data to another MySQL database, either local or remote. To learn how to import data from a file using dbForge Studio for MySQL, please see the [tutorial](https://blog.devart.com/import-mysql-data-to-amazon-rds.html).

Conclusion

[dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) allows you to export data from Amazon RDS quickly and easily. You don't need to configure the Export wizard again and again: just save an export template and perform your export jobs in a wink. Need to schedule or automate export tasks? That's not an issue either: save the export options as a batch file and schedule it with Windows Task Scheduler. Try it out and all your doubts, if any, will vanish!
Also, you can watch this video tutorial.

Tags: [export amazon rds](https://blog.devart.com/tag/export-amazon-rds) [export aws rds](https://blog.devart.com/tag/export-aws-rds) [studio for mysql](https://blog.devart.com/tag/studio-for-mysql)

[ODBC](https://blog.devart.com/category/odbc)

Best Data Management Solutions: Features, Pros, and Cons

By [Victoria Shyrokova](https://blog.devart.com/author/victorias), March 27, 2025
Data management tools can either be your greatest ally or your biggest headache. Effective data management is the process of controlling and organizing your data assets to ensure their quality, security, and accessibility. The right data management solution simplifies the validation, storage, and processing of your data, transforming it into a reliable resource for accurate analysis, informed decision-making, and regulatory compliance.

To help you make an informed choice, we've done the research for you. In this article, we'll compare leading data management solutions, evaluating their features, cost, strengths, weaknesses, and how they work for different use cases.

Table of contents
– Types of data management software
– Best data management tools
– Choosing the right data management tool for your business
– Conclusion

Types of data management software

There's a wide range of data management software. You have Database Management Systems (DBMS) for transactional operations, Master Data Management (MDM) systems for centralized master data control, and ETL (Extract, Transform, Load) tools. You'll also find solutions for data warehousing, data modeling, data integration, and data governance. To find the top performers, we've analyzed how they handle the different aspects of your data lifecycle.

Data integration

We've prioritized data integration tools with pre-built connectors for relational databases, cloud services, API endpoints, and unstructured data stores.

Data quality & cleansing

These platforms automate the grunt work of data validation, deduplication, and standardization. We chose those with features like data profiling to identify inconsistencies, rule-based cleansing to enforce standards, and anomaly detection to flag outliers.
Data governance & compliance

There are also many data management programs for enforcing data policies, tracking lineage, and ensuring compliance with regulations like GDPR and HIPAA. Key features we looked for included data catalogs, policy management, and access controls.

Scalability & performance

We've focused on tools that ensure your data infrastructure can keep pace with your growth, so you don't hit performance walls when you need to scale. Think solutions with optimized data retrieval, minimized network latency, and efficient data type handling.

Security & access control

Security and access control are non-negotiable, so we've looked at tools that build layered defenses for your data. These tools include robust encryption, both at rest and in transit, granular role-based access controls, and detailed audit trails.

Ease of use & automation

We've also chosen data management tools that are genuinely easy to use and automate, even for non-technical users. These solutions come with drag-and-drop interfaces for data connection setup, visual query builders for complex data retrieval, and automated data type mapping, among other things.

Best data management tools

The top data management software seamlessly integrates all of the above while prioritizing scalability and ease of use, ensuring reliable data management across your entire data lifecycle.

Devart ODBC drivers

[Devart ODBC Drivers](https://www.devart.com/odbc/) provide reliable, direct connections for real-time analytics and data pipelines. They connect directly to the most popular databases, like MySQL, PostgreSQL, and Oracle, among others. You can also quickly pull data from major cloud services like Adobe Commerce, Freshbooks, and Salesforce. Plus, they're optimized for speed and work across Windows, macOS, and Linux environments, with secure connection options and straightforward setup.

Pros:
– Allow direct, fast connections to 25+ databases and more than 60 cloud services
– Provide cross-platform compatibility and work with a wide range of data integration and business intelligence tools
– Keep your data secure with encryption and OAuth 2.0 support
– Easy to install and set up
– Offer flexible pricing and a [free 30-day trial](https://www.devart.com/odbc/)

Cons:
– May require some technical know-how for advanced setups
– Need to be kept updated to stay compatible with the latest databases

Informatica

[Informatica](https://www.informatica.com/) is a great option if you're managing complex enterprise-level data pipelines that span multiple clouds and on-premises systems. Its AI automates data quality and provides robust security and compliance controls, including data cataloging, automated policy enforcement, and real-time data masking.

Pros:
– Connects natively to popular cloud platforms with pre-built integrations
– Supports advanced cloud data integration patterns
– Automates data quality tasks with AI-powered features
– Provides robust security and compliance controls

Cons:
– The user interface, while powerful, can have a steep learning curve
– Licensing and usage costs can quickly add up

Talend

[Talend](https://www.talend.com/) excels in hybrid and multi-cloud environments for data lakes, cloud migration, and real-time streaming. It offers a mix of open-source and commercial data integration solutions, which gives you flexibility depending on your needs and budget. Its visual ETL interface simplifies complex pipelines with drag-and-drop, making it easy to handle transformations and data quality tasks like profiling and cleansing.

Pros:
– Facilitates seamless integration across diverse cloud and on-premises systems
– Simplifies complex data flow design with its intuitive visual interface
– Enables thorough data quality control
– Offers adaptable deployment options with both open-source and commercial versions

Cons:
– Can get challenging to debug
– Struggles with performance bottlenecks in large-scale data processing
– Lacks dynamic schema evolution
Microsoft Azure Data Factory

[Azure Data Factory](https://azure.microsoft.com/en-us/products/data-factory) (ADF) is Microsoft's cloud-based service for building ETL pipelines and automating data workflows. It's built to work hand-in-hand with services like Azure SQL Database and Blob Storage, which is perfect for building data warehouses and analytics. Plus, it offers dynamic scaling, so it's great for large-scale data processing.

Pros:
– Connects seamlessly with the Azure ecosystem
– Scales dynamically to handle large data volumes and complex workflows
– Provides a wide range of connectors for diverse data sources
– Supports serverless execution, optimizing cost and resource utilization

Cons:
– Takes a bit of getting used to, especially if you're new to Azure
– Can get tricky when you're managing advanced transformations and complex pipelines
– Might get expensive if you're dealing with massive data volumes or running really complex pipelines

IBM InfoSphere

[IBM InfoSphere](https://www.ibm.com/information-server) is a solid choice for big data challenges, and it stands out for its advanced MDM. It uses AI to automate things like tracing data lineage and spotting quality issues, which helps keep your data clean and compliant. Plus, it's built for big data and real-time analytics, with pre-built compliance templates, so you can get insights faster and stay on top of regulations.

Pros:
– Automates data governance with AI
– Accelerates real-time analytics with in-memory processing
– Simplifies regulatory reporting with pre-built templates
– Orchestrates complex data workflows across diverse systems

Cons:
– Can be a significant investment, both in licensing and infrastructure
– May be overkill for smaller organizations or less complex data needs
– Might present compatibility challenges if you're not deeply embedded in the IBM ecosystem
SAP Data Services

[SAP Data Services](https://www.sap.com/products/technology-platform/data-services.html) is heavyweight data management software, pretty handy if you're working with SAP systems. It's built for serious ETL tasks, like transforming complex transactional data into analytical formats, and has powerful data profiling tools to spot and fix data quality issues. It connects smoothly with SAP applications, but also with other core business systems like Oracle databases or Salesforce CRM.

Pros:
– Provides deep integration with SAP applications
– Delivers advanced data profiling and cleansing
– Supports complex ETL processes
– Offers robust metadata management

Cons:
– Can be complex to implement and manage
– Might be costly, especially for non-SAP-centric organizations
– May have limited flexibility when integrating with non-SAP cloud-native data platforms

Dell Boomi

[Dell Boomi](https://boomi.com/) focuses on letting users build cloud-first integrations quickly and visually, so you'll find the drag-and-drop interface easy to use. Plus, like most modern data management solutions, it uses AI to automate data mapping and transformation, simplifying complex data flows.

Pros:
– Simplifies cloud integration with a low-code visual interface
– Automates data mapping and transformations using AI
– Supports API management and workflow automation
– Scales easily in the cloud

Cons:
– May have limitations for highly complex or custom data transformations
– Can incur costs that scale with the number of connectors and processes
– Not the best fit if you need deep, granular control over integrations

Snowflake

[Snowflake](https://www.snowflake.com/en/) is a better option if you're juggling both structured and semi-structured data. It's a cloud-first data warehouse, so it handles real-time analytics and scales nicely when workloads spike. Its pay-as-you-go pricing model is a big plus, allowing you to scale resources up or down as needed. In addition, it works across multiple cloud environments and offers high-speed data sharing capabilities.

Pros:
– Scales compute and storage independently, optimizing cost and performance
– Supports multi-cloud deployments
– Enables high-speed data sharing
– Offers a consumption-based pricing model

Cons:
– Can rack up unpredictable costs if you're not careful with how you use it
– May require optimization for complex queries to maintain performance
– Takes a bit to get the hang of

Choosing the right data management tool for your business

Think about your needs before picking a data tool. Cloud solutions scale fast, while on-premises deployment offers control, especially for sensitive data. However, the decision ultimately depends on your company. For instance, smaller businesses usually find cost-effective solutions like Devart ODBC Drivers or cloud-based platforms like Dell Boomi ideal for quick integrations. In contrast, mid-sized organizations might consider ADF for robust cloud ETL, while large enterprises (especially those with strict compliance requirements) might prefer the control of tools like IBM InfoSphere.

You also need to factor in your industry and budget. If you work at an e-commerce startup that needs rapid, direct database access for inventory management, Devart ODBC Drivers are a great starting point. But a hospital? It might need something like IBM InfoSphere for data security and compliance. Whatever you pick, make sure the tool can scale with your growth and integrates smoothly with your current tech stack. This will save you headaches down the line.

Conclusion

Choosing the right data solution is how you turn data into smart decisions. We've covered the best data management software out there, whether you're handling quick database connections, building cloud integrations, or managing enterprise-level data. However, you need to carefully evaluate your specific needs and long-term goals before selecting any data management tools.
If you're not sure where to begin, Devart offers a bunch of solutions that you can test out for free without committing. Plus, the sales team can help you quickly find what you're looking for.

Tags: [data management](https://blog.devart.com/tag/data-management) [odbc](https://blog.devart.com/tag/odbc) [odbc driver](https://blog.devart.com/tag/odbc-driver)

[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools)

How to Migrate from MySQL to Oracle: A Comprehensive Guide

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), January 18, 2024

[Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/) and MySQL stand out as the two most widely used relational database management systems globally, and both are owned by Oracle Corporation. Many organizations incorporate these systems into their workflows, which often leads to the need for data migration between them.

Migrating data from MySQL to Oracle can be a wearisome task. It involves a significant amount of manual effort, such as data export, import into the other system, and database reconfiguration. However, this task can be streamlined with the assistance of two Devart products: [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) and the ODBC driver for MySQL. Let's delve into the process in more detail.

What are the advantages of migrating from MySQL to Oracle?

Both MySQL and Oracle are viable database options for organizations, with the choice depending on specific needs and scenarios. Oracle stands out as the more robust solution, offering features unavailable in MySQL.
Migrating from MySQL to Oracle brings the following advantages:
– Enhanced performance and scalability for handling large and complex databases
– Advanced security features such as encryption, auditing, and extensive data access controls
– Data partitioning capabilities, ideal for managing large-scale data efficiently
– High flexibility for both static and dynamic environments
– Support for both SQL and PL/SQL programming languages
– Various storage options, including tablespaces and packages
– Support for multiple data models within a database, including Graph, Relational, Key-Value, and Document
– A variety of index types, such as Normal, Bitmap, Partitioned, Function-based, and Domain indexes
– The Oracle Streams feature for efficient data replication and integration
– The Oracle RAC (Real Application Clusters) feature for advanced clustering
– Integration with a wide range of other enterprise applications and systems
– Comprehensive and precise documentation and community support
– Flexible pricing models that let users select the option that best fits their needs

Thanks to Devart products, we can streamline switching from MySQL to Oracle, ensuring it is swift, straightforward, and user-friendly.

Prerequisites

In our demonstration, we will use two Devart products: dbForge Edge and the ODBC driver for MySQL. [dbForge Edge](https://www.devart.com/dbforge/edge/) is a versatile software solution comprising four dbForge Studios, each tailored to a specific database management system (MySQL/MariaDB, SQL Server, Oracle, and PostgreSQL). Edge is particularly beneficial for organizations working on multiple projects across various RDBMSs, as it offers a comprehensive toolset to efficiently handle database tasks in each system.

The [ODBC driver for MySQL](https://www.devart.com/odbc/mysql/) is a specialized connectivity solution that provides direct and secure access to real-time data in MySQL, MariaDB, Azure Database for MySQL, and Amazon RDS for MySQL.
It does not require any additional client software and ensures full compliance with the ODBC API and data types. In our scenario, these two solutions enable us to migrate data directly from the MySQL database into the Oracle database.

Importing data into Oracle using ODBC drivers

The ability to import data directly from various sources, including ODBC, is one of the benefits offered by dbForge Studio for Oracle (and the Studios for other DBMSs). Importing data via ODBC lets users retrieve data directly from various storages on different platforms and simplifies tasks related to database objects, reports, ETL processes, and so on. Devart currently offers ODBC drivers for 25 databases and 50+ cloud platforms.

To start, we need to install and configure the [ODBC driver for MySQL](https://www.devart.com/odbc/mysql/), as it will allow us to connect to the MySQL database directly from dbForge Studio for Oracle.

Install and configure the ODBC driver for MySQL

Devart offers ODBC drivers for all major operating systems. In our case, we'll use the ODBC driver for Windows and illustrate the data migration process from MySQL to Oracle on Windows.

1. [Download the MySQL ODBC driver](https://www.devart.com/odbc/mysql/download.html) from the official page and install it as described in the [instructions](https://blog.devart.com/installing-odbc-driver-and-creating-data-source-on-windows.html); it only takes a couple of clicks.
2. After the installation, open the ODBC Data Source Administrator and go to the System DSN tab. Click Add.
3. Select Devart ODBC Driver for MySQL and click Finish.
4. Connect to the MySQL database by entering the required details in the driver configuration window (the data source name, the server name, the user ID, the password, and the target database). You can click Test Connection to check that everything is fine.
5. Click OK to save your settings.

Now we can proceed to the data import.
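Once the System DSN exists, it can be used from any ODBC-aware tool or language, not only from dbForge Studio. As an aside, here is what that looks like from Python with the third-party pyodbc module; the DSN name and credentials below are placeholders, and `connect_via_dsn` is a helper invented for this example.

```python
# Using a configured ODBC System DSN from Python (illustrative; the DSN
# name and credentials are placeholders). pyodbc is a third-party ODBC
# binding and must be installed separately.

def dsn_connection_string(dsn, uid, pwd):
    """Build the ODBC connection string for a configured System DSN."""
    return f"DSN={dsn};UID={uid};PWD={pwd}"

def connect_via_dsn(dsn, uid, pwd):
    import pyodbc  # imported lazily so the string helper works without it
    return pyodbc.connect(dsn_connection_string(dsn, uid, pwd))
```

The DSN keeps the server address and driver details out of application code: only the data source name and credentials travel in the connection string.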
Import data from MySQL to Oracle via ODBC

Launch dbForge Studio for Oracle; it is available as stand-alone software and is also provided as part of dbForge Edge. If you are going to work with your data in Oracle, this Studio offers a complete set of features and options for database development, management, and administration tasks. You can import the data into any database, provided the user has the necessary rights. In our example, we import the data into the sakila database.

1. Start the data import: select Database > Import Data.
2. In the Sources section, select ODBC and click Next.
3. In the ODBC options section, select our ODBC driver for MySQL as the data source and provide the username and password to connect to the MySQL server from the Studio. Click Next.
4. In the Destination section, select the table in the MySQL database to import data from in the left Source pane. The target table is in the right pane; we can import data into a new or an existing table. In our example, we'll import data into a new table, Active_Customers. This table is present in the sakila MySQL database but absent from the sakila Oracle database. If necessary, you can apply a custom query to import a portion of the data according to some criteria: click the Custom Query button and enter the query. After that, click Next.
5. Check the data formats and mapping settings. By default, dbForge Studio for Oracle maps the columns itself for the new table. Click Next.
6. In the Modes section, select the import mode. For data import into a new table, the Studio uses the default Append method. For importing data into an existing table, you can choose another mode that suits your requirements, such as Update, Delete, or Repopulate. Click Next.
7. In the Output section, decide how to import the data. The Studio can import the data into the database directly (the default choice) or generate SQL scripts to perform this operation.
If you choose the Open the data import script… option, the Studio will present the script in the internal editor without making any changes to the target database. This option is helpful if you face a complex data migration task that involves compiling data from several tables or databases: dbForge Studio for Oracle will help you create the script for such a task and can execute it for you if necessary. Finally, specify the desired error handling behavior in the corresponding section.

The task is configured, and we can import the data immediately. Click the Import button. As you can see, a new Active_Customers table is added to the sakila database in Oracle, and all data is transferred successfully from the MySQL database. If you choose the option to just generate the data import script, you will see that script in the built-in editor after clicking Import.

This way, we have imported the data from a specific table in a MySQL database into the Oracle database. The same task can be performed for multiple tables and views, and for multiple databases. Besides, dbForge Studio for Oracle allows you to automate this task and save any task settings as templates for recurring jobs.

Conclusion

Utilizing dbForge Studio for Oracle in conjunction with an ODBC driver for a specific data source provides a potent and adaptable approach to data migration from other platforms to Oracle. We can import data directly into a different database management system (DBMS) or create a data import script to use and modify for specific needs. Our case demonstrated how to successfully [migrate a MySQL database to another server](https://www.devart.com/dbforge/mysql/studio/migrate-database.html). However, the solution is capable of configuring and performing migrations between various data sources and databases in all leading database management systems.
If your work processes require such regular data migrations, you might utilize dbForge Edge as the most universal solution, covering all major database management systems. The [fully functional free trial of dbForge Edge](https://www.devart.com/dbforge/edge/download.html) is available for 30 days. The trial allows users to thoroughly test and fine-tune all database-related tasks, regardless of the size and number of databases across different systems, under actual workload conditions.

Tags: [data import](https://blog.devart.com/tag/data-import) [data migration](https://blog.devart.com/tag/data-migration) [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle)
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to)

Data Type Mapping in Data Access Components for Delphi

By [DAC Team](https://blog.devart.com/author/dac), July 5, 2012

Data Type Mapping is a flexible and easily customizable mechanism that allows mapping between database types and Delphi field
types. This article provides several examples that apply to all supported databases; to show the universality of Data Type Mapping, a separate database is used for each example.

Data Type Mapping Rules

In versions without Data Type Mapping support, the [DAC products](https://www.devart.com/dac.html) automatically set the correspondence between database data types and Delphi field types. In versions with Data Type Mapping support, this correspondence can be set manually. Consider the numeric type in the following table of a PostgreSQL database:

```sql
CREATE TABLE numeric_types
(
  id integer NOT NULL,
  value1 numeric(4,0),
  value2 numeric(10,0),
  value3 numeric(15,0),
  value4 numeric(5,2),
  value5 numeric(10,4),
  value6 numeric(15,6),
  CONSTRAINT pk_numeric_types PRIMARY KEY (id)
)
```

Data Type Mapping should be used so that:

- numeric fields with Scale = 0 are mapped to one of TSmallintField, TIntegerField, or TLargeintField, depending on Precision;
- to preserve precision, numeric fields with Precision >= 10 and Scale <= 4 are mapped to TBCDField;
- numeric fields with Scale >= 5 are mapped to TFMTBCDField.
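These mapping goals can be sketched outside Delphi. The following Python sketch (illustrative function name, not part of the Devart DAC API) models how precision and scale select a target field type under the rules just listed:

```python
# Sketch of the precision/scale-based mapping described above.
# The rule boundaries mirror the article's goals; map_numeric is an
# illustrative helper, not part of the Devart DAC API.

def map_numeric(precision: int, scale: int) -> str:
    """Return the Delphi field type for a PostgreSQL numeric(p, s)."""
    if scale == 0:
        if precision <= 4:
            return "ftSmallint"   # fits TSmallintField
        if precision <= 10:
            return "ftInteger"    # fits TIntegerField
        return "ftLargeint"       # fits TLargeintField
    if scale <= 4:
        if precision >= 10:
            return "ftBCD"        # TBCDField keeps exact decimals
        return "ftFloat"
    return "ftFMTBCD"             # TFMTBCDField for high-scale values

# The six columns from the numeric_types table:
for p, s in [(4, 0), (10, 0), (15, 0), (5, 2), (10, 4), (15, 6)]:
    print(f"numeric({p},{s}) -> {map_numeric(p, s)}")
```

Running the loop reproduces the destination column of the mapping table that follows.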
The above in table form:

| PostgreSQL data type | Default Delphi field type | Destination Delphi field type |
| --- | --- | --- |
| numeric(4,0) | ftFloat | ftSmallint |
| numeric(10,0) | ftFloat | ftInteger |
| numeric(15,0) | ftFloat | ftLargeint |
| numeric(5,2) | ftFloat | ftFloat |
| numeric(10,4) | ftFloat | ftBCD |
| numeric(15,6) | ftFloat | ftFMTBCD |

To specify that numeric fields with Precision <= 4 and Scale = 0 must be mapped to ftSmallint, set the following rule:

```delphi
var
  DBType: Word;
  MinPrecision: Integer;
  MaxPrecision: Integer;
  MinScale: Integer;
  MaxScale: Integer;
  FieldType: TFieldType;
begin
  DBType := pgNumeric;
  MinPrecision := 0;
  MaxPrecision := 4;
  MinScale := 0;
  MaxScale := 0;
  FieldType := ftSmallint;
  PgConnection.DataTypeMap.AddDBTypeRule(DBType, MinPrecision, MaxPrecision, MinScale, MaxScale, FieldType);
end;
```

This detailed form is shown for maximum clarity. Usually, rules are written much more concisely, e.g.:

```delphi
// clear existing rules
PgConnection.DataTypeMap.Clear;
// rule for numeric(4,0)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 0, 4, 0, 0, ftSmallint);
// rule for numeric(10,0)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 5, 10, 0, 0, ftInteger);
// rule for numeric(15,0)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 11, rlAny, 0, 0, ftLargeint);
// rule for numeric(5,2)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 0, 9, 1, rlAny, ftFloat);
// rule for numeric(10,4)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 10, rlAny, 1, 4, ftBCD);
// rule for numeric(15,6)
PgConnection.DataTypeMap.AddDBTypeRule(pgNumeric, 10, rlAny, 5, rlAny, ftFMTBcd);
```

Defining Data Type Mapping Rules at Design Time

In addition to setting Data Type Mapping at run time, rules can also be set at design time through a convenient user interface.

Rules Order

When setting rules, a situation can occur where two or more contradictory rules are set for one
type in the database. In this case, only one rule is applied: the first matching one. For example, there is a table in an Oracle database:

```sql
CREATE TABLE NUMBER_TYPES
(
  ID NUMBER NOT NULL,
  VALUE1 NUMBER(5,2),
  VALUE2 NUMBER(10,4),
  VALUE3 NUMBER(15,6),
  CONSTRAINT PK_NUMBER_TYPES PRIMARY KEY (id)
)
```

TBCDField should be used for NUMBER(10,4), and TFMTBCDField for NUMBER(15,6), instead of the default fields:

| Oracle data type | Default Delphi field type | Destination field type |
| --- | --- | --- |
| NUMBER(5,2) | ftFloat | ftFloat |
| NUMBER(10,4) | ftFloat | ftBCD |
| NUMBER(15,6) | ftFloat | ftFMTBCD |

If the rules are set in the following way:

```delphi
OraSession.DataTypeMap.Clear;
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, 9, rlAny, rlAny, ftFloat);
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, rlAny, 0, 4, ftBCD);
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, rlAny, 0, rlAny, ftFMTBCD);
```

the result is:

| Oracle data type | Delphi field type |
| --- | --- |
| NUMBER(5,2) | ftFloat |
| NUMBER(10,4) | ftBCD |
| NUMBER(15,6) | ftFMTBCD |

But if the rules are set in the reverse order:

```delphi
OraSession.DataTypeMap.Clear;
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, rlAny, 0, rlAny, ftFMTBCD);
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, rlAny, 0, 4, ftBCD);
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, 9, rlAny, rlAny, ftFloat);
```

the result is:

| Oracle data type | Delphi field type |
| --- | --- |
| NUMBER(5,2) | ftFMTBCD |
| NUMBER(10,4) | ftFMTBCD |
| NUMBER(15,6) | ftFMTBCD |

This happens because the rule

```delphi
OraSession.DataTypeMap.AddDBTypeRule(oraNumber, 0, rlAny, 0, rlAny, ftFMTBCD);
```

is applied to NUMBER fields whose Precision is anything from 0 up and whose Scale is anything from 0 up, a condition met by every NUMBER field. When using Data Type Mapping, the first matching rule is found for each type and used for mapping.
In the second example, the first rule set turns out to be the first matching rule for all three types, so the ftFMTBCD type is used for all fields in Delphi. Going back to the first example: the first matching rule for NUMBER(5,2) is the first rule, for NUMBER(10,4) the second, and for NUMBER(15,6) the third, so the expected result is obtained. Remember that if two or more contradictory rules are set for one database type, the rules are applied in the specified order.

Defining Rules for Connection and Dataset

Data Type Mapping allows setting rules for the whole connection as well as for each DataSet in the application. For example, the following table is created in SQL Server:

```sql
CREATE TABLE person
(
  id INT NOT NULL,
  firstname VARCHAR(20) NULL,
  lastname VARCHAR(30) NULL,
  gender_code VARCHAR(1) NULL,
  birth_dttm DATETIME NULL,
  CONSTRAINT pk_person PRIMARY KEY CLUSTERED (id ASC) ON [PRIMARY]
)
GO
```

It is known that the birth_dttm field contains a birth date, so this field should be ftDate in Delphi, not ftDateTime. If the rule is set as:

```delphi
MSConnection.DataTypeMap.Clear;
MSConnection.DataTypeMap.AddDBTypeRule(msDateTime, ftDate);
```

all DATETIME fields in Delphi will have the ftDate type, which is incorrect: ftDate was expected for the DATETIME type only when working with the person table. In this case, Data Type Mapping should be set not for the whole connection, but for a particular DataSet:

```delphi
MSQuery.DataTypeMap.Clear;
MSQuery.DataTypeMap.AddDBTypeRule(msDateTime, ftDate);
```

Or consider the opposite case: DATETIME is used in the application only to store dates, and only one table stores both date and time.
In this case, the correct rule setting is:

```delphi
MSConnection.DataTypeMap.Clear;
MSConnection.DataTypeMap.AddDBTypeRule(msDateTime, ftDate);
MSQuery.DataTypeMap.Clear;
MSQuery.DataTypeMap.AddDBTypeRule(msDateTime, ftDateTime);
```

Now all DataSets create fields of the ftDate type for DATETIME, while MSQuery creates ftDateTime fields. The point is that rules set for a DataSet have a higher priority than rules set for the whole connection. This allows flexible and convenient configuration of Data Type Mapping for the whole application: there is no need to set the same rules for each DataSet, since general rules can be set once for the whole connection, and a DataSet that needs individual Data Type Mapping can get its own rules.

Rules for a Particular Field

Sometimes a rule is needed not for the whole connection or dataset, but only for a particular field. For example, there is the following table in a MySQL database:

```sql
CREATE TABLE item
(
  id INT NOT NULL AUTO_INCREMENT,
  name CHAR(50) NOT NULL,
  guid CHAR(38),
  PRIMARY KEY (id)
) ENGINE=MyISAM;
```

The guid field contains a unique identifier, which for convenient work should be mapped to the TGuidField type in Delphi. But there is a problem: if the rule is set like this:

```delphi
MyQuery.DataTypeMap.Clear;
MyQuery.DataTypeMap.AddDBTypeRule(myChar, ftGuid);
```

then both the name and guid fields will have the ftGuid type in Delphi, which is not what was planned. In this case, the only way is to use Data Type Mapping for a particular field:

```delphi
MyQuery.DataTypeMap.AddFieldNameRule('guid', ftGuid);
```

In addition, remember that rules set for particular fields have the highest priority: if a rule is set for a particular field, all other rules in the Connection or DataSet are ignored for this field.
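The precedence just described (field-name rule over DataSet rule over connection rule, first match wins within each level) can be summarized in a small Python sketch. The `resolve` helper and the data structures are illustrative only, not the Devart DAC API:

```python
# First-match rule resolution with the priority order described above:
# field-name rule > DataSet rule > connection rule.
# resolve() is an illustrative model, not part of the Devart DAC API.

def resolve(field_name, db_type, field_rules, dataset_rules,
            connection_rules, default):
    # A rule set for a particular field overrides everything else.
    if field_name in field_rules:
        return field_rules[field_name]
    # Otherwise scan DataSet rules, then connection rules, in order;
    # the first rule matching the DB type wins.
    for rules in (dataset_rules, connection_rules):
        for rule_db_type, field_type in rules:
            if rule_db_type == db_type:
                return field_type
    return default  # fall back to the driver's default mapping

# The person-table scenario: the connection maps DATETIME to ftDate,
# while one query overrides it back to ftDateTime.
connection_rules = [("msDateTime", "ftDate")]
msquery_rules = [("msDateTime", "ftDateTime")]

# Any other dataset: only the connection rule applies.
print(resolve("birth_dttm", "msDateTime", {}, [], connection_rules, "ftDateTime"))
# MSQuery: its own rule takes precedence over the connection rule.
print(resolve("birth_dttm", "msDateTime", {}, msquery_rules, connection_rules, "ftDateTime"))
```

The same function also models the field-name case: a `{"guid": "ftGuid"}` entry wins over any type-based rule for that field.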
Ignoring Conversion Errors

Data Type Mapping allows mapping between various types, and sometimes the data stored in the database cannot be converted to the Delphi field type specified in the mapping rules (or vice versa). In this case, an error occurs, informing that the data cannot be mapped to the specified type. For example:

| Database value | Destination field type | Error |
| --- | --- | --- |
| 'text value' | ftInteger | String cannot be converted to Integer |
| 1000000 | ftSmallint | Value is out of range |
| 15.1 | ftInteger | Cannot convert float to integer |

However, when setting Data Type Mapping rules, data conversion errors can be ignored:

```delphi
IBCConnection.DataTypeMap.AddDBTypeRule(ibcVarchar, ftInteger, True);
```

Here a fully correct conversion is impossible, but because conversion errors are ignored, Data Type Mapping tries to return values that can be assigned to the Delphi fields or database fields, depending on the direction of conversion:

| Database value | Destination field type | Result | Description |
| --- | --- | --- | --- |
| 'text value' | ftInteger | 0 | 0 is returned if the text cannot be converted to a number |
| 1000000 | ftSmallint | 32767 | 32767 is the maximum value that can be assigned to the Smallint type |
| 15.1 | ftInteger | 15 | 15.1 is truncated to an integer value |

Therefore, ignoring conversion errors should be used only when the conversion results are predictable.

UniDAC and Data Type Mapping

When using [UniDAC](https://www.devart.com/unidac/), a hard-to-solve situation often occurs where two similar database types have different Delphi types. For clarity, consider a project that works with two databases: Oracle and SQL Server.
The following table is created in each database.

Oracle:

```sql
CREATE TABLE ITEM_INFO
(
  ID NUMBER NOT NULL,
  CODE VARCHAR2(10) NOT NULL,
  DESCRIPTION NVARCHAR2(250),
  CONSTRAINT PK_ITEM_INFO PRIMARY KEY (id)
)
```

SQL Server:

```sql
CREATE TABLE item_info
(
  id INT NOT NULL,
  code VARCHAR(10) NOT NULL,
  description NVARCHAR(250) NULL,
  CONSTRAINT pk_item_info PRIMARY KEY CLUSTERED (id ASC) ON [PRIMARY]
)
GO
```

The problem is that, when working with Oracle with the UseUnicode option enabled, both the CODE and DESCRIPTION fields have the ftWideString type, and with UseUnicode disabled, both fields have the ftString type. For SQL Server, the CODE field is always ftString and the DESCRIPTION field is always ftWideString. The problem becomes especially sharp when creating persistent fields, because an error will always occur with one of the providers. Formerly, the only way to avoid the error was to give up persistent fields in such situations. Now the problem can be solved rather easily. Data Type Mapping can be set for the Oracle provider:

```delphi
UniConnection.DataTypeMap.Clear;
UniConnection.DataTypeMap.AddDBTypeRule(oraVarchar2, ftString);
UniConnection.DataTypeMap.AddDBTypeRule(oraNVarchar2, ftWideString);
```

Or for SQL Server:

```delphi
// for UseUnicode = True in the Oracle data provider
UniConnection.DataTypeMap.Clear;
UniConnection.DataTypeMap.AddDBTypeRule(msVarchar, ftWideString);
```

or:

```delphi
// for UseUnicode = False in the Oracle data provider
UniConnection.DataTypeMap.Clear;
UniConnection.DataTypeMap.AddDBTypeRule(msNVarchar, ftString);
```

Best wishes from Devart! This post aimed to describe the main advantages the new Data Type Mapping engine provides to application developers. A great deal of effort and time went into making this engine flexible and convenient.
We hope developers will appreciate it and that it makes application development quick and easy.

4 COMMENTS

Todd, June 8, 2013 at 2:24 pm
Hello there! Would you mind if I share your blog with my facebook group? There's a lot of people that I think would really appreciate your content. Please let me know.
Thank you

AndreyP, October 24, 2013 at 10:42 am
Hello, Todd! Please feel free to share our blog. We are glad it can be useful.

MartinT, May 21, 2014 at 10:16 am
Hi, you mention constants like pgNumeric, oraNumber, myChar etc. in the code. Where are those constants defined?

DAC Team, May 21, 2014 at 12:14 pm
They are defined in the OraDataTypeMap.pas file for ODAC, PgDataTypeMap.pas for PgDAC, and MyDataTypeMap.pas for MyDAC. In UniDAC these files are named OraDataTypeMapUni.pas, PgDataTypeMapUni.pas, and MyDataTypeMapUni.pas.

Comments are closed.

Data Versus Metadata: An In-Depth Exploration
By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander), June 17, 2023

In today's data-centric society, the quantity and diversity of information are expanding at an unprecedented pace. From social media interactions and financial transactions to scientific research and industrial operations, vast amounts of data are being generated every second. However, amidst this ever-growing sea of data, a crucial distinction must be made between the raw data itself and the underlying metadata that accompanies it. In this article, we delve deeper into the dichotomy between data and metadata, exploring their distinct roles, interdependence, and significance within the realm of data management.

Contents: What is data? · Types of data · The vital role of data · What is metadata? · Types of metadata · Key differences between data and metadata · Role of metadata in data management · dbForge Documenter for metadata management · Conclusion

What is data?
Data refers to raw facts, observations, measurements, or representations of information in various formats, such as numbers, text, images, audio, or video. It serves as the foundation of knowledge and is essential for making informed decisions, conducting research, and gaining insights across various domains. At its core, data is unprocessed and lacks inherent meaning or context; it represents discrete pieces of information that can be collected from numerous sources. While data itself may appear fragmented and disconnected, its true value lies in the insights and knowledge that can be extracted from it through analysis, interpretation, and contextualization. By applying appropriate techniques and tools, data can be transformed into actionable information, enabling organizations, researchers, and individuals to gain a deeper understanding of patterns, trends, correlations, and relationships within it. In other words, data is the raw material for information and knowledge: the building blocks that, when properly processed, organized, and analyzed, provide valuable insights and contribute to decision-making, problem-solving, and innovation across various fields.

Types of data

Data can be classified in various ways depending on its characteristics, structure, representation, and the context at hand. Let us look at the most general classification, the one used in statistics.

Categorical data, also known as qualitative data, represents characteristics, attributes, or categories rather than numerical values. It provides descriptive information about different groups or categories and is often expressed using labels or names. Categorical data is typically non-numeric and cannot be ordered or measured on a numerical scale. It can be further classified into two subtypes:

Nominal data represents categories without any inherent order or ranking.
Each data point is assigned to a specific category or label, and the categories are mutually exclusive. Examples of nominal data include gender (male/female), marital status (single/married/divorced), nationality, or types of fruit (mango, kiwi, orange).

Ordinal data represents categories that have a natural order or ranking. While the categories in ordinal data can be ranked, the differences between them may not be uniform: the relative order or position of each category is significant, but the magnitude of the differences between categories is not well-defined. Examples of ordinal data include satisfaction ratings (very satisfied, satisfied, neutral, dissatisfied, very dissatisfied) or educational levels (elementary, middle school, high school, college, postgraduate).

Numerical data, also known as quantitative data, consists of numerical values or measurements. It represents quantities or amounts and can be subjected to mathematical operations and quantitative analysis. Numerical data is typically collected through measurements, observations, or counting, and can be further classified into two subtypes:

Discrete data represents values that are separate and distinct. It consists of whole numbers or integers and cannot take on intermediate values; it often arises from counting or enumerating. Examples of discrete data include the number of students in a classroom, the number of cars in a parking lot, or the number of books on a shelf.

Continuous data represents values that can take on any value within a given range or interval. It can be measured on a continuous scale and often involves fractional or decimal values, obtained through measurements or observations. Examples of continuous data include height, weight, temperature, time, or distance.

The vital role of data

Data plays a pivotal role in our rapidly evolving world, where information holds the key to success.
From businesses and research institutions to governments and individuals, the importance of data cannot be overstated.

- Informed decision-making: Knowledge derived from data analysis helps organizations make informed decisions, identify opportunities, mitigate risks, and gain a competitive edge.
- Scientific advancements: Data drives scientific research, enabling discoveries, advancements, and evidence-based conclusions in various fields.
- Innovation: By harnessing the power of data, organizations can uncover new possibilities, identify emerging trends, and develop groundbreaking products and services.
- Resource allocation: Data helps allocate resources effectively, leading to better resource management and cost reduction.
- Economic growth: Data serves as a catalyst for economic growth, enabling businesses and governments to identify opportunities, make informed investments, and drive prosperity.
- Competitive advantage: Embracing the power of data gives organizations and individuals a competitive edge.

It is important to recognize that the value of data lies not only in its abundance but also in its quality and integrity. Data must be accurate, reliable, and relevant to generate meaningful insights and drive effective decision-making.

What is metadata?

Metadata refers to “data about data” and provides additional information about the characteristics, context, and structure of a particular dataset, document, or information resource. It describes various attributes of data, such as its origin, format, content, location, quality, and relationships with other data elements. Metadata enhances the understanding, management, and usability of data by providing crucial details that help in its organization, discovery, interpretation, and preservation.

Types of metadata

There are several types of metadata.

Descriptive metadata provides information about the content, meaning, and context of the data.
It includes details such as titles, summaries, keywords, abstracts, and subject classifications, and helps in locating and identifying relevant data resources.

Structural metadata describes the organization, arrangement, and relationships between different components of a dataset or information resource. It provides information on how the data is structured, including the hierarchy, sequence, and interdependencies of its elements.

Administrative metadata contains information related to the administrative aspects of data management: data ownership, access rights, security, versioning, provenance, and data management policies. It helps ensure proper governance and accountability for data resources.

Technical metadata describes the technical characteristics of data, including its file format, size, encoding, resolution, compression, and software dependencies. It helps in understanding the technical requirements and capabilities for accessing, processing, and preserving the data.

Metadata plays a crucial role in various domains, including libraries, archives, scientific research, digital asset management, and data-intensive industries. It facilitates data integration, discovery, interoperability, and reuse. Effective metadata management ensures data quality, facilitates data sharing, and supports accurate interpretation and analysis of data.

Key differences between data and metadata

Data and metadata are two distinct concepts that play integral roles in managing and understanding information, and understanding their key differences is essential for effective data management and interpretation.

Nature and content: Data refers to the raw facts, measurements, observations, or representations collected or generated in various formats.
It consists of the actual information being captured, such as numbers, text, images, or audio. Metadata, on the other hand, represents information about the data: it provides context, describes attributes, and adds meaning, helping to understand the characteristics, structure, relationships, and other properties associated with the data.

Purpose and function: Data serves as the primary source of information and provides the substance for analysis, interpretation, and decision-making. It is the core material that needs to be processed and understood to extract insights and knowledge. Metadata, in turn, serves as a supporting framework for organizing, managing, and interpreting the data, providing additional information that enhances its usability, discoverability, and interpretation.

Representation: Data represents the actual information or content being conveyed, whether numerical values, text strings, images, or any other form of digital representation. Metadata represents the attributes and characteristics of the data, describing aspects such as the data source, format, structure, relationships, creation date, and other contextual details.

Relationship: Data exists independently and can stand alone as individual pieces of information. Metadata, in contrast, is inherently linked to the corresponding data: it describes the data, establishing a connection between the metadata and the underlying data.

Usage: Data is used for analysis, decision-making, research, and other purposes specific to the domain or application. Metadata is used to facilitate data management, discovery, interpretation, and understanding, aiding in locating, organizing, and interpreting the data effectively.

Role of metadata in data management

Metadata plays a critical role in data management, providing essential information about the characteristics, context, and structure of data.
It serves as a guiding framework that enhances the understanding, organization, and usability of data resources. Here are some key roles metadata plays in effective data management.

Data discovery and identification: Metadata helps users discover and identify relevant data resources. By providing descriptive information about the data, such as titles, keywords, summaries, or subject classifications, metadata enables users to search for, locate, and assess the suitability of data for their specific needs.

Data organization and structure: Metadata aids in organizing and structuring data. It describes the relationships, dependencies, and hierarchy of data elements, helping to establish a logical and coherent structure that facilitates efficient data storage, retrieval, and integration.

Data quality and integrity: Metadata includes information about data quality, validation checks, and provenance, helping assess the accuracy, completeness, and reliability of data resources. By documenting data quality measures and validation processes, metadata supports data governance, data quality assurance, and data cleansing initiatives.

Data interpretation and contextualization: Metadata provides context and meaning to data, aiding in its interpretation and understanding. By capturing information about data sources, formats, units of measurement, and applied transformations, metadata helps users interpret data correctly and apply appropriate analysis techniques.

Data access and security: Metadata includes information about access rights, security measures, and data usage restrictions. It helps manage data access permissions, ensuring that sensitive or confidential data is appropriately protected, and assists in maintaining data privacy and compliance with regulatory requirements.

Data integration and interoperability: Metadata facilitates data integration by specifying data mappings, transformations, and standards.
It enables the harmonization and interoperability of diverse data sources by providing information about data formats, data models, and data exchange protocols, ensuring that different data sources can be combined and utilized effectively.

Data lifecycle management: Metadata supports data lifecycle management by documenting the history, versions, and evolution of data resources. It aids in tracking changes, managing data updates, and preserving data provenance, and provides information about data retention policies, archival procedures, and disposal guidelines.

Collaboration and knowledge sharing: Metadata promotes collaboration and knowledge sharing by enabling data discovery, understanding, and reuse. It assists in sharing data across teams, departments, or organizations by providing clear descriptions and standardized metadata formats, fostering efficient collaboration, accelerating research, and encouraging data sharing practices.

dbForge Documenter for metadata management

To unlock the true value of your data, it is crucial to have proper names, descriptions, and classification. Without effective metadata management, your data may become difficult to use or even worthless. That's where [dbForge Documenter for SQL Server](https://www.devart.com/dbforge/sql/documenter/) comes in. dbForge Documenter allows you to describe your database schema with metadata, including tables, columns, and relationships. By documenting and organizing this vital information, you can gain a clear understanding of your data model. With dbForge Documenter, you can visualize the data model, making it easier to comprehend and navigate. The tool enables you to create comprehensive documentation that can be shared with everyone in your organization, empowering both technical and non-technical users to engage in self-service data discovery and analysis.
dbForge Documenter excels at preserving and documenting the metadata of all your database objects. It captures and retains crucial information such as descriptions, properties, and other attributes associated with each object within your database. You can learn more about the features and capabilities of dbForge Documenter for SQL Server [here](https://docs.devart.com/documenter-for-sql-server/).

Conclusion

By recognizing the importance of metadata, organizations can unlock the full potential of their data. Metadata management enables data discovery, organization, quality assurance, and interpretation. It facilitates collaboration, data integration, and knowledge sharing, empowering users to harness the power of their data resources. To simplify and streamline metadata management, we recommend trying dbForge Documenter. With its comprehensive features for capturing, documenting, and preserving metadata, dbForge Documenter is a valuable tool for any organization seeking to enhance its data management practices. Take advantage of the opportunity to [download and try dbForge Documenter for a 30-day free trial](https://www.devart.com/dbforge/sql/documenter/download.html).

By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander): Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.
Database Normalization in SQL: Key Steps, Benefits, and Examples
By [Victoria Shyrokova](https://blog.devart.com/author/victorias), May 5, 2025
Dealing with a disorganized database often means running into duplicate records, mistakes, and sluggish performance. On the other hand, when your data is structured properly, it's easier to maintain, more reliable, and much faster to work with. That's what database normalization does: it cleans up the structure so everything runs smoother and smarter. Let's find out what normalization is and why it's so essential.

Table of contents
- What is database normalization?
- Key benefits of database normalization
- The process of database normalization
- Database normalization vs. denormalization
- Normalization in SQL (including SQL Server)
- Common mistakes in database normalization
- Best practices for database normalization
- How dbForge Edge can help with database normalization
- Conclusion

What is database normalization?

Database normalization is basically about organizing your data so that it actually makes sense. Instead of cramming everything into one oversized, chaotic table, normalization breaks things down into smaller, connected tables, like sorting your stuff into labeled drawers instead of tossing it all into one big junk drawer. This approach not only saves space but also makes it much easier to update or delete information without messing things up. It helps keep your data clean, consistent, and accurate over time. Plus, a well-normalized database makes queries faster and reduces the chances of running into errors or duplicates. In short, normalization helps you build a database that's efficient, reliable, and much less of a headache to manage.

Key principles of database normalization

Database normalization follows a few straightforward but essential rules to keep your data clean and easy to work with. First up: each column should have atomic values. That just means every column should hold one single piece of information: not a list, not a mix, just one clear value. Next, there's the idea of removing partial dependencies.
In tables that use a composite key (a key made of more than one column), every other column should depend on all parts of that key, not just a piece of it. Finally, normalization helps reduce transitive dependencies. This happens when one column depends on another column that isn't a key. To avoid confusion and inconsistency, each column in a well-structured table should depend directly on the main key, not on another non-key column.

Key benefits of database normalization

Keeping your data consistent. Imagine entering a customer's name once and having it correct everywhere. That's what normalization helps with: no more confusing mix-ups or different versions of the same info.

Cutting down on duplicate data. Instead of storing the same data over and over (like someone's phone number in five different places), normalization keeps it in one spot. Less mess, more space.

Making updates easier. If you ever need to change something (like updating a product price), you only have to do it once. Everything stays neat and up to date without chasing down errors.

The process of database normalization

When you build a database, it's important to keep it organized so you don't run into problems like duplicate info, messy updates, or confusing reports later. A step-by-step process helps clean up your database by breaking it into smaller, more focused tables.

First Normal Form (1NF)

This step is about making sure that each field contains only one piece of information. Imagine you're keeping track of students and the courses they take. If you write down "Math, Science" in one box under "Courses," that's a no-go. 1NF says: list each course separately, with one row for each course the student takes. That way, everything stays clear and easy to read, and even beginners can pull it off.

Second Normal Form (2NF)

Once your data follows 1NF, the next step is to remove repeated info that only depends on part of the record.
Say you've got a student's name written every time you list a course they're taking. Since their name has nothing to do with the course itself, you don't need to keep writing it over and over. Instead, move the student's personal info (like name or email) to a separate list, and just reference it when you need it. That's 2NF: cutting out repetition by grouping related info together.

Third Normal Form (3NF)

Now it's time to get even more specific. In 3NF, we make sure every piece of data depends directly on the main subject of the table, not on something else. For example, if you're storing someone's ZIP code and their city, but the city can be figured out just from the ZIP code, then the city doesn't really belong there. You can move that part elsewhere. This helps avoid mistakes like someone accidentally typing the wrong city for a given ZIP code.

Boyce-Codd Normal Form (BCNF)

Sometimes, 3NF isn't enough to avoid redundancy. Dependencies that involve non-primary attributes can still cause repetition of data, even though every piece of data depends on the main subject of the table. For instance, suppose you have a table that stores customer addresses: customer ID (the primary key), city, and ZIP code. Each customer has exactly one ZIP code and city, and a ZIP code cannot belong to different cities. This table satisfies 3NF, since all non-prime attributes (City, ZIP code) depend directly on the primary key (ID). However, ZIP code determines City, and ZIP code is not a candidate key (a minimal set of columns that uniquely identifies each row in a table); only ID is. As a result, we get redundancy: every time a customer from New York is added, you have to type "New York" again for ZIP code 10001. And if the city name ever changes (say, due to administrative updates), you need to update multiple rows.
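In SQL terms, the fix is to split the address table so that ZIP code becomes the key of the table it determines. Here is a minimal sketch; the table and column names are illustrative, not taken from a specific product:

```sql
-- Before (3NF but not BCNF): ZIPCode determines City, yet ZIPCode is not a key
CREATE TABLE CustomerAddresses (
    CustomerID INT PRIMARY KEY,
    City VARCHAR(50),
    ZIPCode VARCHAR(10)
);

-- After (BCNF): every determinant is now a key in its own table
CREATE TABLE ZipCities (
    ZIPCode VARCHAR(10) PRIMARY KEY,  -- ZIPCode determines City, and here it is the key
    City VARCHAR(50) NOT NULL
);

CREATE TABLE CustomerZips (
    CustomerID INT PRIMARY KEY,
    ZIPCode VARCHAR(10) NOT NULL,
    FOREIGN KEY (ZIPCode) REFERENCES ZipCities(ZIPCode)
);
```

With this layout, "New York" is stored once per ZIP code, and an administrative rename touches a single row.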
The Boyce-Codd Normal Form requires that every determinant be a candidate key, which prevents such cases. To satisfy BCNF, we can decompose the table into two: one for City and ZIP code information, and another for ID and ZIP code.

Fourth Normal Form (4NF)

This one's about not mixing two unrelated types of information in one place. Let's say you're tracking the languages a student speaks and also their hobbies. These two things have nothing to do with each other, but if you list them side by side, you end up with a bunch of weird combinations. To fix this, just keep them in separate lists: one list for languages, one list for hobbies. That's 4NF.

Fifth Normal Form (5NF)

This is used in more complex situations, like when a product, a supplier, and a region all affect each other in different ways. Trying to cram all of that into one list can get messy. So in 5NF, you break things down even further, into simple relationships between just two items at a time. Then you can piece them back together when you need the whole picture. It's like solving a puzzle by sorting the pieces first.

Database normalization use case

Now that you are familiar with the database normalization process, let's see how to apply it in a real-world case. Suppose we have a table that contains store order details: order ID, customer information, order details, delivery information, payment methods, order notes, delivery date and time, order date, and discount details.
| Order ID | Customer Info | Order Details | Delivery Info | Payment Methods | Order Notes | Order Date | Delivery DateTime | Discounts |
|---|---|---|---|---|---|---|---|---|
| 1 | John Doe; 555-1234, 555-5678; 123 Maple St, New York, 456 Elm St, Chicago; 1985-07-04; johndoe@email.net | "Tea, 1.50, 2, Beverage, 10%; Coffee, 3.00, 1, Beverage" | Pickup; 123 Maple St, New York; 2023-01-05 14:00 | Credit Card, Cash | "Extra napkins" | 2023-01-01 | 2023-01-05 14:00 | 5% on total |
| 2 | Alice Johnson; 555-8765; 456 Oak Ave, San Francisco; 1990-11-12; alicej@email.net | "Milk, 2.00, 1, Dairy" | Home Delivery; 456 Oak Ave, San Francisco; 2023-01-06 09:00 | Debit Card | "Leave at front door" | 2023-01-02 | 2023-01-06 09:00 | 2% on total |
| 3 | Bob Brown; 555-4321; 789 Pine Rd, Miami; 1978-03-22; bobb@email.net | "Biscuit, 0.99, 5, Snack, 5%; Tea, 1.50, 3, Beverage; Juice, 2.10, 2, Beverage" | Home Delivery; 789 Pine Rd, Miami; 2023-01-07 18:30 | Cash, Voucher | "Call on arrival" | 2023-01-03 | 2023-01-07 18:30 | No discount |

As you can see, there are different dependencies involved, which lead to redundancy and a lack of data integrity. Let's use database normalization to design a database that is easy to scale and maintain and organizes all the data in the most efficient way.

First Normal Form (1NF)

To bring the table to 1NF, we need to remove all repeating groups and ensure that each field contains only atomic (indivisible) values. We must also ensure that each record has a unique identifier (which we have as the Order ID).

Breaking down the fields:
- Customer Info is split into: Customer Name, Primary Phone, Optional Mobile Phone, Addresses with respective cities, DOB, Email
- Order Details are split into: Item Name, Price, Quantity, Category, Individual Discount
- Delivery Info is split into: Delivery Method, Delivery Address, City, Delivery DateTime
- Payment Methods is split into individual payment methods.
1NF tables

Orders table

| Order ID | Customer ID | Order Date | Delivery Method | Delivery Address | City | Delivery DateTime | Payment Method 1 | Payment Method 2 | Order Notes | Discounts |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 2023-01-01 | Pickup | 123 Maple St | New York | 2023-01-05 14:00 | Credit Card | Cash | Extra napkins | 5% on total |
| 2 | 2 | 2023-01-02 | Home Delivery | 456 Oak Ave | San Francisco | 2023-01-06 09:00 | Debit Card | | Leave at front door | 2% on total |
| 3 | 3 | 2023-01-03 | Home Delivery | 789 Pine Rd | Miami | 2023-01-07 18:30 | Cash | Voucher | Call on arrival | No discount |

Customers table

| Customer ID | Name | Primary Phone | Mobile Phone | Address | City | DOB | Email |
|---|---|---|---|---|---|---|---|
| 1 | John Doe | 555-1234 | 555-5678 | 123 Maple St, 456 Elm St | New York, Chicago | 1985-07-04 | johndoe@email.net |
| 2 | Alice Johnson | 555-8765 | | 456 Oak Ave | San Francisco | 1990-11-12 | alicej@email.net |
| 3 | Bob Brown | 555-4321 | | 789 Pine Rd | Miami | 1978-03-22 | bobb@email.net |

Order Items table

| Order ID | Item ID | Item Name | Category | Price | Quantity | Discount |
|---|---|---|---|---|---|---|
| 1 | 1 | Tea | Beverage | 1.50 | 2 | 10% |
| 1 | 2 | Coffee | Beverage | 3.00 | 1 | |
| 2 | 1 | Milk | Dairy | 2.00 | 1 | |
| 3 | 1 | Biscuit | Snack | 0.99 | 5 | 5% |
| 3 | 2 | Tea | Beverage | 1.50 | 3 | |
| 3 | 3 | Juice | Beverage | 2.10 | 2 | |

Second Normal Form (2NF)

To bring the tables to the Second Normal Form (2NF), we must remove any partial dependencies, where an attribute depends on only part of a composite primary key. 2NF also requires that each table be in 1NF, which we've already achieved.

Review of 1NF tables

Let's ensure there are no partial dependencies on non-primary key attributes in the data.
Orders table (Order ID, Customer ID, Order Date, Delivery Method, Delivery Address, City, Delivery DateTime, Payment Method 1, Payment Method 2, Order Notes, Discounts):
- Order ID is the primary key
- All attributes depend on the Order ID, not just part of it

Customers table (Customer ID, Name, Primary Phone, Mobile Phone, Address, City, DOB, Email):
- Customer ID is the primary key
- All attributes depend on the Customer ID, not just part of it

Order Items table (Order ID, Item ID, Item Name, Category, Price, Quantity, Discount):
- Order ID and Item ID together form the composite primary key
- Attributes such as Item Name, Category, Price, Quantity, and Discount should ideally depend on both Order ID and Item ID, but attributes like Item Name, Category, and Price currently depend only on the Item ID. This needs correction for 2NF

Changes required for 2NF

The Order Items table shows a partial dependency: Item Name, Category, and Price depend only on Item ID and not on the combination of Order ID and Item ID. We need to move these to a separate table to eliminate partial dependencies.

Revised tables for 2NF

Items table (New)

| Item ID | Item Name | Category | Price |
|---|---|---|---|
| 1 | Tea | Beverage | 1.50 |
| 2 | Coffee | Beverage | 3.00 |
| 3 | Milk | Dairy | 2.00 |
| 4 | Biscuit | Snack | 0.99 |
| 5 | Juice | Beverage | 2.10 |

Order Items table (Updated)

| Order ID | Item ID | Quantity | Discount |
|---|---|---|---|
| 1 | 1 | 2 | 10% |
| 1 | 2 | 1 | |
| 2 | 3 | 1 | |
| 3 | 4 | 5 | 5% |
| 3 | 1 | 3 | |
| 3 | 5 | 2 | |

Third Normal Form (3NF)

To bring the tables into the Third Normal Form (3NF), we need to ensure that they are already in Second Normal Form (2NF) and additionally remove any transitive dependencies. A transitive dependency occurs when a non-key attribute depends on another non-key attribute.

Review of current tables in 2NF

Let's examine each table to identify and remove transitive dependencies.
Orders table (Order ID, Customer ID, Order Date, Delivery Method, Delivery Address, City, Delivery DateTime, Payment Method 1, Payment Method 2, Order Notes, Discounts):
- Customer ID could potentially create a transitive dependency, where attributes like Delivery Address and City might depend on Customer ID instead of Order ID.

Customers table (Customer ID, Name, Primary Phone, Mobile Phone, Address, City, DOB, Email):
- All attributes depend directly on Customer ID. There doesn't appear to be a transitive dependency within this table.

Items table (Item ID, Item Name, Category, Price):
- No apparent transitive dependencies, as each attribute depends only on Item ID.

Order Items table (Order ID, Item ID, Quantity, Discount):
- No transitive dependencies; all attributes depend directly on the composite key (Order ID and Item ID).

Changes required for 3NF

The potential issue in the Orders table, with Delivery Address and City depending on Customer ID rather than Order ID, suggests that these attributes fit better in a separate structure linked to customers rather than orders.

Revised tables for 3NF

Delivery Info table (New)

| Customer ID | Delivery Address | City |
|---|---|---|
| 1 | 123 Maple St | New York |
| 1 | 456 Elm St | Chicago |
| 2 | 456 Oak Ave | San Francisco |
| 3 | 789 Pine Rd | Miami |

Orders table (Updated)

| Order ID | Customer ID | Order Date | Delivery Method | Delivery DateTime | Payment Method 1 | Payment Method 2 | Order Notes | Discounts |
|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 2023-01-01 | Pickup | 2023-01-05 14:00 | Credit Card | Cash | Extra napkins | 5% on total |
| 2 | 2 | 2023-01-02 | Home Delivery | 2023-01-06 09:00 | Debit Card | | Leave at front door | 2% on total |
| 3 | 3 | 2023-01-03 | Home Delivery | 2023-01-07 18:30 | Cash | Voucher | Call on arrival | No discount |

Boyce-Codd Normal Form (BCNF)

To achieve the Boyce-Codd Normal Form (BCNF), we must ensure that the database is already in Third Normal Form (3NF) and that, additionally, every determinant (an attribute or set of attributes on which some other attribute fully functionally depends) is a candidate key.
BCNF is particularly focused on resolving anomalies caused by functional dependencies where the determinant is not a candidate key.

Review of current tables for BCNF

Let's examine each table to check for functional dependencies and ensure that each determinant is a candidate key.

Orders table (Order ID, Customer ID, Order Date, Delivery Method, Delivery DateTime, Payment Method 1, Payment Method 2, Order Notes, Discounts):
- Order ID is the primary key. All attributes are functionally dependent on Order ID. There are no dependencies where a non-candidate key is a determinant.

Customers table (Customer ID, Name, Primary Phone, Mobile Phone, DOB, Email):
- Customer ID is the primary key. All attributes depend directly on Customer ID. No attribute determines another outside of its dependency on the primary key.

Delivery Info table (Customer ID, Delivery Address, City):
- Customer ID could be considered a primary key if we assume each customer has a unique delivery address. However, if customers can have multiple delivery addresses, then Customer ID combined with Delivery Address should form a composite key. To resolve potential issues, we must ensure that each record in this table can be uniquely identified by the combination of Customer ID and Delivery Address.

Items table (Item ID, Item Name, Category, Price):
- Item ID is the primary key. All attributes are functionally dependent on Item ID.

Order Items table (Order ID, Item ID, Quantity, Discount):
- Order ID and Item ID form a composite key. Quantity and Discount depend on this composite key.

Changes for BCNF

Delivery Info table (revised for BCNF): assuming that customers can have multiple delivery addresses, we should adjust the primary key and treat Customer ID and Delivery Address as a composite key.
Fourth Normal Form (4NF)

Moving to the Fourth Normal Form (4NF) involves ensuring that the database is already in Boyce-Codd Normal Form (BCNF) and additionally eliminating any multi-valued dependencies that are not functional dependencies. A multi-valued dependency occurs when one attribute in a table can take on multiple independent values from another attribute, independently of any other attribute.

Review of current tables for 4NF

To ensure the tables are in 4NF, we need to confirm that there are no multi-valued dependencies unless they are on a superkey.

Orders table (Order ID, Customer ID, Order Date, Delivery Method, Delivery DateTime, Payment Method 1, Payment Method 2, Order Notes, Discounts):
- Order ID is the primary key. All attributes depend on the primary key. A potential issue is Payment Method 1 and Payment Method 2 being split into separate attributes rather than treated as a single multi-valued attribute.

Customers table (Customer ID, Name, Primary Phone, Mobile Phone, DOB, Email):
- Customer ID is the primary key. No multi-valued dependencies, as all attributes directly depend on the primary key.

Delivery Info table (Customer ID, Delivery Address, City):
- Composite key: Customer ID and Delivery Address. No multi-valued dependencies, as City depends on Delivery Address, and all attributes are functionally dependent on the composite key.

Items table (Item ID, Item Name, Category, Price):
- Item ID is the primary key. No multi-valued dependencies; each attribute directly depends on the primary key.

Order Items table (Order ID, Item ID, Quantity, Discount):
- Composite key: Order ID and Item ID. No multi-valued dependencies, as Quantity and Discount depend on the composite key.

Changes for 4NF

Orders table (revised for 4NF): to address the issue with payment methods, we should normalize the payment methods into a separate table to eliminate any multi-valued dependency concerns.
Payment Methods table (New)

| Order ID | Payment Method |
|---|---|
| 1 | Credit Card |
| 1 | Cash |
| 2 | Debit Card |
| 3 | Cash |
| 3 | Voucher |

The final listing of all normalized tables, including their data:

1. Orders table. This table contains order-level details.

| Order ID | Customer ID | Order Date | Delivery Method | Delivery DateTime | Order Notes | Discounts |
|---|---|---|---|---|---|---|
| 1 | 1 | 2023-01-01 | Pickup | 2023-01-05 14:00 | Extra napkins | 5% on total |
| 2 | 2 | 2023-01-02 | Home Delivery | 2023-01-06 09:00 | Leave at front door | 2% on total |
| 3 | 3 | 2023-01-03 | Home Delivery | 2023-01-07 18:30 | Call on arrival | No discount |

2. Customers table. This table holds customer-specific information.

| Customer ID | Name | Primary Phone | Mobile Phone | DOB | Email |
|---|---|---|---|---|---|
| 1 | John Doe | 555-1234 | 555-5678 | 1985-07-04 | johndoe@email.net |
| 2 | Alice Johnson | 555-8765 | | 1990-11-12 | alicej@email.net |
| 3 | Bob Brown | 555-4321 | | 1978-03-22 | bobb@email.net |

3. Delivery Info table. This table contains multiple delivery addresses and their corresponding cities for each customer.

| Customer ID | Delivery Address | City |
|---|---|---|
| 1 | 123 Maple St | New York |
| 1 | 456 Elm St | Chicago |
| 2 | 456 Oak Ave | San Francisco |
| 3 | 789 Pine Rd | Miami |

4. Items table. This table stores item-specific details.

| Item ID | Item Name | Category | Price |
|---|---|---|---|
| 1 | Tea | Beverage | 1.50 |
| 2 | Coffee | Beverage | 3.00 |
| 3 | Milk | Dairy | 2.00 |
| 4 | Biscuit | Snack | 0.99 |
| 5 | Juice | Beverage | 2.10 |

5. Order Items table. This table records the relationship between orders and items, including quantities and discounts.

| Order ID | Item ID | Quantity | Discount |
|---|---|---|---|
| 1 | 1 | 2 | 10% |
| 1 | 2 | 1 | |
| 2 | 3 | 1 | |
| 3 | 4 | 5 | 5% |
| 3 | 1 | 3 | |
| 3 | 5 | 2 | |

6. Payment Methods table. This table manages payment methods for each order.

| Order ID | Payment Method |
|---|---|
| 1 | Credit Card |
| 1 | Cash |
| 2 | Debit Card |
| 3 | Cash |
| 3 | Voucher |

List of the keys

| Table | Key Type | Key Name | Referenced Table | Purpose |
|---|---|---|---|---|
| Customers | Primary Key | CustomerID | – | Uniquely identifies each customer. |
| DeliveryInfo | Primary Key | (CustomerID, DeliveryAddress) | – | Ensures unique delivery addresses for each customer. |
| DeliveryInfo | Foreign Key | CustomerID → Customers.CustomerID | Customers | Links delivery addresses to the corresponding customer. |
| Items | Primary Key | ItemID | – | Uniquely identifies each item. |
| Orders | Primary Key | OrderID | – | Uniquely identifies each order. |
| Orders | Foreign Key | CustomerID → Customers.CustomerID | Customers | Links orders to the customer who placed them. |
| OrderItems | Primary Key | (OrderID, ItemID) | – | Ensures no duplicate item records for the same order. |
| OrderItems | Foreign Key | OrderID → Orders.OrderID | Orders | Links items to their corresponding order. |
| OrderItems | Foreign Key | ItemID → Items.ItemID | Items | Links items in orders to their details in the Items table. |
| PaymentMethods | Primary Key | (OrderID, PaymentMethod) | – | Ensures unique payment methods per order. |
| PaymentMethods | Foreign Key | OrderID → Orders.OrderID | Orders | Links payment methods to their corresponding order. |

List of the relationships

| Relationship | From Table | To Table | Key | Purpose |
|---|---|---|---|---|
| Customer and Orders | Orders | Customers | CustomerID → Customers.CustomerID | Links orders to the customer who placed them. |
| Customer and DeliveryInfo | DeliveryInfo | Customers | CustomerID → Customers.CustomerID | Links delivery addresses to customers. |
| Order and OrderItems | OrderItems | Orders | OrderID → Orders.OrderID | Links ordered items to their parent orders. |
| Item and OrderItems | OrderItems | Items | ItemID → Items.ItemID | Links items purchased to their details. |
| Order and PaymentMethods | PaymentMethods | Orders | OrderID → Orders.OrderID | Links payment methods to their parent orders. |

Original table functional dependencies

| Functional Dependency | Explanation |
|---|---|
| OrderID → Customer Info, Order Date, Delivery Info, Payment Methods, Order Notes, Discounts | All order-level attributes depend on OrderID. |
| Customer Info → Name, Primary Phone, Mobile Phone, DOB, Email | Customer-specific attributes depend only on the customer's identity (Customer Info). |
| OrderID, Item Name → Price, Quantity, Discount, Category | Item-specific attributes like Price and Category depend on Item Name rather than OrderID. |
| OrderID →→ Payment Methods | A multi-valued dependency exists because orders can have multiple payment methods. |
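As a sanity check that the decomposition is lossless, the original wide order row can be reassembled from the normalized tables with joins. This is a sketch that assumes the table and column names used throughout this example:

```sql
-- One row per order line, with customer, item, and payment details joined back in
SELECT o.OrderID,
       c.Name AS Customer,
       i.ItemName,
       i.Price,
       oi.Quantity,
       oi.Discount,
       o.DeliveryMethod,
       o.DeliveryDateTime,
       pm.PaymentMethod
FROM Orders o
JOIN Customers c ON c.CustomerID = o.CustomerID
JOIN OrderItems oi ON oi.OrderID = o.OrderID
JOIN Items i ON i.ItemID = oi.ItemID
JOIN PaymentMethods pm ON pm.OrderID = o.OrderID
ORDER BY o.OrderID, i.ItemID;
```

Note that an order with two payment methods produces one row per method here, which is exactly the multi-valued dependency that 4NF isolated into its own table.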
Normalization stages and dependency removal

| Normalization Stage | Dependency Issue | Solution |
|---|---|---|
| 1NF | Repeating groups in Customer Info, Order Details, Payment Methods. | Split concatenated fields into atomic values; create separate rows for multiple items and payments. |
| 2NF | Partial dependency: Item Name, Price, Category depend only on ItemID. | Move item details to a separate Items table. Update OrderItems to store only quantities and discounts. |
| 3NF | Transitive dependency: Delivery Address and City depend on CustomerID. | Create a DeliveryInfo table to store delivery addresses and cities for each customer. |
| BCNF | Determinant issue: OrderID →→ Payment Methods, where Payment Method is not a candidate key. | Create a PaymentMethods table with a composite key (OrderID, PaymentMethod). |
| 4NF | Multi-valued dependency: multiple payment methods for each OrderID. | Ensure the PaymentMethods table resolves the multi-valued dependency by storing one method per row. |

ER diagram

SQL script

```sql
-- Customers table
CREATE TABLE Customers (
    CustomerID INT PRIMARY KEY,
    Name VARCHAR(50) NOT NULL,
    PrimaryPhone VARCHAR(15) NOT NULL,
    MobilePhone VARCHAR(15),
    DOB DATE NOT NULL,
    Email VARCHAR(50) UNIQUE NOT NULL
);

-- Delivery Info table
CREATE TABLE DeliveryInfo (
    CustomerID INT,
    DeliveryAddress VARCHAR(100) NOT NULL,
    City VARCHAR(50) NOT NULL,
    PRIMARY KEY (CustomerID, DeliveryAddress),
    CONSTRAINT FK_DeliveryInfo_Customers FOREIGN KEY (CustomerID)
        REFERENCES Customers(CustomerID)
);

-- Items table
CREATE TABLE Items (
    ItemID INT PRIMARY KEY,
    ItemName VARCHAR(50) NOT NULL,
    Category VARCHAR(30) NOT NULL,
    Price DECIMAL(5,2) NOT NULL
);

-- Orders table
CREATE TABLE Orders (
    OrderID INT PRIMARY KEY,
    CustomerID INT NOT NULL,
    OrderDate DATE NOT NULL,
    DeliveryMethod VARCHAR(30) NOT NULL,
    DeliveryDateTime DATETIME NOT NULL,
    OrderNotes VARCHAR(255),
    Discounts VARCHAR(30),
    CONSTRAINT FK_Orders_Customers FOREIGN KEY (CustomerID)
        REFERENCES Customers(CustomerID)
);

-- Order Items table
CREATE TABLE OrderItems (
    OrderID INT,
    ItemID INT,
    Quantity INT NOT NULL,
    Discount VARCHAR(10),
    PRIMARY KEY (OrderID, ItemID),
    CONSTRAINT FK_OrderItems_Orders FOREIGN KEY (OrderID)
        REFERENCES Orders(OrderID),
    CONSTRAINT FK_OrderItems_Items FOREIGN KEY (ItemID)
        REFERENCES Items(ItemID)
);

-- Payment Methods table
CREATE TABLE PaymentMethods (
    OrderID INT,
    PaymentMethod VARCHAR(30) NOT NULL,
    PRIMARY KEY (OrderID, PaymentMethod),
    CONSTRAINT FK_PaymentMethods_Orders FOREIGN KEY (OrderID)
        REFERENCES Orders(OrderID)
);

-- Insert Data into Customers table
INSERT INTO Customers (CustomerID, Name, PrimaryPhone, MobilePhone, DOB, Email) VALUES
(1, 'John Doe', '555-1234', '555-5678', '1985-07-04', 'johndoe@email.net'),
(2, 'Alice Johnson', '555-8765', NULL, '1990-11-12', 'alicej@email.net'),
(3, 'Bob Brown', '555-4321', NULL, '1978-03-22', 'bobb@email.net');

-- Insert Data into DeliveryInfo table
INSERT INTO DeliveryInfo (CustomerID, DeliveryAddress, City) VALUES
(1, '123 Maple St', 'New York'),
(1, '456 Elm St', 'Chicago'),
(2, '456 Oak Ave', 'San Francisco'),
(3, '789 Pine Rd', 'Miami');

-- Insert Data into Items table
INSERT INTO Items (ItemID, ItemName, Category, Price) VALUES
(1, 'Tea', 'Beverage', 1.50),
(2, 'Coffee', 'Beverage', 3.00),
(3, 'Milk', 'Dairy', 2.00),
(4, 'Biscuit', 'Snack', 0.99),
(5, 'Juice', 'Beverage', 2.10);

-- Insert Data into Orders table
INSERT INTO Orders (OrderID, CustomerID, OrderDate, DeliveryMethod, DeliveryDateTime, OrderNotes, Discounts) VALUES
(1, 1, '2023-01-01', 'Pickup', '2023-01-05 14:00', 'Extra napkins', '5% on total'),
(2, 2, '2023-01-02', 'Home Delivery', '2023-01-06 09:00', 'Leave at front door', '2% on total'),
(3, 3, '2023-01-03', 'Home Delivery', '2023-01-07 18:30', 'Call on arrival', 'No discount');

-- Insert Data into OrderItems table
INSERT INTO OrderItems (OrderID, ItemID, Quantity, Discount) VALUES
(1, 1, 2, '10%'),
(1, 2, 1, NULL),
(2, 3, 1, NULL),
(3, 4, 5, '5%'),
(3, 1, 3, NULL),
(3, 5, 2, NULL);

-- Insert Data into PaymentMethods table
INSERT INTO PaymentMethods (OrderID, PaymentMethod) VALUES
(1, 'Credit Card'),
(1, 'Cash'),
(2, 'Debit Card'),
(3, 'Cash'),
(3, 'Voucher');
```

The final schema consists of separate tables for orders, customers, items, order items, delivery information, and payment methods. This design removes insertion anomalies (new data can be added without affecting unrelated data), update anomalies (data is updated in one place), and deletion anomalies (removing a record does not result in loss of critical data). Primary keys ensure each record is uniquely identifiable, and foreign keys enforce relationships between tables. Data is now stored efficiently, with no duplication, making the database easier to maintain and scale. Queries are simplified, and performance improves because the data is organized logically into smaller, related tables.

Database normalization vs. denormalization

How does normalization compare to denormalization? Let's say you're organizing your stuff. You can either keep everything perfectly sorted in labeled boxes (normalization), or you can group things more loosely so they're quicker to grab when you need them (denormalization). That's basically the difference between normalization and denormalization in a database. So, which one should you choose? It depends on your goal. If you need your data to be clean, accurate, and easy to update, go with normalization. If your app is slow and users are mostly reading the data (not updating it), denormalization might help. Some systems even use a mix of both, normalizing some parts and denormalizing others where performance really matters.

Benefits of denormalization

Denormalization means adding a bit of "organized mess" to your database to make things faster. Sounds weird, right?
But sometimes, it actually makes your system work better, especially when speed matters more than perfect organization. Where does denormalization make sense?

1. You read data way more than you write it

Imagine you have a website or app where people are constantly looking up stuff (like product pages, user profiles, or dashboards) but not changing things often. If your system is mostly read-heavy (meaning lots of people are viewing data, but not editing it), denormalization can make those views load faster, because the data is already bundled together in one place, ready to go.

2. You're running reports and summaries all the time

In business, people love charts, graphs, and dashboards. These tools often run on OLAP (Online Analytical Processing) systems, which are built for analysis, not updates. If your database feeds dashboards with sales totals, customer stats, or daily performance metrics, denormalization helps by storing pre-calculated data. That way, the system doesn't have to redo all the math whenever someone opens the dashboard.

3. You want faster performance, even if it means some repetition

Let's say you have a product and its category listed in two separate places. In a normalized world, the database has to go look up the category each time it shows a product. But in a denormalized setup, you just copy the category name right next to the product. Yes, it's repeated, but the app loads quicker because the system doesn't have to jump between tables.

Normalization in SQL (including SQL Server)

So you've got a bunch of data and you're using SQL or SQL Server to manage it. Great! But now you're hearing things like "normalize your tables" and wondering what that even means. This is normalization in a DBMS: the process of structuring a database to reduce redundancy and improve data integrity.
You can do this by managing information in related tables, such as separating customer details into a Customers table and order information into an Orders table. Normalization in SQL is all about keeping your data clean, organized, and free from unnecessary duplicates. Both SQL and SQL Server are designed to help you apply the normal forms using tables, relationships, and constraints: rules that make sure your data behaves the way it should. Let's explain normalization in SQL with examples.

How do SQL and SQL Server help with normalization?

Table design: You create different tables for different types of information. For example, one table for customers, another for orders, and another for products. Each table has its own focus, with no mixing and matching.

Relationships: Instead of repeating the same info over and over, you connect your tables. For example, each order "points" to a customer instead of storing all the customer's info again.

Constraints: You can tell SQL to enforce rules, like making sure an order always connects to a real customer. These rules are called constraints, and they help keep everything in line.

SQL query examples for normalization

Let's say we're building a small store database and walk through the normal forms in clear steps, with real SQL examples.

First Normal Form (1NF)

Rule: No repeating groups; just one value per field.

Bad idea:

```sql
-- Not normalized
CREATE TABLE Customers (
    CustomerID INT,
    Name VARCHAR(100),
    PhoneNumbers VARCHAR(255) -- Stores multiple numbers in one field
);
```

Better (1NF):

```sql
-- One phone number per row
CREATE TABLE Customers (
    CustomerID INT PRIMARY KEY,
    Name VARCHAR(100)
);
CREATE TABLE CustomerPhones (
    CustomerID INT,
    PhoneNumber VARCHAR(20),
    FOREIGN KEY (CustomerID) REFERENCES Customers(CustomerID)
);
```

Now, each phone number is separate, and the data is cleaner.
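A quick usage sketch of this 1NF layout, with hypothetical sample data: each phone is its own row, so adding a third number is just another INSERT, not a string edit.

```sql
INSERT INTO Customers (CustomerID, Name) VALUES (1, 'John Doe');

INSERT INTO CustomerPhones (CustomerID, PhoneNumber) VALUES
(1, '555-1234'),
(1, '555-5678');

-- List every customer alongside each of their numbers
SELECT c.Name, p.PhoneNumber
FROM Customers c
JOIN CustomerPhones p ON p.CustomerID = c.CustomerID;
```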
Second Normal Form (2NF)

Rule: No partial dependency on part of the key.

Imagine a table that lists courses students take, along with their name:

```sql
-- Not 2NF: student name depends only on StudentID, not the full combo
CREATE TABLE Enrollments (
    StudentID INT,
    CourseID INT,
    StudentName VARCHAR(100)
);
```

Better (2NF):

```sql
CREATE TABLE Students (
    StudentID INT PRIMARY KEY,
    StudentName VARCHAR(100)
);
CREATE TABLE Enrollments (
    StudentID INT,
    CourseID INT,
    FOREIGN KEY (StudentID) REFERENCES Students(StudentID)
);
```

Now, the student's name lives in just one place.

Third Normal Form (3NF)

Rule: Get rid of columns that don't depend directly on the primary key.

Imagine a table where we store a customer's ZIP code and city:

```sql
-- Not 3NF: city depends on ZIP code, not customer
CREATE TABLE Customers (
    CustomerID INT PRIMARY KEY,
    Name VARCHAR(100),
    ZIPCode VARCHAR(10),
    City VARCHAR(100)
);
```

Better (3NF):

```sql
CREATE TABLE ZIPCodes (
    ZIPCode VARCHAR(10) PRIMARY KEY,
    City VARCHAR(100)
);

CREATE TABLE Customers (
    CustomerID INT PRIMARY KEY,
    Name VARCHAR(100),
    ZIPCode VARCHAR(10),
    FOREIGN KEY (ZIPCode) REFERENCES ZIPCodes(ZIPCode)
);
```

Now, if a city changes (rare, but still), you update it in one place only. You can apply these same normalization principles in SQL Server too. SQL Server lets you do more, like designing diagrams, setting up indexes for faster searching, and writing stored procedures. But at the core, normalization in SQL Server starts with a smart, normalized table structure.

Common mistakes in database normalization

Normalization helps keep your database tidy and easy to manage, but it's also easy to mess up if you're not careful:

1. Skipping normalization altogether

Some folks just throw all their data into one big table: names, orders, addresses, product details, everything jumbled together. Sure, it might work for a small project, but it quickly becomes a mess.
You end up repeating the same data over and over, and it’s harder to make updates without causing mistakes.

2. Not going far enough

Sometimes, people apply only the first normal form (1NF) and stop there, thinking they’re done. But there could still be duplicate info or odd dependencies hiding in the table. Fix it by taking the time to understand and apply 2NF and 3NF.

3. Over-normalizing

Breaking your data into too many tiny tables can make things harder, not better. You’ll spend more time writing complex queries just to get a simple result. Sometimes it’s okay to repeat small bits of data if it makes your app faster and easier to use.

4. Ignoring performance

Normalization in SQL is great for keeping data clean, but it can slow things down if not done carefully. If your database has to jump across too many tables to find something, performance can take a hit. Tip: Normalize your data for clarity, but test for speed. Denormalize some parts if you need faster results (especially in read-heavy apps like dashboards).

Best practices for database normalization

Normalization in SQL helps you organize your database so it’s clean, clear, and easy to manage. But like organizing anything (your room, your tools, your kitchen), there’s a smart way to do it, and there’s a way that can make life harder than it needs to be.

1. Understand what you’re storing (and why)

Before you even start creating tables, think about what kind of data you have and how it’s used. Ask yourself: What’s the core info I need to keep? Which pieces of data repeat? How are things connected?

2. Start with the first three normal forms

Don’t worry about advanced forms like 4NF or 5NF right away. Just focus on the basics:

1NF: Keep one value per cell (no lists or combined info)
2NF: Make sure everything in a table depends on the full primary key
3NF: Remove fields that don’t directly depend on the main thing in the table

3.
Use foreign keys to keep data connected

When you split data into multiple tables, make sure they stay linked using foreign keys. For example, if you separate customers and orders, each order should include the customer’s ID to stay connected.

4. Don’t over-normalize

Yes, it’s possible to go too far. If your database ends up with dozens of tiny tables and every simple query needs five joins… that’s a red flag. Ask yourself: Am I breaking this up because it’s helpful, or just following rules too strictly? Is this extra table actually useful? Sometimes, a little redundancy is okay if it keeps things simpler or faster.

5. Keep performance in mind

SQL normalization keeps data tidy, but if users have to wait too long to see something, it’s a problem. Avoid too many joins in critical queries. Consider denormalizing parts of your database if speed becomes an issue (e.g., storing total prices or repeated names to save time). You can always clean it up later if it gets messy; performance should feel smooth first.

How dbForge Edge can help with database normalization

Trying to keep your database clean and well-organized? [dbForge Edge](https://www.devart.com/dbforge/edge/) can be a big help, especially if you’re new to SQL normalization or just want to save time and avoid mistakes. It’s an all-in-one tool built for developers and database admins (DBAs) who work with SQL databases. dbForge Edge makes normalization in a database simple:

Build visually: Use drag-and-drop tools to create tables and links; no need to write SQL from scratch to [build database design](https://www.devart.com/dbforge/edge/database-design-and-development.html).
Clean up data: Spot and fix duplicate or messy data by organizing it into proper tables.
Write smart SQL: Easily write and test queries to control how your data works together.
See the big picture: View your whole database as a diagram to find and fix issues fast.

Want to see how it works?
You can [download a free trial](https://www.devart.com/dbforge/edge/) and try it out with your own project. It’s a great way to learn, explore, and make your database life a lot easier.

Conclusion

If you’ve made it this far, nice job! By now, you should have a solid idea of what normalization in a database is all about and why it matters. We talked about how normalization keeps your data clean and consistent and improves the overall performance and structure of your database. And let’s not forget how much easier this whole process becomes when you have the right tool. dbForge Edge takes the stress out of normalization by letting you:

Visualize your database design
Create and manage tables and relationships
Write and test SQL queries in one place
Spot problems before they become real headaches

If you’re building or fixing a database, don’t do it all by hand. Try [dbForge Edge](https://www.devart.com/dbforge/edge/) and see how smooth your workflow becomes. Download the free trial and give it a go!

FAQ

How does normalization in SQL Server differ from other database systems?

Normalization principles are universal and apply across all relational databases, including SQL Server, MySQL, PostgreSQL, and Oracle.

What are the benefits of applying SQL data normalization to a large database?

Normalizing a database leads to reduced data redundancy, improved data integrity, more efficient updates and deletes, optimized storage, and better scalability.

What are the common errors to avoid when implementing database normalization in SQL?

The most common errors come from over-normalization, as splitting tables too much can lead to complex joins and poor performance. Another mistake is ignoring business logic and using poor key design with unclear primary and foreign key relationships, which leads to orphaned records or update issues.

Can dbForge Edge assist in visualizing SQL normalization for database design?
Yes, dbForge Edge includes tools that make normalization and schema design much easier, letting you visualize table relationships and dependencies to support normalization decisions, identify normalization issues across environments using schema and data comparison, spot redundancy or transitive dependencies in datasets, and refactor the database.

How can I integrate dbForge Edge with my existing SQL database to streamline the normalization process?

dbForge Edge provides multiple tools to assist you with database normalization. To start using it on an existing SQL Server database, connect to the database from dbForge Studio for SQL Server, which is included in dbForge Edge. To do so, click New Connection on the Database menu and wait for the Database Connection Properties dialog box to open. On the General tab, specify the connection details: Server and Authentication mode. Next, provide credentials for the database you plan to work with: user login, password, and database name. If you want to give the connection a custom name, select the Environment category and specify a connection name. When all the connection details are provided, you can either verify them by clicking Test Connection or click Connect to set the connection up and start using dbForge Edge for database normalization.

How does normalization in a database help avoid redundancy?

Normalization organizes data into related tables and eliminates duplicate information, so that each value is stored only once. Relationships are defined through foreign keys, which avoids the need to repeat data. As a result, this approach reduces inconsistencies and helps avoid redundancy.
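To make the last FAQ answer concrete, here is a minimal sketch (the table and column names are hypothetical, reusing the customers/orders example from earlier in this article) of how a foreign key keeps tables connected without repeating the customer's details in every order:

```sql
CREATE TABLE Customers (
    CustomerID INT PRIMARY KEY,
    Name VARCHAR(100)
);

CREATE TABLE Orders (
    OrderID INT PRIMARY KEY,
    CustomerID INT NOT NULL,
    OrderDate DATE,
    -- Each order stores only the customer's ID, never the full details
    FOREIGN KEY (CustomerID) REFERENCES Customers(CustomerID)
);

-- This insert succeeds only if customer 1 exists in Customers;
-- otherwise the constraint rejects the row, so orphaned orders cannot appear
INSERT INTO Orders (OrderID, CustomerID, OrderDate)
VALUES (100, 1, '2024-01-15');
```

If the customer's name ever changes, it is updated in one row of Customers, and every order automatically reflects the change through the relationship.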
By [Victoria Shyrokova](https://blog.devart.com/author/victorias)
How to Refresh and Update Metadata for a Database in SQL Server

By [dbForge Team](https://blog.devart.com/author/dbforge), December 12, 2023

The term “database refresh” typically means the process of resetting or renewing a database to a required state. Let’s review some cases where you might need to refresh a database:

Data update: Sometimes, it’s necessary just to update data in a database to maintain consistent information.
Data rollback: If there are errors, you may need to revert a database to a previous state. This helps identify bottlenecks faster and facilitates their resolution.
Testing: In test and development environments, a database is usually reset to a particular state before running tests or developing new features. This approach guarantees that each test stage starts with a stable and well-defined database state.
Disaster recovery: Critical situations may require an emergency response, including database restoration from backups.
Performance optimization: A database refresh can be part of the process of improving database performance.

Contents

The issue with outdated metadata
Refresh metadata for other objects
Common methods for the database refresh
Use manual restore for the database refresh
Perform the database and metadata refresh
Troubleshooting and verification
Conclusion

The issue with outdated metadata

Metadata for non-schema-bound views can become outdated for several reasons. Generally, it happens due to changes in a database.
Here is an explanation of the possible scenarios:

If the columns of tables and other views referenced by a non-schema-bound view are renamed, dropped, added, and so on, the metadata of the view becomes irrelevant.
If you alter the tables or views referenced by the non-schema-bound view, the view’s metadata becomes outdated.
Even if the structure of the underlying objects remains the same, changes to the data within those objects can lead to outdated metadata.
If the non-schema-bound view depends on stored procedures, functions, or other database objects that are modified or removed, the view’s metadata becomes outdated.
Alterations to security settings or permissions on underlying objects can also impact non-schema-bound views. If the view relies on access to objects that it no longer has permission to access, it becomes outdated.
If objects referenced by the view are renamed, the view’s metadata may become outdated because it still references the old object names.

As you can see, many factors can affect non-schema-bound views, but sp_refreshview will help you forget about this issue. It updates the metadata of a view to reflect changes in the underlying tables or columns referenced by the view. A query with sp_refreshview looks as follows:

```sql
EXEC sp_refreshview 'schema_name.view_name';
```

For example, let’s refresh the metadata for the view named vIndividualCustomer in the Sales schema:

```sql
USE AdventureWorks2022;
GO
EXECUTE sp_refreshview N'Sales.vIndividualCustomer';
GO
```

In short, you can use this procedure to maintain data consistency, resolve errors, and improve performance in database systems.

Refresh metadata for other objects

If you apply changes to the database schema, dependencies, or the objects themselves, be ready to have obsolete metadata for various database objects, such as stored procedures, user-defined functions, views, DML triggers, and database-level DDL triggers. Let’s figure out why metadata for these objects can become outdated.
Schema changes
If columns referenced by any of these objects are modified (e.g., data type changes, column renames, or dropped columns), the metadata can become outdated. The objects rely on the structure of the referenced columns, and any mismatches can cause errors or unexpected behavior.

Object renaming
If any of the referenced objects are renamed, the metadata of dependent objects becomes outdated because they still reference the old object names.

Security changes
If you alter something in the database security policy, stored procedures, user-defined functions, views, DML triggers, and database-level DDL triggers may stop functioning fully, as an object may no longer have access to the required resources.

Code modifications
If you change the code of these objects (stored procedures, user-defined functions, views, DML triggers, and database-level DDL triggers), the updated code and the stored metadata diverge.

Dependency changes
Any modifications to dependent objects impact the metadata of the referencing objects, making it irrelevant.

So, how do you update the metadata for other objects and avoid further challenges? You can use sp_refreshsqlmodule, a system stored procedure for refreshing the metadata of a specific module in a database. Here is the syntax of sp_refreshsqlmodule:

```sql
EXEC sp_refreshsqlmodule 'schema_name.object_name';
```

Where:

schema_name is the name of the schema where the required module is located
object_name is the name of the module (e.g., stored procedure, function, or view) whose metadata you want to refresh

Let’s refresh the metadata for the ufnGetContactInformation function in the AdventureWorks2022 database with the procedure:

```sql
USE AdventureWorks2022;
GO

EXEC sys.sp_refreshsqlmodule 'dbo.ufnGetContactInformation';
GO
```

Common methods for the database refresh

There are various approaches to performing a database refresh, but we’ll review the most common ones with their pros and cons.

1.
Manual restore

This basic method implies creating a backup of a database in its desired state and then restoring the backup when you need to refresh the database.

Pros:
Control: Manual backup and restore operations allow you to manage all the stages of the refresh process.
Point-in-time recovery: You can restore a database to a specific point in time, which is useful for data recovery and rollbacks.

Cons:
Time costs: Manual procedures can take much time, especially when you back up large databases.
Complexity: You should know how to perform the backup and restore processes.
Errors: Manual operations do not exclude the risk of human error.

2. SQL scripts

This method uses SQL statements to update, alter, or recreate the database to bring it to a desired state.

Pros:
Customization: You can adjust SQL scripts to your needs.
Automation: To avoid manual routine, the scripts can be automated.
Version control: Managing SQL scripts in version control systems like Git ensures that changes are tracked and can be rolled back if required.

Cons:
Development: Creating and maintaining custom scripts takes time and effort.
Maintenance: You may need to update the scripts as the database structure changes.
Error handling: SQL scripts must contain error handling behavior.

3. Containerization

Containerization platforms like Docker can be used to create database environments that you can easily refresh.

Pros:
Scalability: You can scale containers up or down as needed.
Automation: Manual operations can be replaced with DevOps pipelines.
Consistency: Containers provide consistent and reliable environments.

Cons:
Expertise: DevOps practices require specific knowledge and skills.
Resource consumption: Maintaining containers requires sufficient resources.
Not suitable for all databases: This approach may not fit all types of databases or legacy systems.

4.
Third-party tools

Third-party tools deliver a robust and effective way to refresh databases. These tools are designed to enhance and automate the process of copying, updating, and synchronizing databases.

Pros:
Ease of use: In most cases, the tools have user-friendly interfaces.
Automation: Such tools offer automation capabilities.
Support: Commercial tools come with support and documentation.

Cons:
Cost: Many tools require licensing fees.
Incompatibility: Some tools may be inappropriate for your environment.

Use manual restore for the database refresh

In this section, we’re going to perform the database refresh with the help of SSMS. The process includes two stages: creating a backup of the desired database and restoring it.

Take a backup

1. In Object Explorer, right-click the database you want to back up and navigate to Tasks > Back Up.
2. Under Destination, confirm the path to the backup file. If you want to change the path, click Remove, and then Add. To select the necessary path, click the button shown in the screenshot. Set the path and click OK.
3. Confirm the backup destination by clicking OK.
4. To back up the database, click OK.

Restore the database

1. In Object Explorer, right-click Databases and click Restore Database.
2. Select Device and click the three dots to locate the backup file.
3. Click Add. Then select the .bak file and click OK.
4. Confirm the backup destination by clicking OK.
5. Under Database, select the database where you want to restore the backup.
6. Finally, click OK.

As you can see, this process takes only several minutes, and we did not need to run any queries for it.

Perform the database and metadata refresh

Now, let’s shift our focus from theory to practice. We’ll provide examples of the scripts for refreshing the AdventureWorks2022 database and its metadata, and run both of them in dbForge Studio for SQL Server. Here is an example of the script that you can use to refresh metadata.
The script updates metadata information for all VIEWS, DML TRIGGERS, PROCEDURES, FUNCTIONS, and DDL TRIGGERS.

```sql
USE AdventureWorks2022;
GO

-- Refresh all VIEWS

PRINT ' -- Refreshing all VIEWS in database ' + QUOTENAME(DB_NAME()) + ' :'
DECLARE @stmt_refresh_object nvarchar(400)
DECLARE c_refresh_object CURSOR FOR
SELECT DISTINCT 'EXEC sp_refreshview '''
    + QUOTENAME(ss.name) + '.'
    + QUOTENAME(so.name) + '''' AS stmt_refresh_views
FROM sys.objects AS so
    INNER JOIN sys.sql_expression_dependencies AS sed
        ON so.object_id = sed.referencing_id
    INNER JOIN sys.schemas AS ss
        ON so.schema_id = ss.schema_id
WHERE so.type = 'V' AND sed.is_schema_bound_reference = 0
OPEN c_refresh_object
FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @stmt_refresh_object
    EXEC sp_executesql @stmt_refresh_object
    FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
END
CLOSE c_refresh_object
DEALLOCATE c_refresh_object
GO

-- Refresh all DML TRIGGERS

PRINT ' -- Refreshing all DML TRIGGERS in database ' + QUOTENAME(DB_NAME()) + ' :'
DECLARE @stmt_refresh_object nvarchar(400)
DECLARE c_refresh_object CURSOR FOR
SELECT DISTINCT 'EXEC sp_refreshsqlmodule '''
    + QUOTENAME(schemas.name) + '.'
    + QUOTENAME(triggers.name) + '''' AS stmt_refresh_dml_triggers
FROM sys.triggers AS triggers WITH(NOLOCK)
    INNER JOIN sys.objects AS objects WITH(NOLOCK)
        ON objects.object_id = triggers.parent_id
    INNER JOIN sys.schemas AS schemas WITH(NOLOCK)
        ON schemas.schema_id = objects.schema_id
    LEFT JOIN sys.sql_modules AS sql_modules WITH(NOLOCK)
        ON sql_modules.object_id = triggers.object_id
    LEFT JOIN sys.assembly_modules AS assembly_modules WITH(NOLOCK)
        ON assembly_modules.object_id = triggers.object_id
    LEFT JOIN sys.assemblies AS assemblies WITH(NOLOCK)
        ON assemblies.assembly_id = assembly_modules.assembly_id
    LEFT JOIN sys.database_principals AS principals WITH(NOLOCK)
        ON principals.principal_id = assembly_modules.execute_as_principal_id
        OR principals.principal_id = sql_modules.execute_as_principal_id
WHERE RTRIM(objects.type) IN ('U','V') AND parent_class = 1
    AND sql_modules.is_schema_bound = 0
OPEN c_refresh_object
FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @stmt_refresh_object
    EXEC sp_executesql @stmt_refresh_object
    FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
END
CLOSE c_refresh_object
DEALLOCATE c_refresh_object
GO

-- Refresh all PROCEDURES

PRINT ' -- Refreshing all PROCEDURES in database ' + QUOTENAME(DB_NAME()) + ' :'
DECLARE @stmt_refresh_object nvarchar(400)
DECLARE c_refresh_object CURSOR FOR
SELECT DISTINCT 'EXEC sp_refreshsqlmodule '''
    + QUOTENAME(s.name) + '.'
    + QUOTENAME(p.name) + '''' AS stmt_refresh_procedures
FROM sys.procedures AS p WITH(NOLOCK)
    LEFT JOIN sys.schemas AS s WITH(NOLOCK)
        ON p.schema_id = s.schema_id
    LEFT JOIN sys.sql_modules AS sm WITH(NOLOCK)
        ON p.object_id = sm.object_id
    LEFT JOIN sys.assembly_modules AS am WITH(NOLOCK)
        ON p.object_id = am.object_id
    LEFT JOIN sys.assemblies AS a
        ON a.assembly_id = am.assembly_id
    LEFT JOIN sys.objects AS o WITH(NOLOCK)
        ON sm.object_id = o.object_id
    LEFT JOIN sys.database_principals AS dp WITH(NOLOCK)
        ON sm.execute_as_principal_id = dp.principal_id
        OR am.execute_as_principal_id = dp.principal_id
    LEFT JOIN sys.database_principals AS dp1 WITH(NOLOCK)
        ON o.principal_id = dp1.principal_id
WHERE (CAST(CASE WHEN p.is_ms_shipped = 1 THEN 1
    WHEN (SELECT major_id FROM sys.extended_properties
          WHERE major_id = p.object_id AND minor_id = 0 AND class = 1
            AND name = 'microsoft_database_tools_support') IS NOT NULL THEN 1
    ELSE 0 END AS bit) = 0)
OPEN c_refresh_object
FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @stmt_refresh_object
    EXEC sp_executesql @stmt_refresh_object
    FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
END
CLOSE c_refresh_object
DEALLOCATE c_refresh_object
GO

-- Refresh all FUNCTIONS

PRINT ' -- Refreshing all FUNCTIONS in database ' + QUOTENAME(DB_NAME()) + ' :'
DECLARE @stmt_refresh_object nvarchar(400)
DECLARE c_refresh_object CURSOR FOR
SELECT DISTINCT 'EXEC sp_refreshsqlmodule '''
    + QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.'
    + QUOTENAME(o.name) + '''' AS stmt_refresh_functions
FROM sys.objects AS o WITH(NOLOCK)
    LEFT JOIN sys.sql_modules AS sm WITH(NOLOCK)
        ON o.object_id = sm.object_id
    LEFT JOIN sys.assembly_modules AS am WITH(NOLOCK)
        ON o.object_id = am.object_id
    LEFT JOIN sys.database_principals p1 WITH(NOLOCK)
        ON p1.principal_id = o.principal_id
    LEFT JOIN sys.database_principals p2 WITH(NOLOCK)
        ON p2.principal_id = am.execute_as_principal_id
    LEFT JOIN sys.database_principals p3 WITH(NOLOCK)
        ON p3.principal_id = sm.execute_as_principal_id
    LEFT JOIN sys.assemblies AS ass WITH(NOLOCK)
        ON ass.assembly_id = am.assembly_id
WHERE o.type IN ('FN','IF','TF','AF','FS','FT') AND sm.is_schema_bound = 0
OPEN c_refresh_object
FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @stmt_refresh_object
    EXEC sp_executesql @stmt_refresh_object
    FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
END
CLOSE c_refresh_object
DEALLOCATE c_refresh_object
GO

-- Refresh all DDL TRIGGERS

PRINT ' -- Refreshing all DDL TRIGGERS on database ' + QUOTENAME(DB_NAME()) + ' :'
DECLARE @stmt_refresh_object nvarchar(400)
DECLARE c_refresh_object CURSOR FOR
SELECT DISTINCT 'EXEC sp_refreshsqlmodule '''
    + QUOTENAME(t.name) + ''','
    + '''DATABASE_DDL_TRIGGER''' AS stmt_refresh_ddl_triggers
FROM sys.triggers AS t WITH(NOLOCK)
    LEFT JOIN sys.sql_modules AS sm WITH(NOLOCK)
        ON t.object_id = sm.object_id
    LEFT JOIN sys.assembly_modules AS am WITH(NOLOCK)
        ON t.object_id = am.object_id
    LEFT JOIN sys.assemblies AS assemblies WITH(NOLOCK)
        ON assemblies.assembly_id = am.assembly_id
    LEFT JOIN sys.database_principals AS principals WITH(NOLOCK)
        ON principals.principal_id = sm.execute_as_principal_id
        OR principals.principal_id = am.execute_as_principal_id
WHERE parent_class = 0
OPEN c_refresh_object
FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @stmt_refresh_object
    EXEC sp_executesql @stmt_refresh_object
    FETCH NEXT FROM c_refresh_object INTO @stmt_refresh_object
END
CLOSE c_refresh_object
DEALLOCATE c_refresh_object
GO
PRINT 'Metadata update for non-schema-bound objects is done.'
GO
USE [master]
GO
```

For convenience, we’ve commented on what each script block does. In each block, the script retrieves a list of views/DML triggers/procedures/functions/DDL triggers in the AdventureWorks2022 database, then generates and runs the SQL statements that refresh all these objects.

To refresh the database itself, you can execute, for example, the following script:

```sql
USE master
GO
BACKUP DATABASE AdventureWorks2022
TO DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL16.SQL2022\MSSQL\Backup\AdventureWorks2022_2023_08_24_14_19.bak'
WITH NAME = N'AdventureWorks2022-Full Database backup', NOFORMAT, NOINIT, SKIP, NOREWIND, NOUNLOAD, STATS = 1
GO

RESTORE DATABASE AdventureWorks2022
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL16.SQL2022\MSSQL\Backup\AdventureWorks2022_2023_08_24_14_19.bak'
WITH FILE = 1, STATS = 1;
GO
```

The script backs up the AdventureWorks2022 database, puts the .bak file to C:\Program Files\Microsoft SQL Server\MSSQL16.SQL2022\MSSQL\Backup\AdventureWorks2022_2023_08_24_14_19.bak, and then restores the database from that backup file.

Troubleshooting and verification

Of course, not all refresh processes go smoothly, and you can face some challenges. These are the common issues during the database refresh and possible ways to troubleshoot them:

Data integrity problems
Issues with data corruption can be a result of incomplete data transfer.
Troubleshooting:
Before refreshing, check the source data for any issues
Implement data validation checks before and after the refresh process
Use database backup and recovery tools to restore a database to a known state

Poor performance
Database performance can decrease after the refresh because of indexing, statistics, or query plan issues.
Troubleshooting:
Check and optimize query execution plans
Rebuild indexes
Track the database performance

Issues with security and permissions
It’s no secret that incorrect permissions can also affect the refresh process and lead to failures.
Troubleshooting:
Keep security settings for databases and servers up to date
Before refreshing a database, ensure that the account has all the required permissions

Resource limitations
Insufficient server resources (CPU, memory, etc.) can slow down or even terminate the refresh process.
Troubleshooting:
Monitor system resource usage
Upgrade hardware if required

As for the metadata refresh, the following issues may occur:

Lack of dependencies
If you change objects that are linked with metadata, there can be failures.
Troubleshooting:
Keep an eye on metadata dependencies and update them
Implement automated testing of metadata to detect issues at early stages

Outdated metadata
Irrelevant metadata can lead to inaccurate results.
Troubleshooting:
Regularly check and update metadata definitions
Implement versioning to track changes

Concurrency issues
When several users try to edit the metadata at the same time, the result can be unexpected, such as conflicts or errors.
Troubleshooting:
Use version control and branching for modifying metadata
Implement concurrency control

Poor performance
Slow metadata refresh processes can leave users with a negative experience.
Troubleshooting:
Improve metadata queries and scripts
Utilize monitoring and logging tools
Apply caching to improve performance

The following recommendations can help you ensure that the database refresh is successful.

Check logs: This is the first thing you should inspect after the database refresh. In logs, you can find errors and warnings and analyze the whole process.
Validate data: To verify that the data is intact, compare the data in the refreshed database with the source data.
Test: Run the applications that use data from the refreshed database. Also, run benchmark tests and monitor response times to check the performance of the database.
Verify data consistency: Execute SQL queries and scripts to confirm that data remains reliable.
Test the backup and restore procedures: Ensure that you can recover the previous state of the database. We have added this step to the verification practice, but in fact, both procedures should be tested before performing the refresh.

Conclusion

To maintain data integrity and accuracy, databases need to be refreshed regularly. This process is essential for optimal performance, security, and compliance. We’ve shown how to perform it in two different ways and with the help of two powerful tools, SSMS and dbForge Studio for SQL Server. Here, we would like to focus a bit on dbForge Studio. Thanks to the advanced code editor in dbForge Studio for SQL Server, it’s possible to write SQL code of any complexity. The editor offers features such as syntax highlighting, code completion, and error checking. In addition, dbForge Studio includes a library of code snippets that can speed up coding by providing pre-written, commonly used SQL code segments. Download the [tool for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and try out all the functionality!
Tags [#Chart Designer](https://blog.devart.com/tag/chart-designer) [#create pivot tables](https://blog.devart.com/tag/create-pivot-tables) [#dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server-2) [#Pivot Table Designer](https://blog.devart.com/tag/pivot-table-designer) [#pivot tables](https://blog.devart.com/tag/pivot-tables) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdatabase-refresh-and-metadata-update.html) [Twitter](https://twitter.com/intent/tweet?text=How+to+Refresh+and+Update+Metadata+for+a+Database+in+SQL+Server&url=https%3A%2F%2Fblog.devart.com%2Fdatabase-refresh-and-metadata-update.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/database-refresh-and-metadata-update.html&title=How+to+Refresh+and+Update+Metadata+for+a+Database+in+SQL+Server) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/database-refresh-and-metadata-update.html&title=How+to+Refresh+and+Update+Metadata+for+a+Database+in+SQL+Server) [Copy URL](https://blog.devart.com/database-refresh-and-metadata-update.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025"} {"url": "https://blog.devart.com/database-reverse-engineering.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server 
Tools](https://blog.devart.com/category/products/sql-server-tools) Database Reverse Engineering By [dbForge Team](https://blog.devart.com/author/dbforge) August 28, 2020

In our fast-changing world, business development and the implementation of new solutions are driven by observations of events, operations, and processes. Such observations are captured as data, usually stored and processed in a database. What matters is not only designing a database well but also implementing the solution in such a way that the database can be reverse engineered without any documentation or external information beyond the code of the solution itself. The same holds for the information system as a whole. In this article, we describe the operation opposite to database design, namely database reverse engineering, using MS SQL Server as an example. We also provide examples of how to obtain various types of information for reverse engineering.

What is reverse engineering and what is it used for?

Reverse engineering of an information system means recovering the architecture of a software solution from its implementation, that is, from its code. Note that we have already examined [SQL Server database design basics](https://blog.devart.com/sql-database-design-basics-with-example.html) with an example of a database for a recruitment service. It often happens that an information system has been running for a long time, and the specialists who developed and supported it have long since left the company. Only part of the documentation remained comprehensible, and some elements of the information system were not documented at all. Eventually, there comes a time when the information system needs major changes or has to be rewritten.
In such a situation, you need to examine in detail how the information system currently works. As you are probably aware, the most relevant information lives in the code itself, not in the documentation. It is only in roughly the last five years that documentation has become a routine part of estimates for software development and changes, and code has become correspondingly more accurate and self-describing. Even so, when a solution requires urgent improvement or development, the quality and relevance of the documentation may still suffer. When this occurs, we recover the relevant information about the software architecture from its implementation, which is the code. This method is called reverse engineering. Sooner or later, any information system needs a database to store and process an ever-growing volume of data, so that we can analyze it in various ways and make important business decisions based on the analysis. In this article, we will talk about this kind of database reverse engineering. But first, let's define the term. Database reverse engineering is the process of obtaining a database schema from its implementation, that is, from the definitions of its objects. If you want to learn more about constructing a data model from an existing database, refer to [How to reverse engineer databases](https://docs.devart.com/schema-compare-for-sql-server/working-with-particular-cases/database-reverse-engineering.html).

Reverse Engineering Support for Database Design

The following key elements of a database are especially important for reverse engineering:

- primary keys
- foreign keys (what is referenced)
- uniqueness constraints

While primary keys and uniqueness constraints are almost always present in the object definitions, since they are vital for correct code operation, foreign keys may not be written in the table definition at all.
There are several reasons for that, but the most compelling ones are the following:

1. A historical factor.
2. In a replicated database, foreign keys can degrade data exchange performance.
3. To speed up essential data-modification processes, since foreign keys require extra time when related data changes; data integrity within foreign keys is then not enforced at the database level.

The third reason mostly applies to real-time systems, where even a tiny delay can cost lives (transport safety, medicine, etc.). If foreign keys have not been set explicitly in the table definition, one of the following methods is used instead:

1. Name the foreign-key field so that it is clear which field of which table it references. Typically, the field reuses the name of the referenced table's field, which in turn carries the name of its table. For example:

Img.1. The example of foreign keys support via field names

Here you can see that the EmployeeID field of the [dbo].[JobHistory] table references the EmployeeID field of the [dbo].[Employee] table.

2. Add a description of the field and what it references to the metadata. In MS SQL Server, this is usually done with extended properties under the MS_Description key:

```sql
EXEC sys.sp_addextendedproperty @name=N'MS_Description'
    , @value=N'refers to the field [dbo].[Employee].[EmployeeID]'
    , @level0type=N'SCHEMA'
    , @level0name=N'dbo'
    , @level1type=N'TABLE'
    , @level1name=N'JobHistory'
    , @level2type=N'COLUMN'
    , @level2name=N'EmployeeID';
```

To learn more about database documenting, refer to [Documenting MS SQL Server Databases](https://www.codeproject.com/Articles/5161784/Documenting-MS-SQL-Server-Databases).

Nonetheless, the better option for reverse engineering support is to declare foreign keys right in the table definitions. You can keep several foreign keys disabled if necessary.
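When foreign keys exist only by naming convention, that convention can also be exploited programmatically. Below is a minimal Python sketch (all table and column names are hypothetical illustrations) that infers likely references from `<Table>ID`-style column names, as in the EmployeeID example above:

```python
# Sketch: infer implicit foreign keys from the naming convention described
# above (a column named after the referenced table plus an 'ID' suffix).
# The schema dictionary below uses hypothetical example names.

def infer_foreign_keys(tables):
    """tables: dict of table name -> list of column names.
    Returns (table, column, referenced_table) triples for columns
    that match the '<OtherTable>ID' pattern."""
    inferred = []
    for table, columns in tables.items():
        for col in columns:
            if col.endswith("ID"):
                target = col[:-2]  # strip the 'ID' suffix
                # Skip the table's own surrogate key (e.g. JobHistoryID)
                if target != table and target in tables:
                    inferred.append((table, col, target))
    return inferred

schema = {
    "Employee": ["EmployeeID", "LastName"],
    "JobHistory": ["JobHistoryID", "EmployeeID", "StartDate"],
}
print(infer_foreign_keys(schema))
# [('JobHistory', 'EmployeeID', 'Employee')]
```

Such inference is only a heuristic, of course; it suggests candidate relationships that still have to be confirmed against the data or the metadata descriptions.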
To disable a foreign key, use the script below:

```sql
ALTER TABLE [dbo].[JobHistory]
NOCHECK CONSTRAINT [FK_JobHistory_Employee_EmployeeID];
```

Here, the [FK_JobHistory_Employee_EmployeeID] foreign key, which links the EmployeeID field of the [dbo].[JobHistory] table with the EmployeeID field of the [dbo].[Employee] table, is disabled for the [dbo].[JobHistory] table.

To sum up, this method allows you to quickly recover the database schema from the definitions of its tables.

Obtaining Information for Reverse Engineering

To obtain information on the database schema, you can refer to the following system views:

- [sys.schemas](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/schemas-catalog-views-sys-schemas?view=sql-server-ver15) is the information on schemas.
- [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) is the information on tables.
- [sys.views](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-views-transact-sql?view=sql-server-ver15) is the information on views.
- [sys.indexes](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-indexes-transact-sql?view=sql-server-ver15) is the information on indexes.
- [sys.index_columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-index-columns-transact-sql?view=sql-server-ver15) is the information on index columns.
- [sys.columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-columns-transact-sql?view=sql-server-ver15) is the information on table columns.
- [sys.check_constraints](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-check-constraints-transact-sql?view=sql-server-ver15) is the information on check constraints.
- [sys.default_constraints](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-default-constraints-transact-sql?view=sql-server-ver15) is the information on default constraints.
- [sys.key_constraints](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-key-constraints-transact-sql?view=sql-server-ver15) is the information on primary key and uniqueness constraints.
- [sys.foreign_keys](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-foreign-keys-transact-sql?view=sql-server-ver15) is the information on foreign keys.
- [sys.types](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-types-transact-sql?view=sql-server-ver15) is the information on data types.
- [sys.objects](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-objects-transact-sql?view=sql-server-ver15) is the information on objects.
- some other system views described in [System Catalog Views](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/catalog-views-transact-sql?view=sql-server-ver15).

A complete diagram of the system views is available via this [link](https://www.microsoft.com/en-us/download/details.aspx?id=39083). In addition, [system information schema views](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15) are used.
Obtaining Information about Primary Keys

You can use the following query to get the list of primary keys:

```sql
SELECT
  SCHEMA_NAME(tab.schema_id) AS [schema_name]
  ,pk.[name] AS pk_name
  ,ic.index_column_id AS column_id
  ,col.[name] AS column_name
  ,tab.[name] AS table_name
FROM sys.tables tab
INNER JOIN sys.indexes AS pk
  ON tab.object_id = pk.object_id
  AND pk.is_primary_key = 1
INNER JOIN sys.index_columns AS ic
  ON ic.object_id = pk.object_id
  AND ic.index_id = pk.index_id
INNER JOIN sys.columns AS col
  ON pk.object_id = col.object_id
  AND col.column_id = ic.column_id
ORDER BY SCHEMA_NAME(tab.schema_id),
  pk.[name],
  ic.index_column_id;
```

We will receive the following output:

Img.2. The list of primary keys (case 1)

As you see, the output shows the following fields:

- schema_name is the name of the table schema.
- pk_name is the primary key name.
- column_id is the position of the table column within the primary key definition.
- column_name is the table column included in the primary key.
- table_name is the table for which the primary key is defined.

The following system views are used:

- [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) is the information on tables.
- [sys.indexes](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-indexes-transact-sql?view=sql-server-ver15) is the information on indexes.
- [sys.index_columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-index-columns-transact-sql?view=sql-server-ver15) is the information on index columns.
- [sys.columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-columns-transact-sql?view=sql-server-ver15) is the information on table columns.
You can obtain similar information with another query:

```sql
SELECT
  tc.[CONSTRAINT_CATALOG]
  ,tc.[CONSTRAINT_SCHEMA]
  ,tc.[CONSTRAINT_NAME]
  ,tc.[TABLE_NAME]
  ,ccu.[COLUMN_NAME]
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS tc
INNER JOIN INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE AS ccu
  ON tc.[CONSTRAINT_NAME] = ccu.[CONSTRAINT_NAME]
WHERE tc.[CONSTRAINT_TYPE] = 'PRIMARY KEY'
ORDER BY tc.[CONSTRAINT_SCHEMA]
  ,tc.[TABLE_NAME]
  ,ccu.[COLUMN_NAME];
```

The output is as follows:

Img.3. The list of primary keys (case 2)

As a result, the output gives the following fields:

- CONSTRAINT_CATALOG is the database.
- CONSTRAINT_SCHEMA is the table schema name.
- CONSTRAINT_NAME is the primary key name.
- TABLE_NAME is the table in which the primary key is defined.
- COLUMN_NAME is the table field included in the primary key.

Here, the following system views from [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15) are used:

- [TABLE_CONSTRAINTS](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/table-constraints-transact-sql?view=sql-server-ver15) are table constraints.
- [CONSTRAINT_COLUMN_USAGE](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/constraint-column-usage-transact-sql?view=sql-server-ver15) is the table columns that take part in constraints.
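Both primary-key queries return one row per key column, so a composite key arrives as several rows. When post-processing the result set outside the database, you usually want one definition per constraint. Here is a small Python sketch of that grouping; the sample rows mimic the (schema_name, pk_name, column_id, column_name, table_name) columns of the first query, with hypothetical values:

```python
# Sketch: fold the row-per-column output of a primary-key metadata query
# into one composite-key definition per constraint.
# Sample rows below are hypothetical.

def group_primary_keys(rows):
    """rows: (schema, pk_name, column_id, column_name, table_name) tuples,
    one tuple per key column.
    Returns {pk_name: 'schema.table (col1, col2, ...)'}."""
    grouped = {}
    # Sort by column_id so composite-key columns come out in key order
    for schema, pk, col_id, col, table in sorted(rows, key=lambda r: r[2]):
        entry = grouped.setdefault(pk, {"target": f"{schema}.{table}", "cols": []})
        entry["cols"].append(col)
    return {pk: f"{v['target']} ({', '.join(v['cols'])})" for pk, v in grouped.items()}

rows = [
    ("dbo", "PK_JobHistory", 2, "StartDate", "JobHistory"),
    ("dbo", "PK_JobHistory", 1, "EmployeeID", "JobHistory"),
]
print(group_primary_keys(rows))
# {'PK_JobHistory': 'dbo.JobHistory (EmployeeID, StartDate)'}
```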
Obtaining Information about Foreign Keys

To obtain the list of foreign keys, apply the query below:

```sql
SELECT
  f.name AS ForeignKey
  ,SCHEMA_NAME(f.SCHEMA_ID) AS SchemaName
  ,OBJECT_NAME(f.parent_object_id) AS TableName
  ,COL_NAME(fc.parent_object_id, fc.parent_column_id) AS ColumnName
  ,SCHEMA_NAME(o.SCHEMA_ID) AS ReferenceSchemaName
  ,OBJECT_NAME(f.referenced_object_id) AS ReferenceTableName
  ,COL_NAME(fc.referenced_object_id, fc.referenced_column_id) AS ReferenceColumnName
  ,f.create_date
FROM sys.foreign_keys AS f
INNER JOIN sys.foreign_key_columns AS fc
  ON f.OBJECT_ID = fc.constraint_object_id
INNER JOIN sys.objects AS o
  ON o.OBJECT_ID = fc.referenced_object_id
ORDER BY SchemaName, TableName, ColumnName, ReferenceSchemaName,
  ReferenceTableName, ReferenceColumnName;
```

The result will be as follows:

Img.4. The list of foreign keys (case 1)

The output contains the following columns:

- ForeignKey is the name of the foreign key constraint.
- SchemaName is the schema name of the table that contains the foreign key.
- TableName is the name of the table that contains the foreign key.
- ColumnName is the foreign-key column in that table.
- ReferenceSchemaName is the referenced table schema name.
- ReferenceTableName is the referenced table name.
- ReferenceColumnName is the referenced table column.
- create_date is the date and time of the foreign key creation.

The following system views are used here:

- [sys.foreign_keys](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-foreign-keys-transact-sql?view=sql-server-ver15) is the information about foreign keys.
- [sys.foreign_key_columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-foreign-key-columns-transact-sql?view=sql-server-ver15) is the information about the table columns included in the foreign keys.
You can also get similar information with a different query:

```sql
SELECT
  ccu.table_schema AS SourceSchemaName
  ,ccu.table_name AS SourceTable
  ,ccu.constraint_name AS SourceConstraint
  ,ccu.column_name AS SourceColumn
  ,kcu.table_schema AS TargetSchemaName
  ,kcu.table_name AS TargetTable
  ,kcu.column_name AS TargetColumn
FROM INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE AS ccu
INNER JOIN INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS AS rc
  ON ccu.CONSTRAINT_NAME = rc.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE AS kcu
  ON kcu.CONSTRAINT_NAME = rc.UNIQUE_CONSTRAINT_NAME
ORDER BY SourceSchemaName, SourceTable, SourceColumn,
  TargetSchemaName, TargetTable, TargetColumn;
```

The output is as follows:

Img.5. The list of foreign keys (case 2)

As a result, the following columns are shown in the output:

- SourceSchemaName is the schema name of the table containing the foreign key.
- SourceTable is the name of the table containing the foreign key.
- SourceConstraint is the name of the foreign key constraint.
- SourceColumn is the foreign-key column in the source table.
- TargetSchemaName is the referenced table schema name.
- TargetTable is the referenced table name.
- TargetColumn is the referenced table column.

The following system views from [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15) are used:

- [CONSTRAINT_COLUMN_USAGE](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/constraint-column-usage-transact-sql?view=sql-server-ver15) indicates the table columns that take part in constraints.
- REFERENTIAL_CONSTRAINTS indicates foreign key (referential) constraints.
- [KEY_COLUMN_USAGE](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/key-column-usage-transact-sql?view=sql-server-ver15) indicates the table columns included in the keys.
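Once the foreign-key pairs are extracted by either query, a natural next step in reverse engineering is to order the tables so that every referenced table comes before the tables that point to it, which is useful when replaying a reconstructed schema or loading data. A sketch using Python's standard `graphlib` module (Python 3.9+; the table names are hypothetical):

```python
# Sketch: derive a safe creation/load order for tables from
# (referencing_table, referenced_table) foreign-key pairs.
# Table names below are hypothetical.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

def creation_order(fk_pairs, tables):
    """fk_pairs: (referencing_table, referenced_table) tuples.
    Returns table names with every referenced table before its referrers."""
    deps = {t: set() for t in tables}
    for child, parent in fk_pairs:
        deps[child].add(parent)  # child depends on parent existing first
    return list(TopologicalSorter(deps).static_order())

order = creation_order(
    [("JobHistory", "Employee"), ("Employee", "Department")],
    ["Employee", "JobHistory", "Department"],
)
print(order)
# ['Department', 'Employee', 'JobHistory']
```

Note that `TopologicalSorter` raises `CycleError` if the foreign keys form a cycle, which is itself a useful diagnostic when reverse engineering an unfamiliar schema.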
Obtaining Information about Uniqueness Constraints

To get the list of uniqueness constraints, run the following query:

```sql
SELECT SCHEMA_NAME(t.[schema_id]) AS [schema_name],
  i.[name] AS constraint_name,
  t.[name] AS table_name,
  c.[name] AS column_name,
  ic.key_ordinal AS column_position,
  ic.is_descending_key AS is_desc
FROM sys.indexes i
  INNER JOIN sys.index_columns ic
    ON i.index_id = ic.index_id AND i.[object_id] = ic.[object_id]
  INNER JOIN sys.tables AS t
    ON t.[object_id] = i.[object_id]
  INNER JOIN sys.columns c
    ON t.[object_id] = c.[object_id] AND ic.[column_id] = c.[column_id]
WHERE i.is_unique_constraint = 1
ORDER BY [schema_name], constraint_name, column_position;
```

The output is as follows:

Img.6. The list of uniqueness constraints (case 1)

Accordingly, the following fields are output:

- schema_name is the schema of the table that defines the uniqueness constraint.
- constraint_name is the uniqueness constraint name.
- table_name is the table that defines the uniqueness constraint.
- column_name is the table column included in the uniqueness constraint.
- column_position is the column position in the uniqueness constraint definition.
- is_desc is the sort order of the column in the unique index (1 indicates a descending order, 0 indicates an ascending order).

The following system views are used here:

- [sys.indexes](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-indexes-transact-sql?view=sql-server-ver15) is the information about indexes.
- [sys.index_columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-index-columns-transact-sql?view=sql-server-ver15) is the information about index columns.
- [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) is the information about tables.
- [sys.columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-columns-transact-sql?view=sql-server-ver15) is the information on table columns.

Additionally, similar information can be obtained in the following way:

```sql
SELECT
  tc.[CONSTRAINT_CATALOG]
  ,tc.[CONSTRAINT_SCHEMA]
  ,tc.[CONSTRAINT_NAME]
  ,tc.[TABLE_NAME]
  ,ccu.[COLUMN_NAME]
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS tc
INNER JOIN INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE AS ccu
  ON tc.[CONSTRAINT_NAME] = ccu.[CONSTRAINT_NAME]
WHERE tc.[CONSTRAINT_TYPE] = 'UNIQUE'
ORDER BY tc.[CONSTRAINT_SCHEMA]
  ,tc.[TABLE_NAME]
  ,ccu.[COLUMN_NAME];
```

We will obtain the following result:

Img.7. The list of uniqueness constraints (case 2)

As a result, we can see the following fields in the output:

- CONSTRAINT_CATALOG is the database.
- CONSTRAINT_SCHEMA is the table schema name.
- CONSTRAINT_NAME is the uniqueness constraint name.
- TABLE_NAME is the table in which the uniqueness constraint is defined.
- COLUMN_NAME is the table column included in the uniqueness constraint.

The following system views from [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15) are used:

- [TABLE_CONSTRAINTS](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/table-constraints-transact-sql?view=sql-server-ver15) are table constraints.
- [CONSTRAINT_COLUMN_USAGE](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/constraint-column-usage-transact-sql?view=sql-server-ver15) indicates table columns taking part in the constraint.

Obtaining Information about Heaps

It bears reminding that a table without a clustered index is called a heap.
To get a list of heaps, run the following query:

```sql
SELECT OBJECT_SCHEMA_NAME(tbl.[object_id]) AS SchemaName,
  OBJECT_NAME(tbl.[object_id]) AS TableName
FROM sys.tables AS tbl
  JOIN sys.indexes i ON i.[object_id] = tbl.[object_id]
WHERE i.[type_desc] = 'HEAP'
ORDER BY TableName;
```

There are no heaps in the JobEmplDB database, hence, we need to run this script in another database that has heaps. Take the [SRV](https://github.com/jobgemws/Projects-MS-SQL-Server-DBA/tree/master/SRV) database as an example. In that case, you can get the following result:

Img.8. The list of heaps

As a result, the output shows the following fields:

- SchemaName is the schema name of the heap.
- TableName is the heap name.

The following system views are used:

- [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) is the information about tables.
- [sys.indexes](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-indexes-transact-sql?view=sql-server-ver15) is the information about indexes.
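The same HEAP filter is easy to reapply when you post-process exported metadata rows rather than query the server directly. A tiny Python sketch with hypothetical sample rows:

```python
# Sketch: pick out heap tables from exported index-metadata rows,
# mirroring the type_desc = 'HEAP' filter of the query above.
# Sample rows are hypothetical.

def find_heaps(index_rows):
    """index_rows: (schema_name, table_name, type_desc) tuples,
    one per index row. Returns schema-qualified heap table names."""
    return sorted(f"{s}.{t}" for s, t, kind in index_rows if kind == "HEAP")

rows = [
    ("dbo", "ImportStaging", "HEAP"),
    ("dbo", "Employee", "CLUSTERED"),
    ("dbo", "Employee", "NONCLUSTERED"),
]
print(find_heaps(rows))
# ['dbo.ImportStaging']
```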
Obtaining Key Information about Tables, Views, and Their Fields

To get the key information about tables and their fields, use the following query:

```sql
SELECT
  SCHEMA_NAME(tbl.[schema_id]) AS [SchemaName]
  ,tbl.[name] AS [Name]
  ,col.[name] AS [ColumnName]
  ,dc.[name] AS [DefaultConstraintName]
  ,dc.[definition] AS [DefaultDefinition]
  ,col.[column_id] AS [ColumnNum]
  ,t.[name] AS [TypeName]
  ,col.[max_length] AS [TypeMaxLength]
  ,col.[precision] AS [TypePrecision]
  ,col.[scale] AS [TypeScale]
  ,col.[is_nullable] AS [IsNull]
  ,col.[is_rowguidcol] AS [IsRowGUIDCol]
  ,col.[is_identity] AS [IsIdentity]
FROM sys.tables AS tbl
INNER JOIN sys.columns AS col
  ON tbl.[object_id] = col.[object_id]
INNER JOIN sys.types AS t
  ON col.[user_type_id] = t.[user_type_id]
LEFT OUTER JOIN sys.default_constraints AS dc
  ON dc.[object_id] = col.[default_object_id];
```

There are no default values in the JobEmplDB database, so run this script against the [SRV](https://github.com/jobgemws/Projects-MS-SQL-Server-DBA/tree/master/SRV) database. The following result appears:

Img.9. Key information about the tables and their fields (case 1)

As a result, the following columns are output:

- SchemaName is the table schema name.
- Name is the table name.
- ColumnName is the table column name.
- DefaultConstraintName is the default constraint name.
- DefaultDefinition is the definition of the default constraint.
- ColumnNum is the column position number in the table definition.
- TypeName is the name of the table column type.
- TypeMaxLength is the maximum length of the table column type in bytes (-1 means that the length is not limited by this parameter).
- TypePrecision is the maximum precision of values of this data type if it is numeric; otherwise, it is 0.
- TypeScale is the maximum scale of values of this data type if it is numeric; otherwise, it is 0.
- IsNull shows whether the field can contain the NULL value.
- IsRowGUIDCol defines whether the field is a ROWGUIDCOL column.
- IsIdentity defines whether the field is auto-incremented (an identity column).

The following system views are applied:

- [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) is the information about the tables.
- [sys.columns](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-columns-transact-sql?view=sql-server-ver15) is the information about the table fields.
- sys.default_constraints is the information about the default values.
- [sys.types](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-types-transact-sql?view=sql-server-ver15) is the information about the data types.

These system views contain a lot of other useful information worth taking into account. For the sake of simplicity, we show here only general information about tables, table fields, default field values, and column types. To get the same information for the views, change [sys.tables](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-tables-transact-sql?view=sql-server-ver15) to [sys.views](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-views-transact-sql?view=sql-server-ver15) in the query:

```sql
SELECT
  SCHEMA_NAME(tbl.[schema_id]) AS [SchemaName]
  ,tbl.[name] AS [Name]
  ,col.[name] AS [ColumnName]
  ,dc.[name] AS [DefaultConstraintName]
  ,dc.[definition] AS [DefaultDefinition]
  ,col.[column_id] AS [ColumnNum]
  ,t.[name] AS [TypeName]
  ,col.[max_length] AS [TypeMaxLength]
  ,col.[precision] AS [TypePrecision]
  ,col.[scale] AS [TypeScale]
  ,col.[is_nullable] AS [IsNull]
  ,col.[is_rowguidcol] AS [IsRowGUIDCol]
  ,col.[is_identity] AS [IsIdentity]
FROM sys.views AS tbl
INNER JOIN sys.columns AS col
  ON tbl.[object_id] = col.[object_id]
INNER JOIN sys.types AS t
  ON col.[user_type_id] = t.[user_type_id]
LEFT OUTER JOIN sys.default_constraints AS dc
  ON dc.[object_id] = col.[default_object_id];
```

The approximate query result is as follows:

Img.10. General information about views and their fields (case 1)

You can get similar information using the following script, which outputs the information about both tables and views at the same time:

```sql
SELECT
  [TABLE_SCHEMA]
  ,[TABLE_NAME]
  ,[COLUMN_NAME]
  ,[ORDINAL_POSITION]
  ,[COLUMN_DEFAULT]
  ,[IS_NULLABLE]
  ,[DATA_TYPE]
  ,[CHARACTER_MAXIMUM_LENGTH]
  ,[CHARACTER_OCTET_LENGTH]
  ,[NUMERIC_PRECISION]
  ,[NUMERIC_PRECISION_RADIX]
  ,[NUMERIC_SCALE]
  ,[DATETIME_PRECISION]
FROM INFORMATION_SCHEMA.COLUMNS;
```

The approximate output is as follows:

Img.11. Basic information about views and their fields (case 2)

You can see the following columns in the result:

- TABLE_SCHEMA is the schema name of the table/view.
- TABLE_NAME is the name of the table/view.
- COLUMN_NAME is the column name of the table/view.
- ORDINAL_POSITION is the ordinal position of the field in the table/view definition.
- COLUMN_DEFAULT is the default value for the column.
- IS_NULLABLE indicates whether the column can contain NULL values.
- DATA_TYPE is the data type of the field.
- CHARACTER_MAXIMUM_LENGTH is the maximum length in characters for binary, character, text, or image data; -1 for the XML and large-value types; otherwise, NULL is returned.
- CHARACTER_OCTET_LENGTH is the maximum length in bytes for binary, character, text, or image data; -1 for the XML and large-value types; otherwise, NULL is returned.
- NUMERIC_PRECISION is the precision of approximate or exact numeric, integer, or money data; otherwise, NULL is returned.
- NUMERIC_PRECISION_RADIX is the precision radix of approximate or exact numeric, integer, or money data; otherwise, NULL is returned.
- NUMERIC_SCALE is the scale of approximate or exact numeric, integer, or money data; otherwise, NULL is returned.
- DATETIME_PRECISION is the subtype code for datetime and ISO interval data types; for other data types, NULL is returned.

In this case, we use the [COLUMNS](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/columns-transact-sql?view=sql-server-ver15) system view from the system information schema views [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15).

Obtaining Information about Stored Procedures, Functions, and Their Parameters

To get the information about stored procedures and functions, use the following script:

```sql
SELECT
  s.[name] AS SchemaName
  ,obj.[name] AS 'ViewName'
  ,obj.[type]
  ,obj.create_date
  ,sm.[definition] AS 'Definition script'
FROM sys.objects AS obj
INNER JOIN sys.schemas AS s
  ON obj.schema_id = s.schema_id
INNER JOIN sys.sql_modules AS sm
  ON obj.object_id = sm.object_id
WHERE obj.[type] IN ('P', 'PC', 'FN', 'AF', 'FS', 'FT', 'IF', 'TF');
```

The approximate script result is as follows:

Img.12. Basic information on stored procedures and functions (case 1)

You can see the following columns in the output:

- SchemaName is the schema name of the stored procedure/function.
- ViewName is the name of the stored procedure/function.
- type is the type of the object: P is a SQL stored procedure, PC is an assembly (CLR) stored procedure, FN is a SQL scalar function, FS is an assembly (CLR) scalar function, FT is an assembly (CLR) table-valued function, IF is an inline SQL table-valued function, AF is a CLR aggregate function, and TF is a SQL function that returns a table value.
- create_date is the date and time of the stored procedure/function creation.
- Definition script is the definition of the stored procedure/function.
The following system views are used:

- [sys.objects](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-objects-transact-sql?view=sql-server-ver15) is the information about objects.
- [sys.schemas](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/schemas-catalog-views-sys-schemas?view=sql-server-ver15) is the information about schemas.
- [sys.sql_modules](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-sql-modules-transact-sql?view=sql-server-ver15) is the information about the definitions of objects.

You can get similar information with the following query:

```sql
SELECT
  [SPECIFIC_SCHEMA]
  ,[ROUTINE_NAME]
  ,[ROUTINE_DEFINITION]
  ,[ROUTINE_TYPE]
  ,[CREATED]
  ,[LAST_ALTERED]
FROM INFORMATION_SCHEMA.ROUTINES;
```

The approximate result of the query is as follows:

Img.13. Basic information on stored procedures and functions (case 2)

As a result, the output shows the following fields:

- SPECIFIC_SCHEMA is the schema name of the stored procedure/function.
- ROUTINE_NAME is the name of the stored procedure/function.
- ROUTINE_DEFINITION is the definition of the stored procedure/function.
- ROUTINE_TYPE is the object type (PROCEDURE is a stored procedure, FUNCTION is a function).
- CREATED is the date and time of the stored procedure/function creation.
- LAST_ALTERED is the date and time of the last modification of the stored procedure/function.

In this case, we use the [ROUTINES](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/routines-transact-sql?view=sql-server-ver15) system view from the system information schema views [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15).
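The two-letter type codes above are terse; when post-processing query results outside the database, a small lookup keeps reports readable. A Python sketch (the descriptions paraphrase the list above; note that sys.objects stores the code as char(2), so single-letter codes may arrive space-padded):

```python
# Sketch: decode the sys.objects type codes enumerated above.
# Descriptions are abbreviated paraphrases of the article's list.
OBJECT_TYPES = {
    "P":  "SQL stored procedure",
    "PC": "CLR stored procedure",
    "FN": "SQL scalar function",
    "FS": "CLR scalar function",
    "FT": "CLR table-valued function",
    "IF": "inline table-valued function",
    "AF": "CLR aggregate function",
    "TF": "SQL table-valued function",
}

def decode_type(code):
    # strip() handles the char(2) padding on single-letter codes
    return OBJECT_TYPES.get(code.strip(), "unknown")

print(decode_type("P "))
# SQL stored procedure
```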
To obtain the information on parameters of stored procedures and functions, run the following script:

```sql
SELECT
  s.[name] AS SchemaName
  ,obj.[name] AS 'ViewName'
  ,obj.[type]
  ,p.[name] AS 'ParameterName'
  ,p.[parameter_id] AS [ParameterNum]
  ,p.[is_output] AS [IsOutput]
  ,p.[default_value] AS [DefaultValue]
  ,t.[name] AS [TypeName]
  ,p.[max_length] AS [TypeMaxLength]
  ,p.[precision] AS [TypePrecision]
  ,p.[scale] AS [TypeScale]
FROM sys.objects AS obj
INNER JOIN sys.schemas AS s
  ON obj.schema_id = s.schema_id
INNER JOIN sys.parameters AS p
  ON obj.object_id = p.object_id
INNER JOIN sys.types AS t
  ON t.[user_type_id] = p.[user_type_id]
WHERE obj.[type] IN ('P', 'PC', 'FN', 'AF', 'FS', 'FT', 'IF', 'TF');
```

The approximate script result is as follows:

Img.14. Basic information about the parameters of stored procedures and functions (case 1)

You can see the following fields in the output:

- SchemaName is the schema name of the stored procedure/function.
- ViewName is the name of the stored procedure/function.
- type is the type of the stored procedure/function (the same codes as described above).
- ParameterName is the parameter name.
- ParameterNum is the parameter position number in the stored procedure/function definition.
- IsOutput shows whether it is an output parameter.
- DefaultValue is the default value for the parameter.
- TypeName is the parameter type.
- TypeMaxLength is the maximum length of the parameter in bytes (-1 means that the length is not limited by this parameter).
- TypePrecision is the precision for a numeric parameter; otherwise, it is 0.
- TypeScale is the scale for a numeric parameter; otherwise, it is 0.
The following system views are used:

- [sys.objects](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-objects-transact-sql?view=sql-server-ver15) provides information about objects.
- [sys.schemas](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/schemas-catalog-views-sys-schemas?view=sql-server-ver15) provides information about schemas.
- [sys.parameters](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-parameters-transact-sql?view=sql-server-ver15) provides information about parameters.

You can obtain similar information by running the following query:

```sql
SELECT
   r.[SPECIFIC_SCHEMA]
  ,r.[ROUTINE_NAME]
  ,r.[ROUTINE_TYPE]
  ,p.[PARAMETER_NAME]
  ,p.[ORDINAL_POSITION]
  ,p.[PARAMETER_MODE]
  ,p.[DATA_TYPE]
  ,p.[CHARACTER_MAXIMUM_LENGTH]
  ,p.[CHARACTER_OCTET_LENGTH]
  ,p.[NUMERIC_PRECISION]
  ,p.[NUMERIC_PRECISION_RADIX]
  ,p.[NUMERIC_SCALE]
  ,p.[DATETIME_PRECISION]
FROM INFORMATION_SCHEMA.PARAMETERS AS p
INNER JOIN INFORMATION_SCHEMA.ROUTINES AS r
  ON p.[SPECIFIC_SCHEMA] = r.[SPECIFIC_SCHEMA]
  AND p.[SPECIFIC_NAME] = r.[ROUTINE_NAME];
```

The approximate query result is as follows:

Img.15. Basic information about the parameters of stored procedures and functions (case 2)

You can find the following fields in the output:

- SPECIFIC_SCHEMA is the schema name of the stored procedure/function.
- ROUTINE_NAME is the name of the stored procedure/function.
- ROUTINE_TYPE is the object type (PROCEDURE for a stored procedure, FUNCTION for a function).
- PARAMETER_NAME is the name of the parameter of the stored procedure/function.
- ORDINAL_POSITION is the ordinal position of the parameter in the definition of the stored procedure/function.
- PARAMETER_MODE shows whether the parameter is an input (IN), output (OUT), or input/output (INOUT) parameter.
- DATA_TYPE is the data type.
- CHARACTER_MAXIMUM_LENGTH is the maximum length in characters for binary, character, text, or image data; -1 for the xml data type and large-value types. The column returns NULL otherwise.
- CHARACTER_OCTET_LENGTH is the maximum length in bytes for binary, character, text, or image data; -1 for the xml data type and large-value types. The column returns NULL otherwise.
- NUMERIC_PRECISION is the precision of approximate and exact numeric, integer, or money data. The column returns NULL otherwise.
- NUMERIC_PRECISION_RADIX is the precision radix of approximate and exact numeric, integer, or money data. The column returns NULL otherwise.
- NUMERIC_SCALE is the scale of approximate and exact numeric, integer, or money data. The column returns NULL otherwise.
- DATETIME_PRECISION is the subtype code for datetime and ISO interval data types. For other data types, NULL is returned.

We use the [PARAMETERS](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/parameters-transact-sql?view=sql-server-ver15) and [ROUTINES](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/routines-transact-sql?view=sql-server-ver15) system views from the system information schema views [INFORMATION_SCHEMA](https://docs.microsoft.com/en-us/sql/relational-databases/system-information-schema-views/system-information-schema-views-transact-sql?view=sql-server-ver15).

Obtaining Information about Other Objects and Data

To avoid excessive wordiness in this article, we won't provide scripts for every database object type. Note, however, that the scripts above cover the key information about all basic types of database objects. Still, it is often necessary to get information from the extended properties of database objects, as well as about other object types such as synonyms, sequences, and statistics.
For instance, you can obtain information on database triggers using the following script:

```sql
SELECT
   t.name AS TriggerName
  ,t.parent_class_desc
  ,t.type AS TriggerType
  ,t.create_date AS TriggerCreateDate
  ,t.modify_date AS TriggerModifyDate
  ,t.is_disabled AS TriggerIsDisabled
  ,t.is_instead_of_trigger AS TriggerInsteadOfTrigger
  ,t.is_ms_shipped AS TriggerIsMSShipped
  ,t.is_not_for_replication
  ,s.name AS SchemaName
  ,ob.name AS ObjectName
  ,ob.type_desc AS ObjectTypeDesc
  ,ob.type AS ObjectType
  ,sm.[definition] AS TriggerScript
FROM sys.triggers AS t -- use sys.server_triggers for server-level triggers
LEFT OUTER JOIN sys.objects AS ob
  ON t.parent_id = ob.object_id
LEFT OUTER JOIN sys.schemas AS s
  ON ob.schema_id = s.schema_id
LEFT OUTER JOIN sys.sql_modules AS sm
  ON t.object_id = sm.object_id;
```

The following system views are used in the query:

- [sys.triggers](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-triggers-transact-sql?view=sql-server-ver15) provides information about triggers.
- [sys.objects](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-objects-transact-sql?view=sql-server-ver15) provides information about objects.
- [sys.schemas](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/schemas-catalog-views-sys-schemas?view=sql-server-ver15) provides information about schemas.
- [sys.sql_modules](https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-sql-modules-transact-sql?view=sql-server-ver15) provides information about object definitions.

To learn more about how to obtain information from the extended properties, refer to the article [Documenting MS SQL Server Databases](https://www.codeproject.com/Articles/5161784/Documenting-MS-SQL-Server-Databases).

Conclusion

To sum up, the main focus of this article has been on database reverse engineering.
Throughout the article, we have done our best to provide readers with the key information on this topic in the context of MS SQL Server. We started by defining reverse engineering and explaining its purpose, proceeded with its main elements, and finally provided numerous examples of how to obtain various types of information for reverse engineering, including primary and foreign keys, uniqueness constraints, heaps, tables, views, and many other objects. Take a look at the [Database Diagram](https://www.devart.com/dbforge/sql/studio/database-diagram.html) tool in [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), which you can use for database design.

Database Protection Guide: Best Practices for Ensuring Database Security

By [Victoria Shyrokova](https://blog.devart.com/author/victorias), May 5, 2025

These days, information has enormous power. If you have access to sensitive data, you can do anything with it. It all depends on your imagination. As we know, all data is stored in databases. The reputation, efficiency, and profitability of many companies often depend on the private information they collect and store in databases. One of the most dangerous things that can happen to your database is a data breach. Information leaks not only threaten companies with reputational risks, but also with significant financial losses. Recent research has shown that data privacy and access violations [cost between $100 and $750 for each entry](https://www.infosecurity-magazine.com/opinions/encryption-key-fines-data/). Uber paid one of the largest penalties for a data breach: a $148 million settlement after the company failed to disclose the breach. That's why businesses must be proactive in building strong database security systems.
To understand what we need to do to protect the database, we first need to clarify what database security is.

Table of contents

- What is database security?
- Why database security is critical for your business
- Key database security issues
- Database security best practices
- Database protection techniques
- The role of database security in compliance
- Try dbForge Edge for comprehensive database security management
- Conclusion

What is database security?

[Database security](https://blog.devart.com/how-to-secure-postgresql-database.html) refers to the collective measures used to protect and secure a database or database management software from unauthorized use, malicious threats, and attacks. It encompasses a broad range of security processes, including access control, encryption, monitoring, and authentication, aimed at protecting the database from breaches, data leaks, and other forms of cyber threats.

Think of a database like a digital vault. It stores tons of important stuff, like customer info, payment details, and business records. Now, imagine what could happen if that vault got broken into. That's where database security management comes in. It's all about locking that vault tight so hackers, viruses, or even careless employees can't mess with what's inside.

Core components of database security:

- Authentication is the process of verifying the identity of users or systems trying to access a database. It ensures that only authorized individuals can interact with the data. This is often done through passwords and usernames, or more secure methods like biometrics or two-factor authentication (2FA).
- Encryption is like writing your data in a secret code. If someone tries to steal it, they'll only see a bunch of scrambled nonsense, unless they have the special key to decode it. This protects your info when it's sitting in the database and when it's traveling over the internet.
- Access control makes sure people only see the data they actually need for their job.
Not everyone in a company needs access to everything. So, your marketing team might see customer emails, but not bank account details. It's about limiting risk by giving the right access to the right people.

Why database security is critical for your business

With the increasing amount of sensitive information stored in databases, such as personal data, financial information, and intellectual property, the importance of database security and privacy has never been greater. A breach can have serious consequences, including financial loss, legal penalties, and damage to an organization's reputation.

High cost of data breaches

When hackers get into your database, it's not just a tech problem, it's a business disaster. You could face lawsuits, government fines, and spend a fortune cleaning up the mess. On top of that, your company's name could be dragged through the mud. Customers are quick to walk away if they feel their info isn't safe.

Trust is everything

People want to know their data is in good hands. Strong DB security shows your customers that you take their privacy seriously. That trust can turn into repeat business, positive reviews, and a better reputation, all things that give you a competitive edge.

Keeping business running smoothly

Beyond protecting data, database security also keeps your day-to-day operations safe from disruption. A cyberattack can bring your systems to a halt, causing delays, lost revenue, and chaos for your team. With solid security in place, you can keep things running smoothly and stay focused on growth instead of damage control.

Key database security issues

Security isn't just about avoiding leaks; it's also about keeping your systems stable. A cyberattack can shut down operations, slow everything to a crawl, and cost you time and money. Solid security means fewer surprises and more time focusing on your actual business goals.
Some of the most common database security issues include:

- SQL injection attacks
- Insider threats
- Data breaches
- Poor access management

Each of these issues can lead to stolen, lost, or altered data, causing damage to your operations, finances, and reputation.

SQL injection attacks

An SQL injection manipulates the web application's communication with the database: additional commands are injected into database queries, altering the query logic. SQL is the language used to communicate with the database; it is used (in slightly modified forms) in various relational database systems (e.g., MySQL). For example, instead of entering a name in a form, an attacker might enter a piece of code that tricks the database into disclosing secure data. If the database isn't protected properly, it might actually do that.

Why it matters: A successful SQL injection can give attackers full control over your database. They could steal sensitive data, delete records, or even take down your whole system.

Insider threats

Not every threat comes from outside. Sometimes, the danger is already inside the company. Insider threats can come from employees, former staff, or contractors who have access to your database and either misuse it intentionally or accidentally. For example, someone might copy sensitive customer data and sell it to a third party, or accidentally share a spreadsheet with confidential info.

Why it matters: Insiders often have trusted access, which makes their actions harder to detect. These threats can be just as damaging as external attacks, or worse.

Data breaches

A data breach is when sensitive information is accessed or exposed without authorization. This can happen through hacking, poor access controls, or even physical theft of servers and devices.

Why it matters: Breaches often lead to identity theft, legal action, and a loss of customer trust.
Businesses may face fines, lawsuits, and long-term reputational damage. It's not just about the loss of data; it's about the loss of credibility. Strong monitoring systems, regular audits, and limiting access to only those who truly need it are all critical steps in avoiding breaches.

Database security best practices

How do you secure a database? You don't need to be a tech wizard to keep your database safe; you just need to follow a few smart habits. Even small mistakes (like skipping updates or using weak passwords) can leave the door wide open for hackers. Here's how to stay one step ahead with database security best practices.

Regular software patching

Think of software updates like fixing cracks in your digital walls. Hackers love outdated systems because they already know how to break in. Always install patches and updates as soon as they're available.

Strong password policies

If your password is "1234" or "password," you're basically handing over the keys. Use long, complex passwords with a mix of letters, numbers, and symbols. A complex password, especially when combined with two-factor authentication (2FA), is your first line of defense: it helps prevent both brute-force attacks and casual snooping.

Regular backups and disaster recovery plans

Even with great security, things can still go wrong. That's why backups are your safety net. If your system crashes or gets hacked, you won't lose everything. With regular backups and a disaster recovery plan, you can bounce back fast without starting from zero.

Security monitoring and auditing

Use tools that monitor who's doing what in your database. Spotting strange activity early, like someone accessing files at 3 a.m., can stop a problem before it explodes. Monitoring helps you catch threats, track access, and stay compliant with laws like GDPR. It's like having security cameras for your data.
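The SQL injection attack described earlier is easiest to see side by side with its fix: parameterized queries. A minimal sketch in Python (using SQLite for a self-contained example; the table, column names, and data are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'top-secret')")

# An attacker submits this instead of a real name.
malicious = "nobody' OR '1'='1"

# Vulnerable: user input is pasted straight into the SQL text,
# so the injected OR clause matches every row and leaks the secret.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: a parameterized query treats the input as a plain value,
# so the injected quote has no special meaning and nothing matches.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()

print(leaked)  # the injected condition matched every row
print(safe)    # no rows match the literal string
```

The same idea applies to any database driver: never build SQL by concatenating user input; pass values as parameters instead.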
Database protection techniques

Securing a database isn't about one silver-bullet solution; it's about layering different protection techniques to cover every angle. Cyber threats can come from all directions, so a well-rounded defense strategy is essential. Combining tools like encryption, data masking, tokenization, and role-based access control allows businesses to significantly reduce the risk of data exposure.

Data encryption

Encryption is one of the most powerful ways to protect sensitive data. It works by converting readable information into a coded format that can only be decoded with a specific key.

- At rest: Encrypting data stored in a database means that even if someone gains access to the raw files, they still can't read the actual data without the key.
- In transit: Encrypting data as it moves between systems (like from your database to a web app) protects it from being intercepted by attackers.

Even if hackers get their hands on your data, encryption makes sure it's useless to them without the decryption key.

Data masking

Data masking is the process of replacing real data with fake but realistic values. It's commonly used in testing and development environments where the actual data isn't needed. For example, a masked version of a credit card number might look like 4111-XXXX-XXXX-1234, keeping the format the same but hiding the real digits. Developers and testers can work with data without exposing sensitive details, reducing the risk of leaks from non-production systems.

Role-based access control (RBAC)

RBAC is a fancy way of saying: "Not everyone in the company should see everything." Instead of giving access to data based on who asks for it, you give access based on what someone does at work. For example, someone in marketing might need customer contact info, but they have no reason to see payroll numbers or server settings. So, they don't get access to those things.

Why this matters: It follows the "need-to-know" rule.
The fewer people who can touch sensitive stuff, the lower the chance of someone messing it up, on purpose or by accident. Each of these [database protection](https://blog.devart.com/database-testing.html) techniques covers a different layer of security:

- Encryption protects data even if someone gets access to it.
- Masking prevents unnecessary exposure during testing.
- RBAC ensures that people only see what they're supposed to.

Using them together builds a much stronger defense than relying on just one method. Think of it like locking your doors, setting an alarm, and using security cameras: you're much safer with all three.

The role of database security in compliance

Protecting data isn't just good practice, it's the law. Regulations like GDPR, HIPAA, and PCI-DSS set strict rules on how personal and sensitive information should be handled. And at the heart of staying compliant with these rules is strong database security. If your business collects or stores customer data (payment info, health records, or personal details), you're likely subject to one or more data protection laws. These laws are designed to keep people's private information safe, and they come with serious consequences if you don't follow them. Failing to meet these requirements can lead to:

- Hefty fines
- Lawsuits
- Loss of customer trust
- Damage to your brand reputation

Database protection provides the tools and practices needed to meet these regulations. Here's how it supports compliance:

- Access control: Regulations like GDPR require that only authorized individuals can access personal data. Role-based access control (RBAC) and authentication help ensure this.
- Encryption: Many laws demand that sensitive data is encrypted both at rest and in transit. This protects it from unauthorized access, even if it's stolen.
- Audit trails: Keeping detailed logs of who accessed what and when is essential for proving compliance. Database monitoring and logging tools make this possible.
- Data minimization and masking: These techniques help reduce the exposure of sensitive data, especially in non-production environments, which is a requirement under GDPR and similar laws.

Database security management isn't just a one-time checklist; it's an ongoing process. It includes:

- Regular audits and security assessments
- Timely software patching
- Keeping up with changing regulations

Try dbForge Edge for comprehensive database security management

As you can see, the importance of database security cannot be overstated. Looking for a smart way to manage and secure your databases? dbForge Edge is a powerful all-in-one tool that helps you streamline database design, normalization, and security without the hassle. With features like data encryption, activity monitoring, compliance checks, and role-based access control, dbForge Edge gives database administrators everything they need to keep sensitive data safe and meet various data security regulations. It supports multiple database systems, offers a clean user interface, and helps reduce risks while saving time. Try the free trial of dbForge Edge and see how it can simplify your database protection workflow.

Conclusion

92% of companies that experience data breaches report negative business impacts, reduced employee productivity, and diminished profits. Don't wait for a breach to take action. Start by reviewing your current setup, applying the best practices we've covered, and exploring tools like dbForge Edge to make your database security smarter and more efficient. Protecting your data starts now: make it a priority with [dbForge Edge](https://www.devart.com/dbforge/edge/).

FAQ

How can I improve database protection against security threats and vulnerabilities?
To improve database protection, start by strengthening your defenses across all access points. This includes enforcing strong password policies, regularly updating your software, and implementing robust authentication methods. Security monitoring tools are also crucial for detecting unusual behavior early. And keeping regular backups and having a recovery plan ensures that even if something goes wrong, your data can be restored quickly.

What are the common types of database attacks, and how can I prevent them?

Databases are often targeted by SQL injection attacks, insider threats, and data breaches caused by poor access controls. SQL injections occur when malicious code is inserted into database queries, while insider threats come from users with legitimate access who misuse it, either intentionally or by accident.

How can database security management help reduce risks associated with unauthorized access?

Database security management establishes the protocols and technologies that prevent unauthorized users from accessing or manipulating data. It ensures only the right people have access to specific data, often through role-based access controls. Authentication methods verify user identity, and encryption protects information from being readable even if it's intercepted.

What is the role of encryption in database security, and how does it enhance database protection?

Encryption plays a vital role in keeping data safe by converting it into an unreadable format that can only be accessed with a specific decryption key. When data is encrypted both during storage and while being transmitted, even if cybercriminals manage to steal it, they won't be able to make sense of it. This means sensitive information like personal records or financial data remains secure, adding a powerful layer of protection to your database infrastructure.

What are the key database security issues businesses should address to comply with industry regulations?
To stay compliant with regulations like GDPR, HIPAA, or PCI-DSS, businesses must focus on controlling who can access personal data, encrypting sensitive information, and maintaining detailed audit logs. These regulations require organizations to protect data from unauthorized use, prove they've done so, and minimize the amount of data that's exposed in non-production environments.

How does dbForge Edge improve database security management for database administrators?

dbForge Edge strengthens the security management process by offering a range of tools tailored for database administrators. It supports secure data access and helps monitor user activity and revert changes whenever necessary.

What database protection features are included in dbForge Edge to secure sensitive business data?

In dbForge Edge, you'll find features to back up and restore a database, handle and control user privileges, monitor connections, and trace user sessions, all of which contribute to safeguarding business data. Apart from that, using the Test Data Generator, you can handle development without letting unauthorized parties access private client data.

How can dbForge Edge help implement database security best practices in my organization?

dbForge Edge makes it easier to apply best practices by combining key security features into one platform. It helps enforce password policies, automate updates, monitor activity, and manage permissions based on roles.
Database Testing – Unveiling the Latest Best Practices and Innovations in 2025

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), April 22, 2024
In software development, data processing and storage are crucial, and databases are invaluable for any application that relies on them. Thus, database testing is a critical aspect of ensuring optimal database performance. Database testing is a subset of software testing that involves verifying tables, triggers, other database objects, and entire database schemas. It also includes checking data integrity and consistency. Typically, this is achieved by creating and executing specific SQL queries that test the database's structure, attributes, and functions. This article will delve into various types of database testing, methodologies, and practical applications. Let's begin.

Table of contents

- What is database testing?
- Why database testing matters more than ever
- Types of database testing: A closer look
  - Unit database testing
  - Integration/Regression database testing
  - Performance database testing
  - Scalability database testing
  - Resilience database testing
  - Security database testing
  - User-acceptance database testing
  - Usability database testing
- Database testing steps
- Handling test data
- Conclusion

What is database testing?

Let's start by clarifying the basics before diving into various types of database testing. What exactly is database testing, and why is it so crucial in modern software development? Simply put, database testing involves checking and confirming the reliability and performance of a database system. Its goal is to ensure that the data stored in databases remains consistent and that the necessary processes can manipulate that data correctly to meet business requirements. When dealing with any relational database management system (RDBMS), a comprehensive testing strategy is essential. This includes testing the code implemented in the database to ensure its correctness and to identify any potential data quality issues before they impact the application.
The components of the database that undergo testing include the database schema, all objects, and triggers. Testers use specialized SQL queries and professional database testing tools to assess database security, performance, and structure. Currently, there are over 50 database testing methodologies, which can be broadly categorized into three groups: structural, functional, and non-functional testing. This article will delve into functional and non-functional testing types.

Why database testing matters more than ever

Data serves as the cornerstone of modern business, and its quality is a pivotal factor in determining efficiency. Databases integrated into various applications play a vital role in critical operations. Therefore, thorough testing of database functionality before deployment and during operation is imperative, regardless of the technology used. Here are some key reasons why database testing is so crucial nowadays.

Ensuring accurate data mapping

Modern applications rely on the seamless interaction between the user interface (UI) and the database (backend). This interaction may lead to potential issues, such as mismatches between source and target data types and formats, data inconsistencies (like duplicates or empty cells), incorrect data handling rules, etc. These issues can cause severe miscommunications between the UI and the backend database. Hence, comprehensive database testing has to identify and rectify these issues promptly.

Guaranteeing data accuracy and integrity

Data accuracy is about how well the data reflects real-world scenarios, while data integrity pertains to maintaining consistency from the entry point to retrieval and during storage. Given that data can undergo changes throughout its lifecycle, testing these modifications and understanding their impact on the application's performance is vital.
Meeting business requirements

Although this aspect is less technical, it remains crucial as it concerns data relevance. Database testers must comprehend the business requirements and ensure that the database aligns with them throughout the testing process.

Types of database testing: A closer look

Let’s dive deeper into the three types of database testing we briefly mentioned earlier: structural, functional, and non-functional testing.

Structural testing

Structural testing (or white-box testing) is the primary type of database testing. It focuses on checking the database schema with all its objects, the relationships between them, constraints, triggers, etc. The types of structural testing are:

- Validating database tables and columns
- Testing the stored procedures and functions
- Index testing
- Data migration testing
- Schema testing, including schema upgrade processes
- Database server testing

Functional testing

Functional testing focuses on ensuring that the product meets its functional requirements and specifications. These include UI functionality, API configuration and access, and backend database operations. The types of functional testing are:

- Unit testing
- Integration testing
- Regression testing
- System testing
- Database testing

Non-functional testing

Non-functional testing deals with performance, load handling, server volume, stress management, etc. These aspects are critical for the overall functioning of the product but are not part of the specific functional requirements. The types of non-functional testing include:

- Performance testing
- Scalability testing
- Load testing
- Stress testing
- Volume testing

Ideally, we should test databases and all other application features as early and as often as possible. This ongoing testing ensures data consistency, integrity, relevance, and overall product functionality throughout the product’s lifecycle.
Unit database testing

Unit testing involves checking and validating small pieces of source code, known as units, to prove they function correctly and meet requirements. This process is carried out in isolation for each unit, ensuring it produces accurate results under specific conditions without affecting other units or the entire application. In database testing, unit testing primarily focuses on database objects like procedures, functions, views, and rules.

It’s crucial to emphasize that every code segment should undergo testing and validation before integration into the main build. Unit testing isn’t a one-time task; it’s an ongoing process throughout the development cycle. Whenever an object is created or modified, unit tests should be developed and executed before committing changes to version control. This practice guarantees that modifications work as intended and don’t introduce adverse effects. Unit tests are also an essential component of automated Continuous Integration and Continuous Deployment (CI/CD) pipelines.

The most straightforward approach to unit testing is to focus on objects with minimal or no dependencies. As the number of dependencies increases, unit tests become more complex. While sophisticated unit tests are common and mastered by professionals, the primary objective of unit testing remains creating and executing straightforward tests that validate specific objects.

Integration/Regression database testing

As we’ve discussed, any application is made up of small units or modules that need thorough testing. However, even after passing unit testing, modules can sometimes encounter issues post-implementation. This is where integration testing and regression testing become crucial. Integration testing occurs when existing modules are linked with new ones or when new modules are added to the application. Its goal is to ensure smooth connectivity and data flow between these components.
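The database unit-testing idea described above — one isolated test per object, run before every commit — can be sketched as follows. This is a toy example, assuming SQLite in place of a real RDBMS and an invented `active_customers` view as the unit under test; a real project would use its own engine and test framework.

```python
import sqlite3

SCHEMA = """
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, is_active INTEGER);
    -- The unit under test: a view that should return only active customers.
    CREATE VIEW active_customers AS
        SELECT id, name FROM customers WHERE is_active = 1;
"""

def make_test_db():
    """Each test gets its own isolated in-memory database,
    so tests cannot affect each other or any shared environment."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    return conn

def test_active_customers_filters_inactive_rows():
    conn = make_test_db()
    conn.executescript("""
        INSERT INTO customers VALUES (1, 'Ann', 1);
        INSERT INTO customers VALUES (2, 'Bob', 0);
    """)
    rows = conn.execute("SELECT id FROM active_customers").fetchall()
    assert rows == [(1,)], "inactive customers must be filtered out"

test_active_customers_filters_inactive_rows()
print("unit test passed")
```

Because the fixture is rebuilt per test, the same script can run on every commit in a CI/CD pipeline without any cleanup step.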
Regression testing comes after functional testing of new features. It aims to check overall consistency after changes and to ensure that new updates don’t impact the performance of stable features. Both integration and regression tests should be run regularly after each build is finished. Using realistic, high-quality test data is vital to meet business requirements and produce easily interpretable results for stakeholders. For frequent builds, automating these tests is especially beneficial. The results of integration and regression tests should be summarized accurately and promptly reported to developers. Often, these tests are carried out by specialists who are not deeply involved in the software development process, as the tests require a focus on business requirements.

Performance database testing

Database performance testing evaluates key aspects of a database system, including response time, resource usage, and consistency. The goal is to check how these metrics align with business needs and to ensure stable performance under actual working conditions. This type of testing also examines how a high transaction load or numerous active sessions might affect data integrity.

It’s crucial to conduct performance testing alongside integration testing. This approach helps to detect and address performance issues early in the development cycle, before the product is released. The tests should use high-quality, realistic data that mirrors the characteristics and volume of live data, allowing for accurate simulation of real-world scenarios without exposing sensitive information.

Scalability database testing

Scalability testing must ensure that an increase in workload (mostly in the data volumes involved) doesn’t harm performance. It’s typically done alongside performance testing as part of the overall non-functional testing process. In practice, scalability testing uses detailed test scripts that simulate user actions and runtime data to interact effectively with the application.
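A bare-bones version of the response-time checks discussed in the performance and scalability sections might look like the sketch below. SQLite, the `events` table, and the 200 ms budget are all stand-ins invented for the example; a real test would target the production engine and a real service-level target.

```python
import sqlite3
import time

def measure_query_ms(conn, sql, runs=5):
    """Run a query several times and return the average elapsed milliseconds."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()
        total += time.perf_counter() - start
    return (total / runs) * 1000

# Load a realistic volume of generated test data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO events (payload) VALUES (?)",
    [(f"event-{i}",) for i in range(10_000)],
)

elapsed = measure_query_ms(
    conn, "SELECT COUNT(*) FROM events WHERE payload LIKE 'event-9%'"
)
# Hypothetical performance budget for this example only.
assert elapsed < 200, f"query too slow: {elapsed:.1f} ms"
print(f"avg query time: {elapsed:.2f} ms")
```

Rerunning the same measurement at 10x and 100x the row count turns this performance check into a simple scalability check.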
Professional tools make it easy to generate test data, allowing scalability tests to be seamlessly integrated into development. This proactive approach means any issues can be addressed promptly, often before the database design is finalized.

Resilience database testing

Database resilience testing assesses how well a system performs under real-world stress and challenging conditions. Essentially, it ensures that the application can handle unexpected factors like network or hardware failures and the impact of corrupted data. This testing involves simulating different usage scenarios and assessing Extract, Transform, and Load (ETL) processes. Resilience testing is among the most advanced database tests, requiring deep expertise and manual involvement. It involves exploring various unpredictable scenarios that could impact product performance; this complexity makes automation difficult. High-quality test data, including specially modified “dirty” data that could be harmful, is crucial for these tests.

Security database testing

Database security testing evaluates the policies, configurations, and controls of a database system. The goal is to assess the implementation of security standards and best practices and to identify any vulnerabilities that could lead to unauthorized access to data. This testing involves both manual exploratory tests and automated penetration tests. Automated tests, which are often integrated into the development pipeline, help identify and resolve issues promptly, even in scenarios that may demand redesigning the database and application. Security testing requires large amounts of test data to thoroughly investigate different access scenarios and confirm whether the data remains secure or has been compromised.

User-acceptance database testing

User Acceptance Testing (UAT) marks the final phase of software development or enhancement before release. Its aim is to evaluate how well the application meets requirements and performs in real-world scenarios.
Typically, UAT involves customers, the end users of the database and its associated application. Additionally, developers and professional testers take part in UAT to evaluate the database’s compliance with maintainability standards. Many experts now advocate integrating UAT into the development cycle early on, from the initial prototype through to the final product. This proactive approach allows teams to address issues promptly and adapt to evolving business strategies.

For UAT to be effective, it must use realistic data. Although actual production data isn’t necessary, the data should be of high quality so that stakeholders can evaluate the application’s performance against their needs. Typically, test scenarios emerge from collaboration between testers and stakeholders, and successful scenarios are often reused in subsequent tests.

Usability database testing

Usability testing complements UAT by gathering feedback from users to reveal usability aspects and issues related to interface convenience, user-friendliness, performance speed, etc. While usability testing is often part of UAT, it’s conducted by distinct teams for different purposes. Usability tests employ various methods such as surveys, interviews, and analytics. Due to the nature of these methods, automating database usability testing is challenging, so it’s primarily done manually. The data used for this type of testing should closely mirror real-world scenarios to provide meaningful insights.

Database testing steps

Database testing involves constructing and executing specialized SQL queries, but it goes beyond that. It’s a complex process that demands a deep understanding of database design, functionality, and performance, and it requires careful planning and preparation.
The steps in database testing include:

1. Gathering and analyzing requirements
2. Creating a detailed test plan covering strategy, scope, and timelines
3. Designing test cases for all possible scenarios
4. Generating test data that adheres to security policies
5. Writing SQL queries for the test scenarios
6. Setting up realistic test environments
7. Running the dedicated SQL queries against databases
8. Collecting and evaluating results
9. Documenting and reporting detected bugs

Typically, bug reporting marks the final phase of testing. Running tests frequently is advantageous, so automating database testing whenever feasible is beneficial. This allows teams to focus on more critical tasks while repetitive and time-consuming tests run automatically. Moreover, automation is integral to modern CI/CD pipelines, ensuring more efficient software development cycles: database test scripts execute automatically whenever new code is implemented or on a predefined schedule.

Handling test data

As you’ve probably noticed, most testing types require test data. The closer the test data is to live data, the better the results it can provide. However, using live data for testing poses serious security risks. To address this, organizations establish internal security protocols that determine which tests can use production data, who can conduct these tests, what access levels those testers can have, and how to manage them. Yet this approach isn’t ideal.

An alternative is data masking, where real data is altered to meet test requirements without exposing production data. However, this requires additional security measures like encryption, which introduces technical challenges. On the other hand, test data generation tools have advanced significantly in recent years. They can now generate fake data closely resembling live data in type, format, and quantity.
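To make the data-masking idea concrete, here is a minimal sketch, with invented column names and SQLite as the stand-in engine: real names and emails are replaced with deterministic fakes, so the masked copy keeps the shape of production data without exposing any of it.

```python
import hashlib
import sqlite3

def mask_value(value, prefix):
    """Replace a sensitive value with a deterministic, non-reversible fake.
    Hashing keeps masking repeatable across runs without storing a mapping."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"{prefix}_{digest}"

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
    INSERT INTO customers VALUES (1, 'Ann Smith', 'ann@example.com');
""")

# Mask every row in place; the format is preserved (a name-like string,
# an email-like string), but the real values are gone.
for row_id, name, email in conn.execute(
        "SELECT id, name, email FROM customers").fetchall():
    conn.execute(
        "UPDATE customers SET name = ?, email = ? WHERE id = ?",
        (mask_value(name, "user"),
         mask_value(email, "mail") + "@test.invalid",
         row_id),
    )

print(conn.execute("SELECT name, email FROM customers").fetchone())
```

Because the masking is deterministic, joins on masked columns still line up across tables, which is exactly why real masking still needs the extra safeguards (such as encryption) mentioned above.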
Using such tools, teams can fill database tables with data that has specific characteristics, including both “correct” and “incorrect” data cells for targeted tests. One notable tool in this realm is [dbForge Edge](https://www.devart.com/dbforge/edge/), an integrated development environment (IDE) for popular database management systems like MS SQL Server, MySQL/MariaDB, Oracle, and PostgreSQL. Among its many other features, dbForge Edge offers [data generation tools for these databases](https://www.devart.com/dbforge/edge/features.html#data-generation), enabling specialists to create realistic demo data with precise settings tailored to test requirements. Moreover, data generation tasks can be automated to consistently provide datasets that meet specific criteria for each case. dbForge Edge’s data generators can also integrate into Continuous Integration and Continuous Delivery cycles, enhancing comprehensive testing procedures for databases.

Conclusion

Software testing, and particularly database testing, is a vital component of the entire development cycle. Comprehensive structural, functional, and non-functional tests should cover every minor piece of code and every performance aspect. By combining manual and automated testing and utilizing high-quality, realistic test data, teams can achieve a thorough and dependable evaluation of the application’s performance at every lifecycle stage.

Furthermore, modern database management tools have streamlined database testing as an integral part of the DevOps workflow. They allow for the rapid preparation of test data with specific characteristics, easy construction of unit tests, and automation of these processes. As a result, database professionals can perform extensive testing, eliminating repetitive and monotonous tasks and focusing on more critical development challenges.
dbForge Edge is a software solution designed for database development, management, and administration tasks across SQL Server, MySQL with MariaDB, Oracle, and PostgreSQL. It supports test data generation for these databases and automates the development and deployment processes. A [fully functional 30-day free trial of dbForge Edge](https://www.devart.com/dbforge/edge/download.html) is available, offering an opportunity to explore its advanced features comprehensively.

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams)
Database Version Control Using Source Control for SQL Server

By [dbForge Team](https://blog.devart.com/author/dbforge), May 5, 2021

There are many solutions for
version-controlling databases. We are going to review one of them, namely [Source Control](https://www.devart.com/dbforge/sql/source-control/), which comes in the SQL Tools pack. As an example, we will use the JobEmplDB database for a recruitment service:

Contents

- Overview of the Source Control menu options
- Connecting the database to the Source Control
- Committing changes to a repository
- Reverse-engineering a database using the Source Control tool
- Conclusion

Img. 1: The recruitment service database schema

We have covered the database design basics in [SQL Database Design Basics with Example](https://blog.devart.com/sql-database-design-basics-with-example.html).

Overview of the Source Control Menu Options

Now, let’s review the options available on the Source Control shortcut menu:

Img. 2: The Source Control shortcut menu

The main options of the Source Control menu include the following commands:

- Show Source Control Manager displays the local changes window.
- Commit displays the commit preparation window.
- Get Latest shows the latest changes in the remote repository (in fact, it precedes the Pull procedure).
- View Changes History shows the database changes history.
- Link/Unlink Static Data links and version-controls static table data (references, regulatory references, lists, etc.). More information is available at [Linking Static Data](https://docs.devart.com/source-control/working-with-source-control/linking-static-data.html), and worked examples can be found in [Version-controlling Static Data: Conflicts](https://blog.devart.com/version-controlling-static-data-conflicts.html).
- Unlink Database from Source Control unlinks the current local database from the source control repository.

Additionally, to get more information about dbForge Source Control, feel free to watch this [introductory video](https://youtu.be/YhyqFYy5_XI).
Connecting the Database to the Source Control

To start, connect the database to the source control repository. For this, right-click the necessary database, point to Source Control, and then click Link Database to Source Control on the shortcut menu.

Img. 3: Connecting the database to the source control repository

In the Link Database to Source Control window that opens, navigate to the Source control repository field and click the plus icon:

Img. 4: Selecting the source control repository

In the Source Control Repository Properties window, select the required source control system:

Img. 5: Selecting the source control system

Note: The Source Control tool supports various types of source control systems, including Working Folder, Git, SVN, and Mercurial.

In our example, we select Git and enter the required details:

Img. 6: Configuring the connection to the Git repository

Here we need to fill in the following fields:

1. The Repository folder is the path to the local repository on your PC. Thus, you first need to clone a Git repository. For instance, GitHub allows cloning via a link: you click Clone in the necessary repository and copy the link provided there:

Img. 7: Getting the link to clone a repository

2. The Repository name is the local repository name.

After filling in the necessary fields, click Test and make sure that the connection has been established successfully:

Img. 8: Testing the connection to the repository

After that, click OK, and then click OK again. Now, select the created source and click OK:

Img. 9: Selecting the source control repository

For more information about how to connect to Git, see [Linking a Database to Git](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-git.html). Then, select the database development model and click Link:

Img.
10: Selecting the database development model

There are two database development models:

- Dedicated: each developer works on their own copy of the database definition.
- Shared: all developers work on the same copy of the database definition.

In our case, we select the shared database development model for simplicity. However, in real projects, a dedicated model is often needed.

Committing Changes to a Repository

Adding changes to a repository

After configuring all settings, click Link and wait for the connection process to finish:

Img. 11: First commit

A window will appear that displays a table with the following columns:

- A check box to include the object in the commit
- A change type
- An object type
- An object name
- An object schema

Have a look at the bottom pane. On the left, it displays the object description in the local repository (the change we want to deliver with the commit); on the right, the object description in the source control repository (what we want to update).

Note: The object definition is present on the left but absent on the right. This means that the object doesn’t exist in the repository yet; we intend to add it to the version control repository.

Next, select all objects to add them to the commit. Then, write a comment and click Commit. If the commit is successful, you’ll see the following window at the end of the process. Then, click OK.

Img. 12: The successful commit process

Go to the [GitHub repository](https://github.com/jobgemws/JobEmplDB) to make sure that all the necessary files have been created:

Img. 13: The created files located in the GitHub repository

As you can see, the object definitions in the repository are grouped by type:

1. The Programmability folder contains the respective user object definitions grouped into functions and stored procedures.

Img. 14: Users programming objects

2. The Security/Schemas folder contains the schema definitions:

Img. 15: User schemas

3.
The Tables folder contains all user table definitions with their indexes and constraints, as well as primary and foreign keys:

Img. 16: Database schemas

4. The Views folder contains the definitions of all views:

Img. 17: Database views

Deleting objects from a repository

To delete all unnecessary objects, use the following script:

```sql
USE [JobEmplDB]
GO

DROP VIEW [test].[GetNewID];
GO

DROP VIEW [test].[GetRand];
GO

DROP VIEW [test].[GetRandVarBinary16];
GO

DROP FUNCTION [test].[GetListSymbols];
GO

DROP FUNCTION [test].[GetSelectSymbols];
GO

DROP FUNCTION [test].[GetRandString];
GO

DROP FUNCTION [test].[GetRandString2];
GO

DROP FUNCTION [test].[GetRandVarbinary];
GO
```

Then, commit all changes. For this, right-click the necessary database, point to Source Control, and then click Show Source Control Manager or Source Control on the shortcut menu:

Img. 18: Preparing to commit

After that, the tool displays the Update window and then opens a window containing the table of changes:

Img. 19: Table of changes for commit

Take a look at the bottom pane. The object definition is absent on the left but present on the right. This means that we are going to delete an object from the repository. We select all changes, add a comment, and click Commit. Then, in the new window, click OK:

Img. 20: The successful commit completion

If we go to the [database views](https://github.com/jobgemws/JobEmplDB/tree/master/Views) in the GitHub repository, we can see that just one necessary view is left:

Img. 21: The remaining database view after deleting the unnecessary views

Changing objects in a repository

Let’s alter the Company table by adding the IsDeleted column:

```sql
ALTER TABLE [dbo].[Company]
ADD [IsDeleted] BIT NOT NULL DEFAULT(0);
```

Have a look at the [Company table definition](https://github.com/jobgemws/JobEmplDB/blob/master/Tables/dbo.Company.sql) in the repository on GitHub:

Img.
22: Company table definition in a GitHub repository

As you can see, the source control repository does not yet include the IsDeleted column in the definition of the [dbo].[Company] table. To move on, commit the changes as before. For this, open the Source Control window:

Img. 23: Preparation to commit

If the window is already open, click Refresh:

Img. 24: Refreshing the changed definitions of database objects

After refreshing the data about the state of our local repository, we again get the commit preparation table:

Img. 25: Preparing database objects to commit

Have a look at the bottom pane. The object definition is present both on the left and on the right, that is, in both our local repository and the version control repository. Therefore, we are altering the object definition. Select all changes, add a comment to describe them, and click Commit. Then, in the new window, click OK.

Img. 26: Successful commit completion

Now, go to the page that contains the Company table definition in the GitHub repository:

Img. 27: The Company table definition in the [GitHub](https://github.com/jobgemws/JobEmplDB/blob/master/Tables/dbo.Company.sql) repository after the applied changes

We can see the new IsDeleted column added to the [dbo].[Company] table definition.

Canceling changes

Now, delete the previously created IsDeleted column from the Company table by using the following script:

```sql
ALTER TABLE [dbo].[Company] DROP CONSTRAINT [DF__Company__IsDelet__01142BA1];
GO

ALTER TABLE [dbo].[Company]
DROP COLUMN [IsDeleted];
```

Important: You need to find the correct name of the DF constraint for the IsDeleted column.

Img. 28: Searching for the DF constraint name for the IsDeleted column

Open the Source Control window again:

Img. 29: Preparation to commit

At the bottom, we see the differences in the Company table definition between our local repository and the remote repository. Select all changes and click Undo. This way, we cancel all the selected changes.
Then, in the new window that appears, click Yes.

Img. 30: Confirming the selected local changes rollback

When the rollback operation is successfully completed, a new window will appear. Click OK.

Img. 31: The local changes rollback is successfully completed

[SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15) shows that the IsDeleted column is back in the Company table:

Img. 32: The structure of the Company table after rolling back local changes

Viewing the Changes History

We can view the history of changes in a database. To do so, right-click the database, point to Source Control, and then click View Changes History on the shortcut menu:

Img. 33: Calling the Changes History View

The database changes history window appears. At the top left of the window is the changes log with the following columns:

- Revision
- Date
- Author
- Comment

At the top right of the window are the comment on the change itself, the type of operation, and the full path to the changed file. The differences are displayed at the bottom of the window: on the left is the previous state, and on the right is the current state.

Reverse-engineering a database using the Source Control tool

Let’s create an empty JobEmplEmpty database:

Img. 34: Creating a new empty JobEmplEmpty database

Right-click the database you’ve created, point to Source Control, and select Link Database to Source Control:

Img. 35: Calling the window to link the database to the source control repository

In the Link Database to Source Control window that opens, configure the necessary parameters and click Link:

Img. 36: Configuring the parameters to link the database to the source control repository

Note: For reverse engineering, you need to select the Dedicated model.

The window showing the differences between the local and remote repositories appears:

Img.
37: The window showing the differences between the local and the remote repositories

We must ensure that the JobEmplEmpty database is really empty:

Img. 38: The JobEmplEmpty database

Now, return to the Source Control window, select all changes, and then click Undo:

Img. 39: Reverse engineering from the source control repository to the new JobEmplEmpty database

After that, a new window appears, asking you to confirm the rollback of local changes. To confirm, click OK. In the window informing you that the local changes have been rolled back successfully, click OK. Now, the new JobEmplEmpty database definition completely matches the database definition from the [source control repository](https://github.com/jobgemws/JobEmplDB) on GitHub:

Img. 40: The JobEmplEmpty database definition after the reverse engineering

This way, we’ve performed reverse engineering: we took an empty database and restored the database definition from the source control repository into it. Similarly, we can connect any database to the source control and view the differences:

Img. 41: The differences in the database definitions

For demo purposes, review the definition changes for two completely different databases:

Img. 42: The difference in database definitions – 2

Updating the table from the repository

What if we update a table from the repository? Will it delete the table data? Let’s check. Create a JobEmplDB2 database, a copy of JobEmplDB, using the JobEmplDB database backup. Then, in the JobEmplDB2 database, execute the following code fragment:

```sql
ALTER TABLE [dbo].[Company] DROP CONSTRAINT [DF__Company__IsDelet__01142BA1];
GO

ALTER TABLE [dbo].[Company]
DROP COLUMN [IsDeleted];

ALTER TABLE [dbo].[Company]
ADD [Source] NVARCHAR(255);
```

By running this script, we’ve deleted the IsDeleted column and added the Source column to the Company table. Now, link the JobEmplDB2 database to the repository we created earlier.
Right-click the JobEmplDB2 database, point to Source Control, and then click Link Database to Source Control on the shortcut menu:

Img. 43: Linking the database to the repository

In the Link Database to Source Control window, configure the necessary parameters and click Link:

Img. 44: Configuring the database connection to the repository

In the window that opens, select the changes for the Company table and click Undo:

Img. 45: The Company table gets synchronized with the repository

When a warning appears, click OK to roll back the selected local changes (those that mismatch the source control repository):

Img. 46: The warning that appears before rolling back the selected local changes

At the end of the rollback process, click OK in the new window:

Img. 47: Successful completion of the selected local changes rollback

Thus, we’ve synced the Company table definition with the source control repository on [GitHub](https://github.com/jobgemws/JobEmplDB):

Img. 48: The Company table structure after rolling back the local changes

To make sure that the data has not been deleted, use a simple SELECT query:

Img. 49: The number of entries in the Company table

It turns out that Source Control in SQL Tools updates the database definition incrementally, without recreating the changed tables. This way, it grants safe reverse engineering to a non-empty database.

Conclusion

To sum up, we have demonstrated how to use dbForge Source Control for SQL Server to version-control and reverse-engineer a SQL Server database. [Download](https://www.devart.com/dbforge/sql/sql-tools/download.html) a free 30-day trial of the dbForge SQL Tools pack, which includes dbForge Source Control for SQL Server, to evaluate the features and capabilities that will help you perform SQL Server tasks easily and effectively.
Related articles: [Version Control System – Version Control with Git](https://blog.devart.com/version-control-system-version-control-with-git.html), [Database Versioning with Examples](https://blog.devart.com/database-versioning-with-examples.html)

Database Version Control — State-based or Migration-based

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), February 17, 2023

Modern databases must keep up with dynamic business demands, and that is often a real challenge. The most important requirement is to ensure regular database updates that preserve all the data. The solution is a well-planned approach to database development and deployment via either a state-based or a migration-based approach. However, you need to understand both of these approaches thoroughly.

- Why should you treat database updating as a separate challenge?
- What is state-based database deployment?
- What is migration-based database deployment?
- State-based vs migration-based: comparison
- Updating the database with Devart dbForge tools
- Conclusion

Why should you treat database updating as a separate challenge?

Dealing with databases means constantly keeping in mind that a database has two sides: the data it stores and the structure that organizes that data. The database updating approach requires the following issues to be considered thoroughly:

- A database includes the table structure, the code in stored procedures, the data kept in those tables, and all the interrelations between the database objects. This makes implementing changes more challenging.
- A database sets stricter demands for synchronization, primarily when several developers work on the same object in that database.
- It is critical to preserve all business data as well as the entire database code. After the update, the data must remain safe and sound.
That's why it is impossible to update a database the same way as application code – we can't simply delete the old database and create a new one for every change. On the other hand, there is no need to reinvent the wheel – there are already methods tested and approved by developers: the state-based and the migration-based database delivery models. As a database developer, you will most likely use both methods according to your project's needs. What is state-based database deployment? The state-based approach suggests keeping the database schema in its ideal end state in the code repository. This approach was popularized by Microsoft, which implemented it in its Visual Studio solution. The idea of the state-based approach is simple: you keep a snapshot of the ideal database structure and work on your actual database project to match that ideal. All database objects (tables, views, stored procedures, functions, triggers, etc.) are kept as state-based scripts, each in a separate SQL file in its final form. When database developers work on the database schema and need to update it, they deploy the database on a local server and implement the necessary changes. Then all the hard work is done by a compare tool that generates scripts to synchronize your actual database with that reference database. Finally, the version control system uploads these changes to the server. Changes are implemented sequentially, "from lower to higher" environments: from Development to Testing, and onward until they reach Production. The advantages of the state-based database delivery method: Store the database schema inside the Source Control to conveniently monitor the database state. Detect compile-time errors in SQL files immediately. No need to create numerous scripts for the same entity. Watch all changes deployed to the database at any moment and thus manage the process better. Generate and execute the ALTER scripts automatically with a dedicated tool.
The disadvantages of the state-based database delivery method: It requires generating a new script for every new environment. It pushes changes only forward (source to target), and they can only be reverted manually. This model can cause issues for automated processes. The state-based approach is the default choice for new project development (from scratch). It suggests that the database, or the entire application based on that database, is in development until the final step of releasing the application to Production. What is migration-based database deployment? The migration-based method takes a different approach. Instead of having a single snapshot of an ideal database, you have a collection of migration scripts that transfer your actual database from one version to another. A separate script is created for every step. Each such script contains specialized DDL statements, and all these migration scripts are stored in the repository. Using these migration scripts, you can update both the database schema and the data. You must create every migration script with an incremental version number. If you need to transfer the database to another environment, you have to execute all these migration scripts in the correct order. That's why it is crucial to track these version numbers. The most common scope for the migration-based database delivery method is database testing, updating databases with new features and enhancements, or creating a database from the scripts used in the state-based approach. Many developers prefer the migration-based database delivery method because they can do the tasks faster and deploy the scripts quickly. On the other hand, in most cases, we have to create these migration scripts manually. The advantages of the migration-based database delivery method: The possibility to change both the database schema and the data simultaneously. Better alignment with the DevOps best practices of incremental changes.
The same code is executed in all environments. Better testing and control of features. The possibility to write migration scripts in other programming languages besides SQL. The disadvantages of the migration-based database delivery method: It often requires the developers to write all migration code manually. It creates a risk of code being overridden by other developers' changes if synchronization fails. It is inefficient for working with stored procedures and functions. The migration-based approach is usually applied to an existing database that has already been delivered to the end user. The application evolves over time and requires updates and enhancements, and such changes are mostly delivered through migration scripts. State-based vs migration-based: comparison The main difference between these two approaches is what you consider the "source of truth" – the ideal database you strive for, or the scripts you create to upgrade your database. When you as a database developer need to choose one approach or the other, you will consider how it meets your project requirements in practice. Some developers say that the state-based approach ensures more reliable testing and immutability. Other specialists prefer the migration-based approach because it suits deployment challenges better. Besides, there is the question of whether you have to write the upgrade scripts manually, which is a complicated and time-consuming task. The state-based approach allows you to use computer-generated upgrade scripts 95% of the time, while the migration-based approach requires you to give custom directions in most cases. Also, the state-based delivery makes it much easier for the team to work together on complicated databases with various sophisticated dependencies. Have a look at other essential differences between these two approaches in the comparison table below:

| State-based | Migration-based |
| --- | --- |
| Better suited for large teams working on complex databases with many sophisticated dependencies. | Better suited for smaller teams working on simpler data stores with easy data migrations and table refactoring. |
| Mainly applicable to databases in development before the product is delivered to end users. | Mostly applicable to databases already available to end users and aiming to keep up with business challenges. |
| Stores the database schema with all objects in a digital repository. | Stores a selection of migration scripts in the digital repository. |
| Ensures flexibility in altering the database schema without tracking numerous scripts and their order. | Requires tracking a long list of migration scripts and running them in a strict order. |
| Applies the final version of every object to the database directly and immediately. | Applies changes to a database object one by one, taking longer to deploy these changes. |
| Generates synchronization scripts automatically and limits the developer's ability to modify them significantly. | Allows developers to control and modify migration scripts for every case. |
| Requires multiple steps to perform complex refactoring due to the usage of predefined, automatically generated syncing scripts. | Enables quick complex refactoring due to the ability to review and edit migration scripts before running them. |

In practice, database developers need to use both state-based and migration-based approaches. During the early stages of a project, the state-based approach offers more agility for the evolving system. However, after the product launch, the migration-based method can be more efficient, providing greater control over changes and increased flexibility in workflow. Updating the database with Devart dbForge tools As previously mentioned, databases need to be continually upgraded to keep up with ever-changing business requirements. In the past, developers could dedicate significant time and effort to writing and maintaining scripts, but this is no longer a feasible approach.
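The migration-based workflow described earlier depends on executing numbered scripts in strict order and skipping those already applied. Below is a minimal sketch of a runner's selection step; the `V001__name.sql` file-naming convention, the `pending_migrations` helper name, and the applied-versions set are illustrative assumptions, not something prescribed by the article or by any particular tool.

```python
import re
from pathlib import Path

def pending_migrations(script_dir, applied):
    """Return migration scripts not yet applied, in ascending version order.

    Assumes (hypothetically) file names like 'V001__create_company.sql';
    'applied' is the set of integer versions already recorded in a
    tracking table.
    """
    scripts = []
    for path in Path(script_dir).glob("V*__*.sql"):
        match = re.match(r"V(\d+)__", path.name)
        if match:
            version = int(match.group(1))
            if version not in applied:
                scripts.append((version, path))
    # Strict ascending order is what makes migrations reproducible
    # across environments.
    return [path for version, path in sorted(scripts)]
```

A real runner would then execute each returned script against the target database and record its version in the tracking table, so that a re-run is a no-op.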
As the number of releases increases, these processes require more automation, which can be achieved with specialized tools. Devart offers a set of dedicated solutions designed to perform all database-related tasks, including database change management. [Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/) – a popular add-in for SSMS – is a valuable component of DevOps automation, providing database version control functionality to SQL Server developers. This tool allows users to easily track and compare changes, synchronize database versions, roll changes back if needed, and benefit from many other options. dbForge Source Control for SQL Server works in the state-based mode. If your project requires migration-based delivery, you can turn to another tool – [Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/), which allows you to compare and synchronize database schemas between different SQL Server databases and scripts. This tool can generate the upgrade scripts, so you won't need to write the migration scripts manually. Whichever database delivery model you choose, the good news is that you can save time and energy by automating routines – and here dbForge SQL Tools come in handy, providing you with all the necessary functionality. Conclusion So, we have reviewed the two database deployment models – state-based and migration-based. We have explored their scopes and features and identified the pros and cons of both methods. However, it is hardly possible to apply only a state-based or only a migration-based approach. Both are necessary, and even a single project may require them both. Besides, with professional tools, you can achieve your goals faster and more easily, letting the software handle the database changes over time.
As for the [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) that we mentioned, you can try them free of charge: a fully functional 30-day trial lets you evaluate the full power of every tool. [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms."}
{"url": "https://blog.devart.com/database-versioning-with-examples.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Database Versioning with Examples By [dbForge Team](https://blog.devart.com/author/dbforge) April 8, 2021 [0](https://blog.devart.com/database-versioning-with-examples.html#respond) 7933 To properly manage the database development process, stay informed of the changes delivered by other developers, and avoid conflicts or missing data in the database, one needs database versioning. This allows you to reconstruct any version of the database, view the history of changes, and ensure smooth database delivery. Contents What is database versioning Why is database versioning important Database versioning components Types of database version control Database data versioning Tools for database versioning Conclusion What is database versioning To begin with, let us define database versioning. It is the practice of managing changes to the structure and content of a database in a systematic and organized manner. It involves tracking and recording modifications, updates, and additions to the schema and data over time. This process helps developers and database administrators understand the evolution of the database and ensure consistency across different environments (such as development, testing, and production). Database versioning systems typically include features like version history, branching, merging, and rollback capabilities. Why is database versioning important Database versioning is a cornerstone of modern database management. 
It brings discipline to the process of making changes, reduces risks associated with updates, and supports an environment where multiple stakeholders can work together effectively and with confidence: Minimizing Downtime: Keeping a clear record of versioned changes allows for precise planning and execution of updates. This ensures that downtime is kept to a minimum, as teams can schedule updates during off-peak hours and have confidence that the deployment process will be swift and reliable. Improving Developer Productivity: Developers can work concurrently on different parts of a database, and the versioning system helps them integrate their changes seamlessly. It reduces conflicts, as the version control system handles merging changes intelligently. This allows developers to focus on their tasks without worrying about accidentally overwriting each other’s work. Ensuring Reproducibility: Versioning is crucial for debugging, testing, and auditing purposes. If an issue arises after a deployment, developers can roll back to a previous version to analyze and fix the problem. This reproducibility ensures that the database environment can be reliably reconstructed, which is particularly important in regulated industries or mission-critical applications. Collaboration and Teamwork: In collaborative environments, where multiple developers or teams work on the same database, versioning facilitates smooth teamwork. It provides a centralized repository where everyone can contribute their changes. Additionally, it offers features like branching, which allows for parallel development efforts. This means that teams can work on different features or fixes simultaneously and then merge their changes back together seamlessly. Database versioning components Unlike files and projects, a database has four main components: Database schema: Defines the structure of the database, including tables, columns, relationships, indexes, and constraints. 
It outlines how data is organized and stored within the database. Versioning the schema involves tracking changes to these structural elements, ensuring consistency, and allowing for controlled modifications over time. Database object definitions: Refer to elements within the schema, such as tables, views, stored procedures, functions, and triggers. Versioning database objects requires managing changes to the code or scripts that define these objects. Database data: Encompasses the information stored in the tables, such as records, rows, and entries. Versioning data allows you to manage changes to this content and track inserts, updates, and deletions. User roles and privileges: Dictate who has access to the database and what actions they can perform. Versioning user roles and privileges involves managing changes to permissions, granting or revoking access, and defining the level of authority users have within the database. Database DDL scripts include individual DDL scripts for database schemas and database objects. They contain: Table and view definitions Primary key definitions Foreign key definitions Stored procedure and function definitions Trigger definitions Sequence definitions Index definitions Constraint definitions Types of database version control There are three primary types of data for versioning: Database DDL versioning (DDL scripts for database schemas and other database objects) Database data versioning The versioning of user roles and privileges When it comes to implementation, the third point is very similar to the first. Thus, let's examine the first point in depth. DDL versioning We can implement DDL versioning by applying one of two main approaches: Redefine all objects within the database, except for tables (tables are processed separately). Update only the changed objects. Redefining objects The first approach is simple but inconvenient for modifying tables.
You need to redefine tables in a separate iteration, and that affects the deployment duration. Still, this approach is used by the [SQL Server Database Project](https://docs.microsoft.com/en-us/sql/ssdt/how-to-create-a-new-database-project?view=sql-server-ver15) type in [Visual Studio](https://visualstudio.microsoft.com/): Img. 1. A database project Updating changed objects The second approach is more complicated but more effective. It allows you to update the changed objects selectively and does not lead to fake redefinitions. Therefore, it significantly reduces the deployment duration. There are a lot of ready-made solutions that implement the first approach. As the second approach reduces the deployment time considerably, let's review it in more detail. There are three ways to record and check versions: With the help of the extended properties of the database and its objects With the help of a dedicated table in the database With the help of checksums of the database and object definitions The third way is not that common, so we won't examine it here. The first two ways are more common – for instance, the second is implemented in the [.NET Entity Framework](https://docs.microsoft.com/en-us/ef/ef6/what-is-new/past-releases). The essence of the second way is that the migration versions are saved in a dedicated database table. Now, let's turn to the first way. We'll review the process of updating the database definition by updating the definitions of only those objects that have changed, as well as checking and recording their versions via extended properties. As a rule, we define the build version number first. Common practice suggests the following format: YYYY_NN for a release, where NN is the release number in year YYYY YYYY_NN_HF_N for a hotfix, where N is the hotfix number for the YYYY_NN release This numbering format is appropriate for releases that take place no more often than once a week, with no more than nine hotfixes per release.
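The YYYY_NN / YYYY_NN_HF_N convention described above can be turned into a sortable key so that tooling can compare build numbers reliably; a plain release then sorts before its own hotfixes. The helper below is an illustrative sketch (the function name is not part of the article's scripts).

```python
import re

def parse_db_version(tag):
    """Parse a 'YYYY_NN' or 'YYYY_NN_HF_N' tag into a sortable tuple.

    The tag format comes from the article; this parser is a
    hypothetical helper. A release without a hotfix gets hotfix
    number 0, so it compares lower than any of its hotfixes.
    """
    m = re.fullmatch(r"(\d{4})_(\d{2})(?:_HF_(\d+))?", tag)
    if m is None:
        raise ValueError("not a valid version tag: " + tag)
    year, release, hotfix = m.groups()
    return (int(year), int(release), int(hotfix or 0))
```

For example, `parse_db_version("2020_01_HF_01")` sorts after `2020_01` but before `2020_02`, matching the intended release order.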
Then, we can tag the database with a version number in the following way:

declare @DB_Version nvarchar(128)=N'XXXX_NN'; --specify the release version here, for example, 2020_02 for a release or 2020_01_HF_01 for a hotfix

--the variable for the previous release value
declare @DB_BackVersion nvarchar(128);

SELECT TOP(1)
       @DB_BackVersion=cast([value] as nvarchar(128))
FROM sys.extended_properties
WHERE [class]=0
  AND [name]='DB_Version';

IF(@DB_BackVersion is not null)
BEGIN
    EXEC sys.sp_updateextendedproperty @name=N'DB_Version', @value=@DB_Version;
END
ELSE
BEGIN
    EXEC sys.sp_addextendedproperty @name=N'DB_Version', @value=@DB_Version;
END

IF(EXISTS(
       SELECT TOP(1) 1
       FROM sys.extended_properties
       WHERE [class]=0
         AND [name]='DB_BackVersion'
   ))
BEGIN
    EXEC sys.sp_updateextendedproperty @name=N'DB_BackVersion', @value=@DB_BackVersion;
END
ELSE
BEGIN
    EXEC sys.sp_addextendedproperty @name=N'DB_BackVersion', @value=@DB_BackVersion;
END

Note: Before updating the DB_Version tag, the script saves the previous version's tag in DB_BackVersion. You can tag any database object in a similar way. To roll back the database version, you can use the following script:

--a variable that will contain the previous release number
declare @DB_BackVersion nvarchar(128);

SELECT TOP(1)
       @DB_BackVersion=cast([value] as nvarchar(128))
FROM sys.extended_properties
WHERE [class]=0
  AND [name]='DB_BackVersion';

--if the previous version is not set, we initialize it as 2020_00, i.e. no versioning
if(@DB_BackVersion is null) set @DB_BackVersion=N'2020_00';

IF(EXISTS(
       SELECT TOP(1) 1
       FROM sys.extended_properties
       WHERE [class]=0
         AND [name]='DB_Version'
   ))
BEGIN
    EXEC sys.sp_updateextendedproperty @name=N'DB_Version', @value=@DB_BackVersion;
END
ELSE
BEGIN
    EXEC sys.sp_addextendedproperty @name=N'DB_Version', @value=@DB_BackVersion;
END

Here, the DB_Version tag gets the DB_BackVersion tag value assigned. Similarly, we can roll back the version tag for any database object. We could also build a more sophisticated versioning system for rolling tags forward and back: for instance, one that remembers not only the previous version but several (or even all) tag values. However, it is usually enough to store the current and previous versions of the database objects and the database itself. Once the version tags are in place, we can check a database object's definition against the current version value. Based on the value received, the system can either update the object definition or inform the user that the update is impossible. For example, suppose you've created a stored procedure for the 2020_03 build. You can roll this stored procedure definition forward only if its version in the destination database is lower than 2020_03. If you roll the 2020_03 procedure definition forward onto the same or a newer version, you will overwrite the newer definition; that would not be a roll forward but a rollback. The same applies to rolling back database and object definitions: you can roll back the 2020_03 version only if the object is at exactly the same version as the database itself. Otherwise, you may roll back the wrong version, since the object and the database would be at different versions than the one intended for the rollback. Summing up, we've reviewed the database versioning basics.
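The roll-forward rule just described (an object definition may be applied only when the destination's version is strictly lower than the script's version) can be sketched as a small guard. The helper name and the inline tag parsing are illustrative assumptions, not part of the article's T-SQL scripts.

```python
def can_roll_forward(destination_tag, script_tag):
    """Decide whether a script built for 'script_tag' may be applied.

    Roll forward only when the destination object's version is
    strictly lower than the script's version; otherwise the
    deployment tool should refuse and inform the user. Tags use
    the YYYY_NN[_HF_N] format; this comparison is a sketch.
    """
    def key(tag):
        parts = tag.split("_")  # e.g. ['2020', '03'] or ['2020', '03', 'HF', '01']
        year, release = int(parts[0]), int(parts[1])
        hotfix = int(parts[3]) if len(parts) == 4 else 0
        return (year, release, hotfix)

    return key(destination_tag) < key(script_tag)
```

With this guard, applying a 2020_03 script to a 2020_02 database is allowed, while applying it to a 2020_03 or 2020_04 database is refused, exactly as the rule above demands.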
Database data versioning But how can we version database data? Here, we need to consider the following checks: Check for the existence of the entry. Check for the required field(s) value. In general, to add or alter a row, we need to write the following pseudocode:

;MERGE <schema>.<table> AS trg
USING #tbl_source AS src ON trg.[PK_1]=src.[PK_1] AND ... AND trg.[PK_N]=src.[PK_N]
WHEN NOT MATCHED THEN
    INSERT ([PK_1], ..., [PK_N], [Field_1], ..., [Field_M])
    VALUES (src.[PK_1], ..., src.[PK_N], src.[Field_1], ..., src.[Field_M])
WHEN MATCHED AND (
       ((trg.[Field_1]<>src.[Field_1] AND trg.[Field_1] IS NOT NULL AND src.[Field_1] IS NOT NULL)
        OR (trg.[Field_1] IS NULL AND src.[Field_1] IS NOT NULL)
        OR (trg.[Field_1] IS NOT NULL AND src.[Field_1] IS NULL))
       OR
       ...
       OR
       ((trg.[Field_M]<>src.[Field_M] AND trg.[Field_M] IS NOT NULL AND src.[Field_M] IS NOT NULL)
        OR (trg.[Field_M] IS NULL AND src.[Field_M] IS NOT NULL)
        OR (trg.[Field_M] IS NOT NULL AND src.[Field_M] IS NULL))
     )
    THEN UPDATE SET
        trg.[Field_1]=src.[Field_1],
        ...
        trg.[Field_M]=src.[Field_M];

Here, instead of the #tbl_source temporary table, we can use any source of data whose output columns match the destination table. For instance, we can use specific values for one or several rows, as follows:

;MERGE <schema>.<table> AS trg
USING (
        SELECT <value> AS [PK_1], ..., <value> AS [PK_N], <value> AS [Field_1], ..., <value> AS [Field_M] UNION ALL
        ...
        SELECT <value> AS [PK_1], ..., <value> AS [PK_N], <value> AS [Field_1], ..., <value> AS [Field_M]
      ) AS src ON trg.[PK_1]=src.[PK_1] AND ... AND trg.[PK_N]=src.[PK_N]
WHEN NOT MATCHED THEN
    INSERT ([PK_1], ..., [PK_N], [Field_1], ..., [Field_M])
    VALUES (src.[PK_1], ..., src.[PK_N], src.[Field_1], ..., src.[Field_M])
WHEN MATCHED AND (
       ((trg.[Field_1]<>src.[Field_1] AND trg.[Field_1] IS NOT NULL AND src.[Field_1] IS NOT NULL)
        OR (trg.[Field_1] IS NULL AND src.[Field_1] IS NOT NULL)
        OR (trg.[Field_1] IS NOT NULL AND src.[Field_1] IS NULL))
       OR
       ...
       OR
       ((trg.[Field_M]<>src.[Field_M] AND trg.[Field_M] IS NOT NULL AND src.[Field_M] IS NOT NULL)
        OR (trg.[Field_M] IS NULL AND src.[Field_M] IS NOT NULL)
        OR (trg.[Field_M] IS NOT NULL AND src.[Field_M] IS NULL))
     )
    THEN UPDATE SET
        trg.[Field_1]=src.[Field_1],
        ...
        trg.[Field_M]=src.[Field_M];

Let's take the example of the [dbo].[Company] table definition:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Company](
    [CompanyID] [int] IDENTITY(1,1) NOT NULL,
    [CompanyName] [nvarchar](255) NOT NULL,
    [Description] [nvarchar](255) NOT NULL,
    [IsDeleted] [bit] NOT NULL,
 CONSTRAINT [PK_Company_CompanyID] PRIMARY KEY CLUSTERED
(
    [CompanyID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[Company] ADD DEFAULT ((0)) FOR [IsDeleted]
GO

To add and change the data for the two companies, we can write the following code fragment:

SET IDENTITY_INSERT [dbo].[Company] ON;

;MERGE [dbo].[Company] AS trg
USING (
        SELECT 1001 AS [CompanyID], 'Microsoft' AS [CompanyName], 'IT company' AS [Description], 0 AS [IsDeleted] UNION ALL
        SELECT 10 AS [CompanyID], 'DSS' AS [CompanyName], 'IT company' AS [Description], 1 AS [IsDeleted]
      ) AS src ON trg.[CompanyID]=src.[CompanyID]
WHEN NOT MATCHED THEN
    INSERT ([CompanyID], [CompanyName], [Description], [IsDeleted])
    VALUES (src.[CompanyID], src.[CompanyName], src.[Description], src.[IsDeleted])
WHEN MATCHED AND (
       ((trg.[CompanyName]<>src.[CompanyName] AND trg.[CompanyName] IS NOT NULL AND src.[CompanyName] IS NOT NULL)
        OR (trg.[CompanyName] IS NULL AND src.[CompanyName] IS NOT NULL)
        OR (trg.[CompanyName] IS NOT NULL AND src.[CompanyName] IS NULL))
       OR
       ((trg.[Description]<>src.[Description] AND trg.[Description] IS NOT NULL AND src.[Description] IS NOT NULL)
        OR (trg.[Description] IS NULL AND src.[Description] IS NOT NULL)
        OR (trg.[Description] IS NOT NULL AND src.[Description] IS NULL))
       OR
       ((trg.[IsDeleted]<>src.[IsDeleted] AND trg.[IsDeleted] IS NOT NULL AND src.[IsDeleted] IS NOT NULL)
        OR (trg.[IsDeleted] IS NULL AND src.[IsDeleted] IS NOT NULL)
        OR (trg.[IsDeleted] IS NOT NULL AND src.[IsDeleted] IS NULL))
     )
    THEN UPDATE SET
        trg.[CompanyName]=src.[CompanyName],
        trg.[Description]=src.[Description],
        trg.[IsDeleted]=src.[IsDeleted];

SET IDENTITY_INSERT [dbo].[Company] OFF;

Note: As CompanyID is an IDENTITY column, you need to enable explicit inserts into this column before inserting or merging, and disable the option again afterward with the [SET IDENTITY_INSERT](https://docs.microsoft.com/en-us/sql/t-sql/statements/set-identity-insert-transact-sql?view=sql-server-ver15) statement. A simple SELECT query proves that our table now contains the values we inserted with the help of the [MERGE](https://docs.microsoft.com/en-us/sql/t-sql/statements/merge-transact-sql?view=sql-server-ver15) statement: Img. 2. The data added and altered in the Company table You can perform deletion in the same way:

;MERGE <schema>.<table> AS trg
USING #tbl_source AS src ON trg.[PK_1]=src.[PK_1] AND ... AND trg.[PK_N]=src.[PK_N]
WHEN MATCHED THEN DELETE;

where, instead of the #tbl_source temporary table, any source of data whose output columns match the destination table can be used. For instance, we can use specific values for one or several rows, as follows:

;MERGE <schema>.<table> AS trg
USING (
        SELECT <value> AS [PK_1], ..., <value> AS [PK_N], <value> AS [Field_1], ..., <value> AS [Field_M] UNION ALL
        ...
        SELECT <value> AS [PK_1], ..., <value> AS [PK_N], <value> AS [Field_1], ..., <value> AS [Field_M]
      ) AS src ON trg.[PK_1]=src.[PK_1] AND ... AND trg.[PK_N]=src.[PK_N]
WHEN MATCHED THEN DELETE;

For example, to delete two companies from our table, we need to write the following code fragment:

;MERGE [dbo].[Company] AS trg
USING (
        SELECT 1001 AS [CompanyID], 'Microsoft' AS [CompanyName], 'IT company' AS [Description], 0 AS [IsDeleted] UNION ALL
        SELECT 1002 AS [CompanyID], 'NPP' AS [CompanyName], 'IT company' AS [Description], 1 AS [IsDeleted]
      ) AS src ON trg.[CompanyID]=src.[CompanyID]
WHEN MATCHED THEN DELETE;

To make sure that our efforts were successful and the entries were deleted, we can execute a simple SELECT query: Img. 3. The result of data deletion In all the situations mentioned above, we add, alter, and delete entries using the [MERGE](https://docs.microsoft.com/en-us/sql/t-sql/statements/merge-transact-sql?view=sql-server-ver15) statement. The same approach works for more complex examples. Tools for database versioning As a rule, data versioning applies to references and regulatory-referencing information.
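Null-safe change-detection predicates like the ones in the MERGE examples above are tedious and error-prone to write by hand for every column. Below is a small sketch of a generator that emits the predicate text for a list of columns; the function name is hypothetical, and it produces T-SQL as a string rather than touching a database.

```python
def null_safe_change_predicate(columns):
    """Build a WHEN MATCHED change-detection predicate for a MERGE.

    For each column, a row counts as changed when the values differ,
    or when exactly one side is NULL. Illustrative only: emits T-SQL
    text for the given column names.
    """
    parts = []
    for col in columns:
        parts.append(
            "((trg.[{c}]<>src.[{c}] AND trg.[{c}] IS NOT NULL AND src.[{c}] IS NOT NULL)"
            " OR (trg.[{c}] IS NULL AND src.[{c}] IS NOT NULL)"
            " OR (trg.[{c}] IS NOT NULL AND src.[{c}] IS NULL))".format(c=col)
        )
    # Any single changed column makes the whole predicate true.
    return "(" + "\n OR ".join(parts) + ")"
```

Generating the predicate from the column list keeps the trg/src pairing consistent for every column, which is exactly the part that is easiest to get wrong when the condition is copied and edited by hand.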
The code together with the data itself is version-controlled using three primary version-control workflows: [Git Flow](https://blog.devart.com/version-control-system-version-control-with-git.html#git-flow) [GitHub Flow](https://blog.devart.com/version-control-system-version-control-with-git.html#github) [GitLab Flow](https://blog.devart.com/version-control-system-version-control-with-git.html#gitlab) However, there are many other options that offer a range of versioning capabilities tailored to different database environments: Database migration tools Liquibase: A versatile tool supporting various configuration languages for database schemas and alterations. It facilitates cross-platform migrations. Redgate Deploy: Designed for Microsoft SQL Server, with strong integration into the Microsoft developer ecosystem, including Visual Studio. PlanetScale: Offers advanced database migration features, including schema branch/merge functionality, with a modern deployment environment. Specializes in MySQL. Version-controlled data lakes LakeFS: Introduces data lake versioning, enabling commits, branches, and rollbacks for unstructured or semi-structured data stored in cloud storage systems like S3 and GCS. Version-controlled databases TerminusDB: Provides full schema and data versioning, along with a graph database interface and a custom query language called Web Object Query Language (WOQL). Offers optional schema usage and JSON querying capabilities. Version control SSMS add-in dbForge Source Control for SQL Server: Helps version-control database schemas and data, view and resolve conflicts, roll back changes, and maintain the overall integrity of databases. Perfect for SQL Server Management Studio users. Conclusion To sum up, we have reviewed the main approaches to database versioning and demonstrated how to version-control database objects, including the database schema and database data.
If you seek a reliable solution that supports all stages of the database lifecycle and ensures consistent database builds and releases, [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) is the tool to help you accomplish that. This handy SSMS add-in was designed to help you version-control your SQL Server databases. With it, you can commit and revert changes, as well as view conflicts and resolve any inconsistencies that occur. It also lets you automate your database development following the DevOps approach.

Tags [dbForge Source Control](https://blog.devart.com/tag/dbforge-source-control) [source control](https://blog.devart.com/tag/source-control) [SQL Server](https://blog.devart.com/tag/sql-server) [version control](https://blog.devart.com/tag/version-control) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbeaver-vs-heidisql-vs-dbforge-studio-comparing-open-source-and-paid-sql-server-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Compare Open-Source and Paid SQL Server Tools By [dbForge Team](https://blog.devart.com/author/dbforge) December 23, 2022

If a company has to choose between a free database tool and a paid one, and both deliver the same features and performance, it is natural that the paid one doesn't stand a chance. But what if the free tool lacks functionality the workflow needs? What if it has limitations that keep employees from becoming more productive? And if it turns out that their work requires seven separate, perhaps barely compatible free tools, wouldn't it be more effective to pay for just one tool that has it all in place? The fact that our customers tend to continue working with the [dbForge database tools](https://www.devart.com/dbforge/) after the trial period says a lot about the true value of productivity and convenience. Still, free and open-source database tools will always be popular, since they cover most of the average needs and requirements of a database specialist free of charge. This very idea inspired us to compare top open-source and commercial solutions for SQL Server development and administration. We wanted to compare their functional capabilities as well as less obvious criteria that might be hidden from the user's eye.
And we took three popular examples for this purpose:

- DBeaver is a free database tool that supports an extensive number of databases. While its closed-source enterprise edition involves a commercial license, DBeaver is mostly known as an open-source solution with a clean user interface and a fine set of tools to handle your routine database work.
- HeidiSQL is another well-known free and open-source database management and administration tool, a DBeaver alternative that aims to be easy to learn. It supports a limited number of database systems (SQL Server, MySQL, PostgreSQL, and SQLite), yet delivers enough goodies to gather a large following.
- dbForge Studio for SQL Server is an IDE that covers nearly all tasks related to the development, management, and administration of SQL Server databases, as well as data analysis and reporting. It delivers a number of features that can't be found in free tools, e.g. comparison and synchronization of data and schemas, and generation of realistic test data and documentation. It should also be noted that, besides SQL Server, there are Studios available for MySQL, PostgreSQL, and Oracle. All of them are paid solutions that are nonetheless available free of charge for a 30-day trial period.

You might already be using one of these tools, but how well does it fare in comparison with others? And would switching to a commercial solution make your daily work easier and more productive? To answer these questions, we drew up a list of criteria that we believe matter most to database specialists. Let's see how each tool handles them.

Contents

- Functionality
- Stability & Security
- Support & Troubleshooting
- Cost
- Conclusion

Functionality

Open-source tools, such as DBeaver and HeidiSQL, typically deliver the essentials for daily work and get gradually expanded with new features. The trick with commercial solutions is their focus.
In our case, dbForge Studio for SQL Server delivers an all-encompassing set of tools for SQL Server and easy integration with enterprise-grade systems and corporate DevOps processes. A well-polished user experience for effective teamwork is also included. Let's take a closer look.

| Functionality | DBeaver | HeidiSQL | dbForge Studio for SQL Server |
|---|---|---|---|
| Visual Database Designer | Yes | No | Yes |
| Code Autocompletion | Yes | Yes | Yes |
| SQL Scripts Management | Yes | No | Yes |
| SQL Formatting | Yes | Yes | Yes |
| SQL Snippets | No | Yes | Yes |
| Built-in T-SQL Analyzer | No | No | Yes (via [T-SQL Code Analyzer](https://www.devart.com/dbforge/sql/studio/sql-analyzer.html)) |
| Integrated Source Control | No | No | Yes |
| Visual Query Builder | Yes | Yes | Yes |
| Query Optimization | Yes | No | Yes (via [Query Profiler](https://www.devart.com/dbforge/sql/studio/sql-query-profiler.html)) |
| Built-in Data Viewer | Yes | Yes | Yes |
| Index Management | No | No | Yes |
| Master-Detail Browser | No | No | Yes |
| Search for Invalid Objects | No | No | Yes |
| Built-in T-SQL Debugger | No | No | Yes |
| Server Security Management | No | No | Yes |
| Server Diagnostics | No | No | Yes (via Profile Server Events) |
| Online SQL Server Monitoring | No | No | Yes |
| Data Import & Export | Yes | Yes | Yes |
| Export to CSV | Yes | Yes | Yes |
| Backup & Restore | Yes | Yes | Yes |
| Automated Unit Tests | No | No | Yes |
| Test Data Generation | Yes | No | Yes (via [Data Generator](https://www.devart.com/dbforge/sql/studio/sql-server-data-generator.html)) |
| Data Comparison & Synchronization | Yes | No | Yes |
| Schema Comparison & Synchronization | Yes | No | Yes |
| Generation of Database Documentation | No | No | Yes |
| Integration with the DevOps Process | No | No | Yes |

Now you can compare these criteria with your daily work (or the daily tasks of your DB team) to see whether the expanded functionality of paid solutions is worth taking note of.

Stability & Security

Now let's delve deeper into the non-functional value of database tools, the usual adjectives that come with most of them, like "stable", "secure", "reliable" and such. For a commercial product, these qualities equal the reputation and profits of the vendor.
There is no place for neglecting the competition or the customer. Paid software is continuously kept up to date and expanded to remain viable on the market. By contrast, the development of open-source tools is rather sporadic and less focused. Both DBeaver and HeidiSQL boast large user communities, which makes it easier to find problems and vulnerabilities in the source code and eliminate them. Still, with all the openness, you can't be fully protected from malicious code that might come with another update. Proprietary software, developed in a closed environment, focuses on regular updates and fast responses to any customer emergencies. Thus, we consider it a safer option, especially for enterprises. Speaking of enterprises, we should mention the security of business-critical and sensitive personal data. For instance, you need certain volumes of data for testing. Naturally, you cannot use real data for that purpose, but the problem has to be solved somehow. The solution comes with dbForge Studio. It includes a tool that generates varied test data according to your settings. The data combines the best qualities one would want: it is fake, yet realistic, and maintains the compliance of your databases with a range of data protection acts and regulations, e.g. GDPR.

Support & Troubleshooting

Most of what has been said in the previous paragraph applies here as well. Your vendor is always there to answer your questions and help you with the issues that may arise. And if you represent a company, you can get more diverse support options (for instance, special service packages for enterprises) and mutually beneficial long-term business relations with your vendor. When it comes to free software, your most likely source of help is the user community, where you can ask your questions, look for solutions, and watch countless video guides and tutorials that will make your work easier.
Professional support bears additional costs and is usually provided by a third-party consultant (more on that later). If you switch to commercial software, your search narrows down to brief official guides and comprehensive product documentation. In other words, along with the commercial software, you get fast, certified, on-demand answers.

Cost

First the value, then the cost, right? It is obvious that free and open-source products owe much of their popularity to the fact that you don't have to pay for them (at least, initially). Much like individuals, companies want to cut costs wherever possible and opt to use open-source solutions in their operations. Thus, when a free product of high quality enters the market, it can quickly gather a large community. And if so many people use that product, it simply has to be good (and in many cases, including DBeaver and HeidiSQL, it really is). However, we should not forget a number of hidden costs that may become evident over time. Mostly, these costs are related to corporate use and can hinder the growth of the employees' daily performance. Let's name a few.

- Open-source software might take much time and effort to integrate with existing corporate systems and environments, as well as to customize properly.
- Complex products may require assistance with installation and training. When it comes to open-source products, communities are the most obvious source of help; still, there may be pitfalls that turn out to be more time-consuming than expected and require the involvement of the above-mentioned third-party consultant, with an additional price tag.
- Maintenance is yet another source of hidden costs; the company will need to allocate additional resources to handle it.

The use of commercial software eliminates most of these problems so quickly and effectively that the explanation will fit in a single sentence.
Vendors usually take care of product support and maintenance, and if you are purchasing a solution for multiple employees, they can assist you with integration and, if required, provide all the necessary materials for fast training. Sure, few of us are ready to pay upfront for a product we don't know yet. But that is why the trial period was invented. In the case of dbForge Studio for SQL Server, every user gets a fully functional product for 30 days to evaluate its capabilities and performance. You can check whether it's convenient for you and your teammates. You can see how it affects your performance. And you can see whether all the added value that comes with it is worth the investment.

Conclusion

It is only up to you to evaluate each criterion according to your own workflow and preferences. All in all, as far as advanced functionality, reliable vendor support, stability, and security are concerned, commercial software pays off. Finally, we invite you to [try dbForge Studio for SQL Server free of charge](https://www.devart.com/dbforge/sql/studio/download.html). After the trial, you will be able to re-activate it with a purchased activation key. It is also worth noting that we have special offers for enterprises, with discounts for multiple purchased licenses. If your work involves handling multiple databases on different database management systems, Devart has a new solution for you: [dbForge Edge](https://www.devart.com/dbforge/edge/), the most powerful IDE of all Devart products. Edge covers SQL Server, MySQL and MariaDB, Oracle, and PostgreSQL databases, allowing its users to perform all development, management, and administration tasks with a single solution. The fully functional [Free Trial of dbForge Edge](https://www.devart.com/dbforge/edge/download.html) is available: get it for 30 days and test this advanced software toolset under your workload.
Tags [Alternatives](https://blog.devart.com/tag/alternatives) [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbexpress-drivers-for-macos-and-linux-64bit.html",
"product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in dbExpress Drivers: Support for 64-bit versions of macOS and Linux By [DAC Team](https://blog.devart.com/author/dac) August 13, 2019 [0](https://blog.devart.com/dbexpress-drivers-for-macos-and-linux-64bit.html#respond) 4201 dbExpress driver development team has released updated versions of data access drivers with support for macOS 64-bit and Linux 64-bit. This release addresses potential challenges that  developers of 32-bit database applications may face when Apple ends support for 32-bit applications in macOS 10.15. If your app users are planning to upgrade their Mac operating system to the newest version in the fall of 2019, the new dbExpress drivers could be of particular benefit to you. Another new feature is support for Linux 64-bit in our drivers: you can now compile your Delphi apps powered by dbExpress data access technology using the Embarcadero compiler for the Linux platform. Devart dbExpress Driver is a lightweight, cross-platform interface for accessing data from SQL database servers such as Interbase, Oracle, or MySQL in Delphi and C++ applications. The key benefits of Devart dbExpress technology are direct access to database servers using native libraries, data type mapping between database and Delphi data types, and cross-platformity. You are welcome to visit dbExpress driver download page to evaluate the new version for free. Please note that Release 2 is now required to work with Devart dbExpress drivers in RAD Studio 10.3, Delphi 10.3 Rio, and C++ Builder 10.3 Rio. 
- [dbExpress Driver for Oracle 7.0](https://www.devart.com/dbx/oracle/) [ [Download](https://www.devart.com/dbx/oracle/download.html) ] [ [Revision History](https://www.devart.com/dbx/oracle/revision_history.html) ]
- [dbExpress Driver for SQL Server 8.0](https://www.devart.com/dbx/sqlserver/) [ [Download](https://www.devart.com/dbx/sqlserver/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlserver/revision_history.html) ]
- [dbExpress Driver for MySQL 7.0](https://www.devart.com/dbx/mysql/) [ [Download](https://www.devart.com/dbx/mysql/download.html) ] [ [Revision History](https://www.devart.com/dbx/mysql/revision_history.html) ]
- [dbExpress Driver for InterBase and Firebird 5.0](https://www.devart.com/dbx/interbase/) [ [Download](https://www.devart.com/dbx/interbase/download.html) ] [ [Revision History](https://www.devart.com/dbx/interbase/revision_history.html) ]
- [dbExpress Driver for PostgreSQL 4.0](https://www.devart.com/dbx/postgresql/) [ [Download](https://www.devart.com/dbx/postgresql/download.html) ] [ [Revision History](https://www.devart.com/dbx/postgresql/revision_history.html) ]
- [dbExpress Driver for SQLite 4.0](https://www.devart.com/dbx/sqlite/) [ [Download](https://www.devart.com/dbx/sqlite/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlite/revision_history.html) ]

Any feedback would be greatly appreciated: you can click the [forums](https://forums.devart.com/viewforum.php?f=43) link to join the conversation.
Tags [dbexpress](https://blog.devart.com/tag/dbexpress) [delphi](https://blog.devart.com/tag/delphi) [what's new dbexpress](https://blog.devart.com/tag/whats-new-dbexpress) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/dbexpress-drivers-for-rad-studio-alexandria.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi
DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) dbExpress Drivers for RAD Studio 11 Alexandria with support for Firebird 4 and Apple M1 By [DAC Team](https://blog.devart.com/author/dac) September 16, 2021

Following the recent release of RAD Studio 11 Alexandria, we are pleased to announce the availability of Devart dbExpress drivers for the new RAD Studio with support for Firebird 4 and the ARM platform (Apple M1). Our drivers fully support all the new features introduced in RAD Studio Alexandria. Devart is the only vendor that continuously updates its products for the dbExpress technology and adds support for new versions of databases once they are released. We are glad to announce support for Firebird 4, which was released on June 1, 2021. Firebird 4 introduces new data types and many improvements, such as logical replication, longer metadata identifiers, timeouts for connections and statements, and more. The IntegerAsLargeInt connection option, which maps SQLite INTEGER columns to fields of type ftLargeInt, was added to the dbExpress driver for SQLite.
You may download the new versions using the following links:

- [dbExpress Driver 8.0 for Oracle](https://www.devart.com/dbx/oracle/) [ [Download](https://www.devart.com/dbx/oracle/download.html) ] [ [Revision History](https://www.devart.com/dbx/oracle/revision_history.html) ]
- [dbExpress Driver 9.0 for SQL Server](https://www.devart.com/dbx/sqlserver/) [ [Download](https://www.devart.com/dbx/sqlserver/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlserver/revision_history.html) ]
- [dbExpress Driver 6.0 for InterBase and Firebird](https://www.devart.com/dbx/interbase/) [ [Download](https://www.devart.com/dbx/interbase/download.html) ] [ [Revision History](https://www.devart.com/dbx/interbase/revision_history.html) ]
- [dbExpress Driver 8.0 for MySQL](https://www.devart.com/dbx/mysql/) [ [Download](https://www.devart.com/dbx/mysql/download.html) ] [ [Revision History](https://www.devart.com/dbx/mysql/revision_history.html) ]
- [dbExpress Driver 5.0 for PostgreSQL](https://www.devart.com/dbx/postgresql/) [ [Download](https://www.devart.com/dbx/postgresql/download.html) ] [ [Revision History](https://www.devart.com/dbx/postgresql/revision_history.html) ]
- [dbExpress Driver 5.0 for SQLite](https://www.devart.com/dbx/sqlite/) [ [Download](https://www.devart.com/dbx/sqlite/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlite/revision_history.html) ]

Tags [dbexpress](https://blog.devart.com/tag/dbexpress) [macOS](https://blog.devart.com/tag/macos) [rad studio](https://blog.devart.com/tag/rad-studio) [what's new dbexpress](https://blog.devart.com/tag/whats-new-dbexpress) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/dbexpress-drivers-new-interbase-postgresql-oracle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) dbExpress Drivers with Support for InterBase 2020, PostgreSQL 12, and Oracle 19c By [DAC Team](https://blog.devart.com/author/dac) November 29, 2019
The Devart dbExpress team has released new versions of the dbExpress drivers. New RDBMS versions are supported in the respective drivers: InterBase 2020, PostgreSQL 12, and Oracle 19c. Another big update is the removal of the limitation on the number of table columns that you can retrieve in the trial version of the dbExpress drivers on macOS and Linux. You can now test table operations in a production-like environment. TLS 1.2 is now supported in Direct connection mode in the dbExpress driver for SQL Server; TLS 1.2 fixes some issues in TLS 1.1 and is currently the most widely used version of TLS. We also implemented the functionality for creating command parameters automatically in the driver for SQL Server. You are welcome to visit the dbExpress driver download page to evaluate the new fully functional version for free.

- [dbExpress Driver for Oracle 7.1](https://www.devart.com/dbx/oracle/) [ [Download](https://www.devart.com/dbx/oracle/download.html) ] [ [Revision History](https://www.devart.com/dbx/oracle/revision_history.html) ]
- [dbExpress Driver for SQL Server 8.1](https://www.devart.com/dbx/sqlserver/) [ [Download](https://www.devart.com/dbx/sqlserver/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlserver/revision_history.html) ]
- [dbExpress Driver for MySQL 7.1](https://www.devart.com/dbx/mysql/) [ [Download](https://www.devart.com/dbx/mysql/download.html) ] [ [Revision History](https://www.devart.com/dbx/mysql/revision_history.html) ]
- [dbExpress Driver for InterBase and Firebird 5.1](https://www.devart.com/dbx/interbase/) [ [Download](https://www.devart.com/dbx/interbase/download.html) ] [ [Revision History](https://www.devart.com/dbx/interbase/revision_history.html) ]
- [dbExpress Driver for PostgreSQL 4.1](https://www.devart.com/dbx/postgresql/) [ [Download](https://www.devart.com/dbx/postgresql/download.html) ] [ [Revision History](https://www.devart.com/dbx/postgresql/revision_history.html) ]
- [dbExpress Driver for SQLite 4.1](https://www.devart.com/dbx/sqlite/) [ [Download](https://www.devart.com/dbx/sqlite/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlite/revision_history.html) ]

Any feedback would be appreciated: please follow the [link](https://forums.devart.com/viewforum.php?f=43) to join our forum.

Tags [dbexpress](https://blog.devart.com/tag/dbexpress) [delphi](https://blog.devart.com/tag/delphi) [what's new dbexpress](https://blog.devart.com/tag/whats-new-dbexpress) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/dbexpress-drivers-support-oracle-21c-postgresql-13-and-otw-encryption.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in dbExpress Drivers: Support for Oracle 21c, PostgreSQL 13, and InterBase OTW Encryption By [DAC Team](https://blog.devart.com/author/dac) July 14, 2021

We have released new versions of the dbExpress drivers to support Oracle 21c, PostgreSQL 13, and InterBase OTW encryption. The latest versions of PostgreSQL and Oracle Database are supported in the respective dbExpress drivers. You may safely update your database server: the client applications that use our drivers in the data access layer will seamlessly work with all the new features of Oracle Database and PostgreSQL. The Over-the-Wire (OTW) encryption feature of InterBase is supported in the dbExpress driver for InterBase, allowing you to secure your data with SSL/TLS encryption during transmission. Three new options were added in the dbExpress driver for SQLite: LockingMode for configuring the locking mode, JournalMode for setting the journal mode, and Synchronous for configuring the database synchronization mode when writing to disk. You are welcome to download the new versions. Please feel free to share your feedback with us in the comments section.
[dbExpress Driver 7.3 for Oracle](https://www.devart.com/dbx/oracle/) [ [Download](https://www.devart.com/dbx/oracle/download.html) ] [ [Revision History](https://www.devart.com/dbx/oracle/revision_history.html) ] [dbExpress Driver 8.3 for SQL Server](https://www.devart.com/dbx/sqlserver/) [ [Download](https://www.devart.com/dbx/sqlserver/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlserver/revision_history.html) ] [dbExpress Driver 5.3 for InterBase and Firebird](https://www.devart.com/dbx/interbase/) [ [Download](https://www.devart.com/dbx/interbase/download.html) ] [ [Revision History](https://www.devart.com/dbx/interbase/revision_history.html) ] [dbExpress Driver 7.2 for MySQL](https://www.devart.com/dbx/mysql/) [ [Download](https://www.devart.com/dbx/mysql/download.html) ] [ [Revision History](https://www.devart.com/dbx/mysql/revision_history.html) ] [dbExpress Driver 4.3 for PostgreSQL](https://www.devart.com/dbx/postgresql/) [ [Download](https://www.devart.com/dbx/postgresql/download.html) ] [ [Revision History](https://www.devart.com/dbx/postgresql/revision_history.html) ] [dbExpress Driver 4.3 for SQLite](https://www.devart.com/dbx/sqlite/) [ [Download](https://www.devart.com/dbx/sqlite/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlite/revision_history.html) ] Tags [dbexpress](https://blog.devart.com/tag/dbexpress) [what's new dbexpress](https://blog.devart.com/tag/whats-new-dbexpress)
"} {"url": "https://blog.devart.com/dbexpress-drivers-with-support-for-rad-studio-10-4.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) dbExpress Drivers Now Support RAD Studio 10.4 By [DAC Team](https://blog.devart.com/author/dac) June 10, 2020 [0](https://blog.devart.com/dbexpress-drivers-with-support-for-rad-studio-10-4.html#respond) 2533 We are excited to announce the release of Devart dbExpress Drivers with support for the newly released RAD Studio 10.4 Sydney.
All of our dbExpress Drivers now support the latest version of RAD Studio. Additionally, the dbExpress Driver for MySQL received support for the JSON data type. You are welcome to download and try the new versions. [dbExpress Driver 7.2 for Oracle](https://www.devart.com/dbx/oracle/) [ [Download](https://www.devart.com/dbx/oracle/download.html) ] [ [Revision History](https://www.devart.com/dbx/oracle/revision_history.html) ] [dbExpress Driver 8.2 for SQL Server](https://www.devart.com/dbx/sqlserver/) [ [Download](https://www.devart.com/dbx/sqlserver/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlserver/revision_history.html) ] [dbExpress Driver 5.2 for InterBase and Firebird](https://www.devart.com/dbx/interbase/) [ [Download](https://www.devart.com/dbx/interbase/download.html) ] [ [Revision History](https://www.devart.com/dbx/interbase/revision_history.html) ] [dbExpress Driver 7.2 for MySQL](https://www.devart.com/dbx/mysql/) [ [Download](https://www.devart.com/dbx/mysql/download.html) ] [ [Revision History](https://www.devart.com/dbx/mysql/revision_history.html) ] [dbExpress Driver 4.2 for PostgreSQL](https://www.devart.com/dbx/postgresql/) [ [Download](https://www.devart.com/dbx/postgresql/download.html) ] [ [Revision History](https://www.devart.com/dbx/postgresql/revision_history.html) ] [dbExpress Driver 4.2 for SQLite](https://www.devart.com/dbx/sqlite/) [ [Download](https://www.devart.com/dbx/sqlite/download.html) ] [ [Revision History](https://www.devart.com/dbx/sqlite/revision_history.html) ] Tags [dbexpress](https://blog.devart.com/tag/dbexpress) [rad studio](https://blog.devart.com/tag/rad-studio) [what's new dbexpress](https://blog.devart.com/tag/whats-new-dbexpress)
"} {"url": "https://blog.devart.com/dbforge-compare-bundle-vs-visual-studio-diff-tools-which-one-compares-databases-better.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge Compare Bundle vs Visual Studio Diff Tools: Which One Compares Databases Better?
By [dbForge Team](https://blog.devart.com/author/dbforge) June 17, 2022 [0](https://blog.devart.com/dbforge-compare-bundle-vs-visual-studio-diff-tools-which-one-compares-databases-better.html#respond) 3170 Identifying and managing differences in database objects and actual data is an indispensable part of database management. Without a doubt, you want it done fast and easily, with maximum flexibility of settings. In this article, we are going to cover the comparison capabilities of specialized tools for SQL Server (exemplified by dbForge Compare Bundle) and compare them with the built-in diff management tools of Microsoft Visual Studio. What is dbForge Compare Bundle? [dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/) comprises two standalone solutions that provide full control over database diff handling. The first one is called [Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/), and it helps you locate and review differences in SQL Server schemas and database objects, as well as deploy those differences to target databases using error-free synchronization scripts. The second solution is [Data Compare](https://www.devart.com/dbforge/sql/datacompare/), which does just about the same with table and view data, as well as scripts folders and native SQL Server backups. The whole bundle, naturally, comes at a lower price than two separate purchases and can be explored free of charge during a 30-day trial. What is Microsoft Visual Studio? Well, we guess this product needs no special introduction, so let’s keep it short. Visual Studio is an integrated development environment designed and developed by Microsoft.
It is one of the most popular software development tools on the market, and it offers support for multiple languages (including SQL) along with quite a few useful features for code completion, editing, refactoring, debugging, and—what interests us most of all today—SQL Server database comparison. Now that we have finished with the introductions, let’s take a look at the data and schema comparison capabilities of both tools. Feature comparison: dbForge Schema Compare vs Visual Studio diff tools So, here we go—let’s begin with Schema Compare. For your convenience, all the features are divided into categories. Feature comparison: dbForge Data Compare vs Visual Studio diff tools Now let’s proceed to Data Compare, with all the features divided into the same categories. Note: For this comparison, the following versions of the tools were used: dbForge Schema Compare v5.2.8, dbForge Data Compare v5.2.4, and Microsoft Visual Studio 2022 v17.2.2. Conclusion Now that we have shown you the results of our little comparison, it is clear that specialized tools do a far better job at configuring and performing (not to mention automating) regular comparisons of SQL Server database objects and table data. Still, the best thing is to try it all yourself, so we invite you to [download dbForge Compare Bundle for a FREE 30-day trial](https://www.devart.com/dbforge/sql/compare-bundle/download.html) and get some firsthand experience with its database comparison and synchronization capabilities. Bonus: Introductory video tutorials for Data Compare and Schema Compare Since you were interested enough to make it this far, we just couldn’t leave you without a nice bonus. How about a 3-minute video teaser of Schema Compare? Just watch it, and you’ll learn everything about its main features, hidden under a visually simple yet elaborate user interface. This video will help you understand whether this tool is really what you need.
We have a similar video introduction to dbForge Data Compare, so feel free to watch it as well—and see whether it meets your functional demands (spoiler: most certainly, it does). Tags [data compare](https://blog.devart.com/tag/data-compare) [dbForge Compare Bundle](https://blog.devart.com/tag/dbforge-compare-bundle) [dbforge data compare](https://blog.devart.com/tag/dbforge-data-compare) [Schema Compare](https://blog.devart.com/tag/schema-compare) [schema comparison](https://blog.devart.com/tag/schema-comparison) [SQL Server](https://blog.devart.com/tag/sql-server) [visual studio](https://blog.devart.com/tag/visual-studio)
"} {"url": "https://blog.devart.com/dbforge-data-compare-for-oracle-new-life-to-the-product-line.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Data Compare for Oracle: new life to the product line By [dbForge Team](https://blog.devart.com/author/dbforge) July 14, 2010 [0](https://blog.devart.com/dbforge-data-compare-for-oracle-new-life-to-the-product-line.html#respond) 3068 Recently, our development efforts were focused on the dbForge for [SQL Server product line](https://www.devart.com/dbforge/sql/). We have made five major releases of SQL Server database tools in the last 18 months, as well as two major releases of [MySQL database tools](https://www.devart.com/dbforge/mysql/) in the same period. Our [Oracle database tools](https://www.devart.com/dbforge/oracle/) product line, once actively developed, was frozen for almost three years. Sure, we made maintenance releases, but added no new features or tools; our Oracle tools were not even rebranded to dbForge for Oracle. But now we have decided to breathe new life into Oracle tools development. The first tool in the dbForge for Oracle product line will be [Data Compare](https://www.devart.com/dbforge/oracle/datacompare/).
For the first release, we decided to make a free tool with basic functionality: comparison and synchronization of tables only (views are not supported); table and column mapping by name; no type conversion (column types must match); and support for simple types only. The tool is intended to help accomplish simple database synchronization tasks, and we plan to keep it free while adding new features to the commercial version. Building on the experience we gained developing our SQL Server data compare tools, the release is scheduled for early August 2010. Tags [data compare](https://blog.devart.com/tag/data-compare) [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools) [what's new oracle tools](https://blog.devart.com/tag/whats-new-oracle-tools)
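At its core, this kind of table-level data compare matches rows by a key column and classifies them as inserts, updates, or deletes to synchronize. A minimal, hypothetical sketch of the idea in Python (not the tool's actual implementation; the `compare_tables` helper, the `id`/`name` columns, and the sample rows are invented for illustration):

```python
def compare_tables(source, target, key="id"):
    """source/target: lists of row dicts with identical columns.
    Returns rows to insert into, update in, and delete from the target."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    inserts = [src[k] for k in src.keys() - tgt.keys()]   # only in source
    deletes = [tgt[k] for k in tgt.keys() - src.keys()]   # only in target
    updates = [src[k] for k in src.keys() & tgt.keys()    # in both, but different
               if src[k] != tgt[k]]
    return inserts, updates, deletes

source = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
target = [{"id": 2, "name": "Rob"}, {"id": 3, "name": "Eve"}]
ins, upd, dele = compare_tables(source, target)
print(ins, upd, dele)
```

A synchronization step would then turn each of the three lists into the corresponding INSERT, UPDATE, or DELETE statements against the target database.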
"} {"url": "https://blog.devart.com/dbforge-data-compare-for-postgresql-32.html", "product_name": "Unknown", "content_type": "Blog", "content": "[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) Latest connectivity options in dbForge Data Compare for PostgreSQL v3.2 By [dbForge Team](https://blog.devart.com/author/dbforge) November 8, 2019 [0](https://blog.devart.com/dbforge-data-compare-for-postgresql-32.html#respond) 3606 We are thrilled to inform our PostgreSQL users that a new version of our ultimate data diffs management tool, [dbForge Data Compare for PostgreSQL, v.3.2,](https://www.devart.com/dbforge/postgresql/datacompare/) has been rolled out. The new version extends the PostgreSQL connectivity options and provides yet more convenience and flexibility to the PostgreSQL data comparison and synchronization routines. Connectivity Support for PostgreSQL 12.x dbForge Data Compare for PostgreSQL allows connecting to and working with the latest PostgreSQL 12.x. Tell Us What You Think We welcome you to [try the new version](https://www.devart.com/dbforge/postgresql/datacompare/download.html) of dbForge Data Compare for PostgreSQL and [share your thoughts](https://www.devart.com/dbforge/postgresql/datacompare/feedback.html) about the updated tool with us. This will help us keep improving the tool in line with your needs!
Tags [data compare](https://blog.devart.com/tag/data-compare) [PostgreSQL](https://blog.devart.com/tag/postgresql)
"} {"url": "https://blog.devart.com/dbforge-data-compare-for-postgresql-v-3-3-released.html", "product_name": "Unknown", "content_type": "Blog", "content": "[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Data Compare for PostgreSQL v.3.3 Released!
By [dbForge Team](https://blog.devart.com/author/dbforge) September 1, 2020 [0](https://blog.devart.com/dbforge-data-compare-for-postgresql-v-3-3-released.html#respond) 2378 We are glad to announce that a new version of [dbForge Data Compare for PostgreSQL v.3.3](https://www.devart.com/dbforge/postgresql/datacompare/) is available for download. In this release, we have extended the connectivity options to support the latest PostgreSQL database engine. We have also adjusted the Pre & Post Script execution functionality, which helps you perform additional operations before and/or after data synchronization. Connectivity support for PostgreSQL v.13 [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/) allows you to connect to and work with PostgreSQL v.13. Additional Scripts The Synchronization wizard now contains the Additional Scripts tab, where you can embed scripts in a target database to be executed before and/or after the data synchronization process. Tell Us What You Think [Download the new version](https://www.devart.com/dbforge/postgresql/datacompare/download.html) of dbForge Data Compare for PostgreSQL and [share your feedback](https://www.devart.com/dbforge/postgresql/datacompare/feedback.html) with us! For more information about the tools that will help you improve PostgreSQL database development, navigate to [dbForge for PostgreSQL](https://www.devart.com/dbforge/postgresql/).
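The pre/post-script pattern is simple: run one script before the synchronization (for example, to prepare auxiliary objects) and another after it (for example, to log completion). A hypothetical sketch of that flow, using SQLite in place of PostgreSQL so the example stays self-contained; the table names and scripts are invented for illustration:

```python
import sqlite3

# Set up a toy "target" database with some existing data.
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE prices (id INTEGER PRIMARY KEY, value REAL)")
target.execute("INSERT INTO prices VALUES (1, 9.99)")

pre_script = "CREATE TABLE IF NOT EXISTS sync_log (note TEXT)"  # runs before the sync
post_script = "INSERT INTO sync_log VALUES ('sync finished')"   # runs after the sync

target.executescript(pre_script)                                # 1. pre-script
target.execute("UPDATE prices SET value = 12.50 WHERE id = 1")  # 2. the synchronization itself
target.executescript(post_script)                               # 3. post-script

note = target.execute("SELECT note FROM sync_log").fetchone()[0]
new_value = target.execute("SELECT value FROM prices WHERE id = 1").fetchone()[0]
print(note, new_value)  # sync finished 12.5
```

In the tool itself, the two scripts would be pasted into the Additional Scripts tab rather than run by hand.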
Tags [data compare](https://blog.devart.com/tag/data-compare) [PostgreSQL](https://blog.devart.com/tag/postgresql) [postgresql tools](https://blog.devart.com/tag/postgresql-tools)
"} {"url": "https://blog.devart.com/dbforge-devops-automation-teamcity.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Enjoy Continuous Integration in TeamCity with dbForge DevOps Automation By [dbForge
Team](https://blog.devart.com/author/dbforge) December 6, 2019 [0](https://blog.devart.com/dbforge-devops-automation-teamcity.html#respond) 4691 We are thrilled to inform our users of [dbForge DevOps Automation for SQL Server](https://staging.devart.com/dbforge/sql/database-devops/) that we have extended the range of supported automation systems with TeamCity. TeamCity Integration dbForge DevOps Automation for SQL Server now allows setting up DevOps processes in TeamCity with the help of the just-released dbForge DevOps Automation for SQL Server Plug-in. Availability Download the dbForge DevOps Automation TeamCity Plugin for SQL Server [here](https://www.devart.com/dbforge/sql/database-devops/static/teamcity-dbforge-plugin.zip). dbForge DevOps Automation for SQL Server is a free product that is supplied exclusively as a part of [dbForge SQL Tools.](https://www.devart.com/dbforge/sql/sql-tools/download.html) Tags [devops](https://blog.devart.com/tag/devops) [SQL Server](https://blog.devart.com/tag/sql-server)
"} {"url": "https://blog.devart.com/dbforge-documenter-for-mysql-10.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) MySQL Documenting Has Never Been Easier! By [dbForge Team](https://blog.devart.com/author/dbforge) April 18, 2018 [0](https://blog.devart.com/dbforge-documenter-for-mysql-10.html#respond) 14134 We just love to make our users’ life a bit easier. This time, we are excited to treat you to a brand new addition to our dbForge for MySQL product line: [dbForge Documenter for MySQL](https://www.devart.com/dbforge/mysql/documenter/). The tool automatically generates documentation for multiple MySQL and MariaDB databases in the HTML, PDF, and Markdown file formats and allows customizing the generated documentation with the help of a rich set of options and settings. Our team has done its best to make the new MySQL documentation tool fast, productive, simple, and good-looking. Now it’s your turn to try it out! Key features Database Structure from A to Z dbForge Documenter for MySQL extracts extensive information about all MySQL objects, including their details, properties, SQL scripts, and inter-object dependencies. Multilevel Customization You can select the database objects, and the properties of each individual object, to be included in the documentation.
You also get a number of style templates as well as the ability to apply your own Bootstrap themes to get a nice-looking layout. Multiple File Formats Documenter for MySQL can generate documentation in the HTML, PDF, and Markdown file formats. All formats are searchable, which makes navigation through large documents much easier. Object Annotations Descriptions of specific database objects are read from the COMMENT property of the objects to which this property is attributed. Easy Navigation Through the Documentation File Type the name of an object you are looking for, and dbForge Documenter will highlight the matching text in the object tree. You can also navigate throughout the documentation with the built-in breadcrumbs. Automatic Database Documenting dbForge Documenter for MySQL comes with support for the command line interface. Thus, you can use the Windows Task Scheduler to set up automatic database documenting. The tool also allows creating a command line execution file to run routine database documentation tasks in a single click. Free Trial As always, we provide a free 30-day trial of the tool. So, we invite you to [download the new tool right now](https://www.devart.com/dbforge/mysql/documenter/download.html) and enjoy seamless MySQL database documenting!
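Conceptually, a database documenter introspects the catalog and renders what it finds into a document. A toy sketch of that idea, introspecting SQLite's catalog and emitting Markdown (SQLite stands in for MySQL here so the example is self-contained; the real tool reads MySQL metadata, including object COMMENTs, and supports HTML and PDF as well):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER REFERENCES customers(id));
""")

lines = ["# Database documentation", ""]
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
).fetchall()
for (table,) in tables:
    lines.append(f"## {table}")
    # PRAGMA table_info lists each column: (cid, name, type, notnull, default, pk)
    for _, col, ctype, notnull, _, pk in conn.execute(f"PRAGMA table_info({table})"):
        flags = " PRIMARY KEY" if pk else (" NOT NULL" if notnull else "")
        lines.append(f"- {col}: {ctype}{flags}")
    lines.append("")

doc = "\n".join(lines)
print(doc)
```

Run on a schedule (e.g. via Task Scheduler, as the post suggests for the real tool), a script like this keeps the generated document in step with the live schema.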
Tags [documenter](https://blog.devart.com/tag/documenter) [MySQL](https://blog.devart.com/tag/mysql)
"} {"url": "https://blog.devart.com/dbforge-documenter-for-oracle-release.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Keep Calm and Document Oracle Databases By [dbForge Team](https://blog.devart.com/author/dbforge) August 16, 2018 [0](https://blog.devart.com/dbforge-documenter-for-oracle-release.html#respond) 18362 We are thrilled to inform that our dbForge for Oracle product line has been expanded with a
brand new tool for smart and easy documenting of Oracle databases: [dbForge Documenter for Oracle](https://www.devart.com/dbforge/oracle/documenter/)! The tool automatically generates documentation for multiple Oracle databases in the HTML, PDF, and Markdown file formats and allows customizing the generated documentation with the help of a rich set of options and settings. Our team has done its best to make the new Oracle documentation tool fast, simple, and good-looking. Explore Database Structure dbForge Documenter for Oracle extracts extensive information about all Oracle objects, including their details, properties, SQL scripts, and inter-object dependencies. The database is presented as a neat navigation tree, and each item of the tree displays the properties of a database object as a skinned HTML list. Multilevel Customization You can select the database objects, and the properties of each individual object, to be included in the documentation. You also get a number of style templates as well as the ability to apply your own themes to get a nice-looking layout. Multiple File Formats Documenter for Oracle can generate documentation in the HTML, PDF, and Markdown file formats. All formats are searchable, which makes navigation through large documents much easier. Easy Navigation Type the name of an object you are looking for in the generated document file, and the matching text will be highlighted in the object tree. You can also navigate throughout the documentation with the built-in breadcrumbs. Automatic Database Documenting dbForge Documenter for Oracle also comes with support for the command line interface. Thus, you can use the Windows Task Scheduler to set up automatic database documenting. The tool also allows creating a command line execution file (.bat) to run routine database documenting tasks in a single click. Free Trial As always, we provide a free 30-day trial of the tool.
So, we invite you to [download the new tool right now](https://www.devart.com/dbforge/oracle/documenter/download.html) and enjoy seamless Oracle database documenting! We would also like to know your thoughts about the tool – [leave feedback](https://www.devart.com/dbforge/oracle/documenter/feedback.html) about dbForge Documenter for Oracle and help us make it better! Tags [documenter](https://blog.devart.com/tag/documenter) [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools)
"} {"url": "https://blog.devart.com/dbforge-fusion-350-released-more-features.html", "product_name": "Unknown", "content_type":
"Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Fusion 3.50 released! Enjoy more valuable features! By [dbForge Team](https://blog.devart.com/author/dbforge) July 6, 2009 Devart has today released version 3.50 of [dbForge Fusion for MySQL](https://www.devart.com/dbforge/mysql/fusion/) – a powerful add-in for Microsoft Visual Studio, designed to simplify MySQL database development and enhance data management. Up to 100 usability improvements have been implemented, giving you better insight into, and full control over, your tasks through the modernized UI of dbForge Fusion for MySQL. To speed up your work, dbForge Fusion can compress the traffic exchanged with the MySQL server, making data transfer quicker. The new features of dbForge Fusion for MySQL 3.50 include: An excellent new Database Designer The new version includes a modern tool for visual database design – Database Designer. It is indispensable for anyone who needs to build a clear and effective database structure visually and get a complete picture of all the tables, the foreign key relations between them, the views, and the stored routines of the required database. It streamlines access to database objects for viewing their properties, editing, retrieving data, and executing stored routines. The database diagram enables reverse engineering of databases into IDEF1X or IE diagrams, which can be easily printed. Data comparison improvements dbForge Fusion for MySQL 3.50 better caters to the diversity of data comparison tasks. Custom object mapping delivers more freedom in comparison – you can map tables, columns, and views with different names, map columns with different types, or map a table to a view.
You can always cancel custom mapping and return to automatic mapping. The mapping process is simpler and clearer: you now get detailed information about each mapped column and see warnings, for example, when a comparison key is set incorrectly. You can quickly find the required object for mapping using the Find box or by filtering the list of objects by their status (valid or invalid mapping, auto- or user-mapped, fully or partially mapped, included in or excluded from comparison). New mapping options (Ignore Case, Ignore Spaces, and Ignore Underscores) let you further customize the comparison. Data comparison has been accelerated and is now 5-10 times faster for any amount of data, delivering clearer and more customizable comparison results. Schema comparison improvements Managing schema comparison results has become easier and quicker thanks to convenient filtering by object type, update operation, or status; quick search by object name; and the ability to group results by comparison difference type and update operation, or to cancel grouping altogether. You can grasp the comparison results even quicker thanks to their new representation. The new version offers a speedier synchronization process and adds more visibility and control: you can preview the synchronization script for any object in the comparison results to ensure a correct synchronization result on the first try. Optimized handling of large SQL scripts The new version provides a special Execute Script Wizard that enables quick and convenient execution of large SQL scripts without waiting for them to open in the editor. Just select the script and run it. Other improvements A number of bug fixes and minor improvements further enhance product performance and usability. [Download dbForge Fusion for MySQL](https://www.devart.com/dbforge/mysql/fusion/download.html) and add its features to your arsenal of tools.
Use the [dbForge Fusion for MySQL Feedback Page](https://www.devart.com/dbforge/mysql/fusion/feedback.html) to tell us what you think about the new version. Tags [dbforge fusion](https://blog.devart.com/tag/dbforge-fusion) [MySQL](https://blog.devart.com/tag/mysql) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbforge-fusion-for-mysql-v66.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) New Connectivity Options in dbForge Fusion for MySQL, v6.6 By [dbForge Team](https://blog.devart.com/author/dbforge) February 26, 2021 We are glad to inform our MySQL users that a new version of our Visual Studio plugin for MySQL database development, [dbForge Fusion for MySQL, v.6.6](https://www.devart.com/dbforge/mysql/fusion/), has just been rolled out. Connectivity The new version features a massive update of connectivity options, implemented to ensure that you can easily connect to and work with the latest database engines and cloud services. In particular, the new dbForge Fusion for MySQL provides connectivity support for: MySQL 8.0; Percona 8.0; MariaDB 10.1-10.5; Azure MySQL. Availability [Download](https://www.devart.com/dbforge/mysql/fusion/download.html) the new version of dbForge Fusion for MySQL and [share your thoughts](https://www.devart.com/dbforge/mysql/fusion/feedback.html) about the product. Your feedback helps us find the right direction with future updates and improve the tools according to the needs of our users.
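For reference, the versioned engines listed above (MySQL 8.0, Percona 8.0, MariaDB 10.1-10.5) can be expressed as a tiny compatibility check. The table mirrors this announcement; the function itself is only an illustrative sketch, since dbForge Fusion performs its own server detection on connect.

```python
# Supported engine versions, per the dbForge Fusion for MySQL v6.6 notes.
SUPPORTED = {
    "MySQL": {(8, 0)},
    "Percona": {(8, 0)},
    "MariaDB": {(10, m) for m in range(1, 6)},  # 10.1 through 10.5
}

def is_supported(engine: str, major: int, minor: int) -> bool:
    """Return True if the given server version appears in the v6.6 list."""
    return (major, minor) in SUPPORTED.get(engine, set())

print(is_supported("MariaDB", 10, 4))   # a listed version
print(is_supported("MariaDB", 10, 6))   # outside the 10.1-10.5 range
```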
Tags [dbforge fusion](https://blog.devart.com/tag/dbforge-fusion) [Releases](https://blog.devart.com/tag/releases) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbforge-fusion-for-oracle-v310.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Meet dbForge Fusion for Oracle, v3.10 with Fresh Connectivity Options By [dbForge Team](https://blog.devart.com/author/dbforge) February 26, 2021 We are delighted to
share the great news with our Oracle users – a new version of our Visual Studio plugin for Oracle database management, [dbForge Fusion for Oracle, v3.10](https://www.devart.com/dbforge/oracle/fusion/), has just been released! Connectivity The new version features brand new connectivity options, implemented to ensure that you can easily connect to and work with the latest Oracle server versions. In particular, the new dbForge Fusion for Oracle provides connectivity support for: Oracle 12c R2; Oracle 18c; Oracle 19c. Availability [Download](https://www.devart.com/dbforge/oracle/fusion/download.html) the new version of dbForge Fusion for Oracle and [share your thoughts](https://www.devart.com/dbforge/oracle/fusion/feedback.html) about the product. Your feedback helps us find the right direction with future updates and improve the tools according to the needs of our users. Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbforge fusion](https://blog.devart.com/tag/dbforge-fusion) [oracle tools](https://blog.devart.com/tag/oracle-tools) [what's new oracle tools](https://blog.devart.com/tag/whats-new-oracle-tools) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbforge-integrate-with-strongdm.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) What Is StrongDM, and How dbForge Studio for PostgreSQL Supports It By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) September 23, 2024 Privileged Access Management (PAM) solutions have become increasingly popular due to their ability to address data security concerns with flexible access control. One essential area for employing PAM solutions is using them alongside database clients (from the simplest ones to full-fledged IDEs) to ensure safe access to databases. StrongDM is a leading PAM solution that supports modern software development infrastructures both on-premises and in the cloud. This article explores the workflow and benefits of using StrongDM together with dbForge Studio for PostgreSQL. What is StrongDM? StrongDM is an infrastructure access platform designed to simplify and secure access to various environments, such as databases, servers, cloud environments, and Kubernetes clusters. It serves as a central hub for managing and monitoring access, ensuring that only authorized users can reach the necessary resources.
Moreover, it limits access to the duration of a specific work session, meaning that even if an account is compromised, it won’t have prolonged access to critical areas. In simpler terms, StrongDM functions as a proxy server between data sources (like PostgreSQL, MySQL, SQL Server, Oracle, etc.) and client applications/IDEs such as SSMS, MySQL Workbench, or dbForge Studio. It offers a wide range of features to manage user access and log their activities, making it a popular tool for data security management. Key advantages of StrongDM: Unified access management : StrongDM centralizes access to all resources, whether on-premises or in the cloud. With this single platform, you can manage all access-related tasks, such as granting or revoking permissions, without switching between different solutions. Access control : Access policies can be configured based on roles, groups, or specific resources. Management options include just-in-time access (no permanent access), role-based access, attribute-based access, and direct access. Access revocation is instant, which delivers real-time security. Audit and monitoring : The system automatically logs all access attempts and activities, providing detailed audit reports for every session, query, and command performed within integrated tools. This data helps reduce Mean Time to Investigate (MTTI) and Mean Time to Respond (MTTR), helping users quickly identify and address issues, and minimizing potential damage. Secure access : With StrongDM, there’s no need for VPNs, SSH keys, or shared credentials. It provides encrypted, secure connections to resources, ensuring unauthorized access is prevented, even without credentials — users only need to log into StrongDM. Seamless integration : StrongDM can be integrated with a variety of solutions, allowing organizations to maintain high-security standards within their existing workflows. 
It supports all native protocols, so developers can continue using their preferred tools without interruption. In this guide, we focus on integrating StrongDM with dbForge Studio for PostgreSQL – a powerful, multi-featured IDE for PostgreSQL and Amazon Redshift databases. Overview of dbForge Studio for PostgreSQL [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is a leading tool for comprehensive database development, management, and administration. With its robust toolkit and intuitive graphical user interface (GUI), the Studio simplifies all database-related tasks with versatile visual tools and automation options. Key features of dbForge Studio for PostgreSQL: Coding assistance : The Studio provides an array of SQL coding tools, including context-aware auto-completion, syntax validation, code formatting, and a library of code snippets, enhancing coding efficiency and accuracy. Database comparison and synchronization : There are specialized tools that help compare database schemas and table data, identify discrepancies, conduct in-depth analysis, and synchronize changes with autogenerated scripts. Visual query profiling : This feature helps users pinpoint and resolve performance bottlenecks in queries. Data import and export : The Studio supports direct data import and export in over ten popular formats, facilitating easy data migration. Test data generation : This tool allows users to generate high-quality, realistic test data in required volumes, with customizable data types and characteristics to align with specific test scenarios. Data analysis and reporting : Features such as Pivot Tables, Chart Designer, and Master-Detail Browser enable users to analyze large data sets and generate detailed, visually engaging reports.
CLI-powered task automation : dbForge Studio automates routine tasks through the command-line interface, converting configured settings into executable .bat files for recurring operations, thus freeing up time for more critical tasks. PostgreSQL is renowned for its robust security features, making it a top choice for developers around the world. Integrating StrongDM with PostgreSQL StrongDM is a proxy platform that manages and audits access to databases, servers, clusters, and web applications, helping create a secure network. The architecture of StrongDM includes a local desktop client, a gateway intermediary, and a configuration layer. The local desktop client , which includes both graphical and command-line interfaces, tunnels users’ requests from their workstations to the gateway via a single TLS 1.2-secured TCP connection. This setup is compatible with Windows, macOS, and Linux. Users must log in to the local client for authentication. The gateway serves as the entry point to the network. It can be deployed with a DNS entry, operate privately within the corporate network, or be placed behind a VPN. All data is routed through this network. The gateway decrypts credentials on behalf of end users and breaks down requests for auditing purposes. Admin UI is the configuration layer. Administrators assign roles and set access permissions for users. Any configuration changes are pushed to the local clients and are updated in real time. All of these components work in the following way: users log in through the StrongDM desktop client before establishing a connection with an IDE like dbForge Studio for PostgreSQL. StrongDM monitors all the commands and queries executed by the user in real time and records the entire session, capturing every action the user takes within the Studio, including query execution, data modification, and administration tasks. This recording can be stored for auditing purposes and reviewed if any suspicious activity is detected. 
Also, StrongDM manages and stores the credentials required to access databases securely. Users do not directly use their own credentials; instead, they check out credentials from this PAM system, which adds an additional layer of security and ensures that password policies are enforced. In addition, users’ actions within dbForge Studio for PostgreSQL are restricted based on the roles set and configured in StrongDM. Now, let us review the process of configuring StrongDM. Step 1. Sign up for a StrongDM account Your first step is to create a StrongDM account. On the [registration page](https://www.strongdm.com/signup), provide the necessary details and click Submit . After that, you need to verify your email address by clicking on the confirmation link sent to your inbox. Note: If you sign up directly, you’ll create an Administrator account for your organization. This involves providing your work email and setting a password. Step 2. Install the StrongDM desktop client Once your account is set up, [download the StrongDM client](https://app.strongdm.com/app/downloads) to the devices accessing your infrastructure. Install the client on your machine and log in using your admin credentials. You can apply filters to the client to manage the existing data sources more conveniently. Initially, the desktop client contains only test resources; later, it will also include all the resources added by the Administrator. Step 3. Add a connection host to StrongDM and dbForge Studio for PostgreSQL To integrate dbForge Studio with the secure network managed by StrongDM, you must add the data sources used in the Studio to the StrongDM Infrastructure section. This integration enables StrongDM to monitor user activities on these databases and manage access permissions accordingly. With that in mind, let’s add a data source to StrongDM. Proceed to Infrastructure > Datasources . In this section, you will see the list of existing data sources.
To add a new source (a new database in our case), click Add datasource . Provide the necessary details and click Create to complete the process. Note that it might take some time for the new data source to be verified. After that, you can connect to it using the local desktop client. To connect to the added database with dbForge Studio for PostgreSQL, do the following: Launch the Studio, click the New Connection icon in Database Explorer , and configure the connection properties for the database. The host and port should match those specified in the StrongDM client. The username and password aren’t necessary, as the required credentials are stored in and managed by StrongDM. After that, click Test Connection , and the Studio will verify the connection details and connect to the specified database. Step 4. Configure user roles A user role is a collection of permissions granted to the user; roles are the method of providing and restricting user access to resources. In the online console of StrongDM, navigate to the Access section > Roles . This area contains the list of the existing roles and allows the Administrator to create new roles. To create a new role, click Add role . Name the role and click Create role to proceed to the configuration on the Access Rules page. Two methods are available for assigning permissions to a role. Static rules are assigned manually. Dynamic rules are set according to tags and resource types. After defining the role, you can assign users to it. This can be done in the Users section, where you only need to select a user or a group of users and set roles for them. To invite users, click Add user in the Users section and enter their email addresses. Step 5. Monitor activities Once everything is set up, you can start using the StrongDM client to access resources securely.
When users log in via the StrongDM local client, their roles are applied: they can query and access the data sources and servers to which they have been granted access, and their activities are monitored and audited. The full history of all queries executed by the user in dbForge Studio for PostgreSQL is recorded on the Queries page under Logs . In the same section, you can view detailed information about any specific user’s actions on the Activities page. As always, you should periodically review data sources and user permissions to ensure compliance with your security policies. This is conveniently done in the online admin console. Conclusion StrongDM is a viable solution that enhances security, simplifies access management, and secures operations with databases through IDEs such as dbForge Studio for PostgreSQL. It enables real-time logging and monitoring of all database queries and tasks performed through the Studio, providing administrators with critical information that is particularly useful in responding to failures and security threats. dbForge Studio for PostgreSQL also does its part, being specifically designed as a single solution that provides comprehensive functionality for developing, managing, and administering PostgreSQL and Amazon Redshift. You can test both solutions in your actual work environment to assess their effectiveness. dbForge Studio for PostgreSQL offers a [fully functional 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) with personalized tech support.
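The credential-less connection described in Step 3 can be sketched in a few lines. The host, port, and database name below are illustrative placeholders for the values you would copy from the StrongDM desktop client; note that no username or password appears in the string, since StrongDM checks out and injects the real credentials at the gateway.

```python
# Build a libpq-style DSN for a database exposed through the StrongDM
# local client. All values are hypothetical placeholders.
def strongdm_dsn(host: str = "127.0.0.1", port: int = 15432,
                 dbname: str = "sales_db") -> str:
    # Deliberately no user/password keywords: StrongDM supplies credentials.
    return f"host={host} port={port} dbname={dbname}"

print(strongdm_dsn())
```

The same host/port pair is what you would enter in dbForge Studio's New Connection dialog, leaving the username and password fields empty.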
Tags [access management](https://blog.devart.com/tag/access-management) [database security](https://blog.devart.com/tag/database-security) [dbForge Studio for PosrgreSQL](https://blog.devart.com/tag/dbforge-studio-for-posrgresql) [PAM solutions](https://blog.devart.com/tag/pam-solutions) [PostgreSQL](https://blog.devart.com/tag/postgresql) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms."} {"url": "https://blog.devart.com/dbforge-line-for-sql-server-is-ready-for-windows-11.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge for SQL Server Is Ready for Windows 11 By [dbForge Team](https://blog.devart.com/author/dbforge) September 20, 2021 As we all know, Microsoft is preparing for a huge event that everyone has been looking forward to—the release of the next-gen desktop operating system, [Windows 11](https://www.microsoft.com/en-us/windows/windows-11). We at Devart have checked everything on our side to make sure our tools are all set for the OS shift! [The new, significantly redesigned version of the operating system](https://www.microsoft.com/en-us/windows/windows-11-specifications), so hotly anticipated by all of us, has been available in beta preview for a couple of months, and finally, the Windows 11 release date was announced—October 5, 2021. Keeping up with the times, we have successfully tested our flagship SQL Server and Azure SQL solutions—dbForge Studio for SQL Server and dbForge SQL Tools—on Windows 11. To help our customers sleep better at night and guarantee them a smooth transition to the piping hot OS, we’ve checked everything out and can assure you that our tools are 100% compatible with the upcoming Windows 11.
Tags [dbForge SQL Tools](https://blog.devart.com/tag/dbforge-sql-tools) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [ready for Windows 11](https://blog.devart.com/tag/ready-for-windows-11) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dbforge-product-line-is-ready-for-windows-11.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [Oracle
Tools](https://blog.devart.com/category/products/oracle-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) dbForge Product Line Is Ready for Windows 11 By [dbForge Team](https://blog.devart.com/author/dbforge) October 12, 2021 Following [our recent announcement](https://blog.devart.com/dbforge-line-for-sql-server-is-ready-for-windows-11.html) that dbForge tools for SQL Server were updated and tested for seamless work on [Windows 11](https://www.microsoft.com/en-us/windows/windows-11), we hasten to make the same statement regarding our tools for MySQL, Oracle, and PostgreSQL databases. Now the entire [dbForge product line](https://www.devart.com/dbforge/) is ready to accompany your database development and administration on the latest Windows OS. dbForge for MySQL The widest set of tools is available for MySQL. You have dbForge Studio, a far more advanced alternative to the conventional MySQL Workbench. You have Data Compare and Schema Compare, a bundle of tools for fast comparison and effective synchronization of data and database schemas, respectively. You have Data Generator, a single tool containing 200+ smart generators of realistic test data. Additionally, you have Documenter and Query Builder (whose names speak for themselves), as well as Fusion, a handy plugin for Microsoft Visual Studio. Now you can give them a go on Windows 11. dbForge for Oracle And what about Oracle? Well, there’s a lot to be found here, too. Starting with the flagship Studio IDE, and all the way up to the abovementioned Data Compare, Schema Compare, Data Generator, Documenter, and Fusion tools, functionally identical to those we offer for MySQL databases. All of these are ready to work on Windows 11 just as well.
dbForge for PostgreSQL As for PostgreSQL, we offer a fair share of tools to cover the needs of developers. First and foremost, it’s the same multifunctional dbForge Studio, designed for effective work with queries, data editing, and reporting. Secondly, such essentials as Data Compare and Schema Compare are firmly in place (the latter is a free product). It’s easy to check our tools in action on Windows 11. Simply download the ones you need for a free 30-day trial. Our [free products](https://www.devart.com/free-products.html) are available to you indefinitely. Stay tuned for further updates! Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studios](https://blog.devart.com/tag/dbforge-studios) [oracle tools](https://blog.devart.com/tag/oracle-tools) [postgresql tools](https://blog.devart.com/tag/postgresql-tools) [ready for Windows 11](https://blog.devart.com/tag/ready-for-windows-11) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdbforge-product-line-is-ready-for-windows-11.html) [Twitter](https://twitter.com/intent/tweet?text=dbForge+Product+Line+Is+Ready+for+Windows+11&url=https%3A%2F%2Fblog.devart.com%2Fdbforge-product-line-is-ready-for-windows-11.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/dbforge-product-line-is-ready-for-windows-11.html&title=dbForge+Product+Line+Is+Ready+for+Windows+11) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/dbforge-product-line-is-ready-for-windows-11.html&title=dbForge+Product+Line+Is+Ready+for+Windows+11) [Copy URL](https://blog.devart.com/dbforge-product-line-is-ready-for-windows-11.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 
"} {"url": "https://blog.devart.com/dbforge-products-for-sql-server-win-g2-softwaresuggest-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "Another Day, Another Badge – dbForge Products for SQL Server Win G2 and SoftwareSuggest Awards By [dbForge Team](https://blog.devart.com/author/dbforge) January 5, 2022

We are happy to announce that the dbForge product line has received multiple remarkable awards. [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) and [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) have won two badges each: both G2 and SoftwareSuggest users found these Devart products to be among the best of their kind. This is yet more proof that Devart constantly evolves and adjusts to market demands.

dbForge Studio for SQL Server

2022 had barely started when dbForge Studio for SQL Server earned two prestigious awards:

1. G2 High Performer Winter 2022

G2 users consider [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) to be one of the most powerful and convenient IDEs for SQL Server management. According to them, the solution is straightforward and intuitive.
SQL developers and DBAs performing complex database tasks can use this GUI tool to speed up almost any of them, including designing databases, writing SQL code, comparing databases, synchronizing schemas and data, generating meaningful test data, and much more. To read more reviews, feel free to visit the [official G2 website](https://www.g2.com/products/dbforge-studio-for-sql-server-2018-12-04/reviews).

2. SoftwareSuggest Best Meet Requirement Winter 2022

The SoftwareSuggest reviewers have rated [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) by Devart on the basis of four criteria:

- Feature – 4.9/5
- Ease of use – 4.9/5
- Value for money – 4.9/5
- Customer support – 4.9/5

Consequently, the overall score reached 4.9/5. For more information, please refer to [SoftwareSuggest](https://www.softwaresuggest.com/dbforge-studio-sql-server).

dbForge SQL Complete

[dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) has also been awarded by two trustworthy online platforms for business software discovery:

1. G2 High Performer Winter 2022

According to G2, [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) is rated 4.5 out of 5 stars and is used most often by computer software professionals. In their reviews, they mention that this advanced add-in for Visual Studio and SSMS can easily transform the everyday code-writing routine. With SQL Complete, you can enhance code accuracy and quality and simplify the process of creating SQL queries. The add-in takes care of autocompleting your SQL queries so that you can code faster, yet with a sharper focus. [Explore the reviews](https://www.g2.com/products/dbforge-sql-complete/reviews) on dbForge SQL Complete on the G2 website.

2.
SoftwareSuggest Best Result Winter 2022

Just like the previous product, dbForge SQL Complete has been rated according to the same four criteria: Feature, Ease of use, Value for money, and Customer support. In this case, however, the scores are slightly different:

- Feature – 4.7/5
- Ease of use – 4.7/5
- Value for money – 4.6/5
- Customer support – 4.8/5

The overall score equals 4.9/5. To read the reviews and maybe even write your own, feel free to visit the [SoftwareSuggest website](https://www.softwaresuggest.com/dbforge-sql-complete).

Conclusion

Today, we are proud to see the results of our hard work. We are also very thankful to the loyal users who not only provide us with constructive feedback but also spread a good word about Devart among those who haven't yet come across our products. To get a closer look at other products for SQL Server, MySQL, Oracle, and PostgreSQL, do not hesitate to [visit our website](https://www.devart.com/products.html).
"} {"url": "https://blog.devart.com/dbforge-schema-compare-for-mysql-100-to-deliver-speed-simplicity-and-safety.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Schema Compare for MySQL to deliver speed, simplicity, and safety!
By [dbForge Team](https://blog.devart.com/author/dbforge) June 4, 2009

Dear users,

Devart has today released [dbForge Schema Compare for MySQL](https://www.devart.com/dbforge/mysql/schemacompare/) – an advanced tool for the comparison and synchronization of MySQL databases. dbForge Schema Compare for MySQL 1.00 meets the needs of both database administrators and developers, automating and simplifying the diverse tasks related to updating and migrating MySQL databases. As part of [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), the company's bestselling product for MySQL database administration and development, the functionality of dbForge Schema Compare has already met the approval of many DBAs and developers.

Speed, simplicity, and safety

Schema Compare is built on three cornerstones vital for smooth updates and migration of MySQL databases: speed, simplicity, and safety. The new product lets database administrators and developers quickly compare databases regardless of their structure or MySQL server version. It reduces hours of schema difference management to minutes and guarantees quick database synchronization without errors or misoperation. dbForge Schema Compare has passed hundreds of usability tests to simplify schema comparison and deliver a clear display of comparison results, listing object differences in the grid and showing their DDL differences below it. Multiple options for quick sorting, filtering, and searching of schema differences ensure easy management. In today's fast-paced environment, safe and error-free schema synchronization is a key requirement.
The functionality of [MySQL Schema Compare](https://www.devart.com/dbforge/mysql/studio/mysql-database-schema-compare.html) includes the preview and auto-generation of synchronization scripts for every compared schema object. Main product features include:

- Support for all MySQL server versions 3.23–6.0
- Fast comparison of any databases, including extra-large ones
- Clear display of comparison results, with no time wasted on figuring out what to do next
- Efficient management of compared objects with a rich set of options
- Text comparer to show DDL differences of compared objects
- Schema synchronization wizard to generate a standards-driven synchronization script with additional options
- Integrated SQL editor for advanced work with SQL scripts and query files

Free trial version available

Download a fully functional 30-day trial version of [dbForge Schema Compare for MySQL](https://www.devart.com/dbforge/mysql/schemacompare/download.html) for free. Go to the [Feedback Page](https://www.devart.com/dbforge/mysql/schemacompare/feedback.html) and tell us what you think about the new product. We are looking forward to your comments and suggestions.
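At its core, a schema comparison of this kind reads object definitions from each server's catalog and classifies every object as missing from the target, extra in the target, or changed; the synchronization script is then generated from that classification. The following is a minimal, hypothetical sketch of the idea in Python, using SQLite's catalog purely for illustration (the product itself works against MySQL server catalogs, and the table names here are made up):

```python
import sqlite3

def schema_map(conn):
    """Return {object_name: DDL} read from the database catalog."""
    rows = conn.execute(
        "SELECT name, sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
    )
    return dict(rows)

def diff_schemas(source, target):
    """Classify objects as missing, extra, or changed between two schemas."""
    src, tgt = schema_map(source), schema_map(target)
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "changed": sorted(n for n in set(src) & set(tgt) if src[n] != tgt[n]),
    }

# Two in-memory databases standing in for the source and target servers
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
tgt.execute("CREATE TABLE users (id INTEGER PRIMARY KEY)")  # out of date

print(diff_schemas(src, tgt))
# {'missing_in_target': ['orders'], 'extra_in_target': [], 'changed': ['users']}
```

A real tool must additionally normalize DDL text, order the generated statements by object dependencies, and preview the synchronization script before running it, which is exactly the work the wizard above automates.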
"} {"url": "https://blog.devart.com/dbforge-schema-compare-for-postgresql-v-1-2-rolled-out.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Schema Compare for PostgreSQL v.1.2 rolled out! By [dbForge Team](https://blog.devart.com/author/dbforge) July 29, 2021

We are thrilled to inform our customers that the new version of [dbForge Schema Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/schemacompare/) has been rolled out. The release brings a number of new features and major improvements.

New features in Schema Compare for PostgreSQL v.1.2

Improved Text Compare Control

The text diff visualization control has been refined so that you can grasp schema differences quicker and more efficiently. What is more, you can now customize the diff colors to your liking.

Comparison Reports

Schema Compare for PostgreSQL v1.2 got a comprehensive reporting feature, easily customizable to fit specific requirements. It allows exporting the comparison results to the most popular file formats: HTML, Excel, and XML.

Additional Scripts tab added to the Schema Comparison wizard

Aiming to give you more control over database deployments, we have added the possibility to create custom pre- and post-synchronization scripts that will be run before or after schema synchronization.

Support for materialized views comparison and synchronization in Amazon Redshift

Striving to keep our product up to date, we have added support for materialized views in Amazon Redshift. Try Schema Compare for PostgreSQL out and enjoy the increased query performance that can be achieved with Amazon Redshift materialized views.

Connectivity Support for Heroku Cloud

Heroku is a platform as a service that allows developers to build, run, and operate applications entirely in the cloud.
It supports multiple programming languages and has powerful integrated data services and a mighty ecosystem that enables developers to deploy and run modern apps. We are happy to announce that Schema Compare for PostgreSQL now allows connection to the Heroku cloud.

Object Filter

The Object Filter functionality allows the user to fine-tune database deployment by excluding objects from synchronization according to the specified filter condition.

Share your feedback with us

[Get the new version](https://www.devart.com/dbforge/postgresql/schemacompare/download.html) of Schema Compare for PostgreSQL and [share your thoughts](https://www.devart.com/dbforge/postgresql/schemacompare/feedback.html) with us!
"} {"url": "https://blog.devart.com/dbforge-schema-compare-for-postgresql-v1-3-is-out.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Schema Compare for PostgreSQL v1.3 Is Out! By [dbForge Team](https://blog.devart.com/author/dbforge) January 30, 2023

To catch up with the rest of the dbForge product line for PostgreSQL, [Schema Compare](https://www.devart.com/dbforge/postgresql/schemacompare/) has also received a couple of nice features in the newly released v1.3: the option to select databases and copy settings in the comparison wizard, plus a few useful enhancements for Text Editor. Without further ado, let's take a closer look.

The first feature concerns the comparison wizard: it is the option to specify the databases to be compared, as well as to copy source settings to the target database and vice versa.

The rest of the features are all about making operations in Text Editor more convenient. Now, pressing Ctrl+C on a line without highlighting any text copies the entire line; you can then go to a new line and paste it there using Ctrl+V. And if you need to cut the entire line instead of copying it, press Ctrl+X. Another trick is the triple click: you can now use it to select an entire line in Text Editor.

The last feature we've added in this release is the display of RAISE [ NOTICE | WARNING | INFO ] in Error List & Output messages.

Download dbForge Schema Compare for PostgreSQL v1.3 for free today!
In comparison with the simultaneous releases of the Studio and Data Compare, this one is more of a minor enhancement, but we believe it will be a pleasant addition to a free product. Feel free to [download Schema Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/schemacompare/download.html) and check it in action!

"} {"url": 
"https://blog.devart.com/dbforge-schema-compare-for-redshift-v10.html", "product_name": "Unknown", "content_type": "Blog", "content": "[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) Meet dbForge Schema Compare for PostgreSQL, v1.0! By [dbForge Team](https://blog.devart.com/author/dbforge) September 4, 2020 [0](https://blog.devart.com/dbforge-schema-compare-for-redshift-v10.html#respond) 2736 We are proud to announce the release of a new product from the dbForge Tools for PostgreSQL line – [dbForge Schema Compare for PostgreSQL, v1.0](https://www.devart.com/dbforge/postgresql/schemacompare/) . The tool is best suited when you need to compare Redshift databases quickly and efficiently, as well as to generate synchronization SQL scripts to update target database schemas. Support for Amazon Redshift In this version, we supported Amazon Redshift to compare and synchronize database schemas: Limited Support for PostgreSQL We have implemented the limited support for the following PostgreSQL databases you can use as a source and a target database to compare and synchronize database schemas: PostgreSQL Databases Amazon RDS for PostgreSQL Azure PostgreSQL Support for Redshift Objects You can compare and sync the following Redshift objects: Schemas Tables Views Functions Procedures Comments Privileges Identification of Object Dependencies The tool allows you to sync objects that you have not selected for synchronization but they have been used by objects you added to synchronize. Review of DDL Differences in Database Objects You can view DDL differences for each object pair to analyze the schema comparison result. Execution of Query History You can view basic information about PostgreSQL queries executed in the database for a certain period of time. Execution of Large Scripts You can execute large scripts without opening them in the editor. 
Share Your Feedback with Us

[Download the first version](https://www.devart.com/dbforge/postgresql/schemacompare/download.html) of dbForge Schema Compare for PostgreSQL and [share your thoughts](https://www.devart.com/dbforge/postgresql/schemacompare/feedback.html) with us! For more information about the tools that will help you improve PostgreSQL database development, navigate to [dbForge for PostgreSQL](https://www.devart.com/dbforge/postgresql/).
"} {"url": "https://blog.devart.com/dbforge-search-vs-redgate-sql-search-vs-apexsql-search-a-comparison-of-ssms-add-ins.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Search vs Redgate SQL Search vs ApexSQL Search: A Comparison of SSMS Add-ins By [dbForge Team](https://blog.devart.com/author/dbforge) May 27, 2022

Repeating the word “search” three times in the title of this article might seem like overkill, but what can we do? The feature itself is absolutely crucial to everyone who works with databases in SQL Server Management Studio. You don't want to be left without a way to quickly browse your databases for a required piece of data or a certain object. And since the search feature is basically absent from SSMS, you will need to enhance it with an add-in. That is why we picked three top add-ins that handle search in SSMS; here is a brief overview of each.

A few words about dbForge Search

The first contender is [dbForge Search for SQL Server](https://www.devart.com/dbforge/sql/search/), a free SSMS add-in that helps you easily locate SQL objects and data across your SQL Server databases. With its help, you no longer need to wade through the entire Object Explorer to locate a required column name or text in a stored procedure. Just enter your search string, with or without wildcards, wait a split second, and you're there. The results can be sorted, filtered, and copied, or you can instantly jump to the required object in Object Explorer.
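Under the hood, add-ins of this kind boil down to querying the server's system catalog (on SQL Server, views such as sys.objects and sys.sql_modules) for names and definitions that match a pattern. As a rough, hypothetical analogue, here is the same idea against SQLite's sqlite_master catalog, with SQL LIKE wildcards standing in for the add-ins' wildcard search; the schema is invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER, email TEXT);
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER);
    CREATE VIEW customer_orders AS
        SELECT email, order_id FROM customers JOIN orders USING (customer_id);
""")

def search_objects(conn, pattern):
    """Find objects whose name or definition matches an SQL wildcard pattern."""
    return conn.execute(
        "SELECT type, name FROM sqlite_master "
        "WHERE name LIKE :p OR sql LIKE :p ORDER BY name",
        {"p": pattern},
    ).fetchall()

# 'orders' matches too: its definition contains the column name customer_id
print(search_objects(conn, "%customer%"))
# [('view', 'customer_orders'), ('table', 'customers'), ('table', 'orders')]
```

The add-ins reviewed here do exactly this kind of catalog scan for you, across every database on the server, and present the hits in a sortable grid instead of a raw result set.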
A few words about Redgate SQL Search

Our second contender is Redgate SQL Search, an SSMS/Visual Studio add-in that covers much the same ground. You can find the required fragments of SQL in your tables, views, stored procedures, functions, jobs, and other objects. You can conduct your search across multiple databases and object types, use wildcards and boolean operators in your search, and quickly navigate to and find all references to the objects you have found. It is also available free of charge. If you would like to test the solution yourself, [download Redgate SQL Search](https://www.red-gate.com/products/sql-development/sql-search/installer/).

A few words about ApexSQL Search

This brings us to the third contender, ApexSQL Search, a paid tool that nevertheless comes with a free trial. It offers text search across database objects and the data they contain, allows editing the extended properties of said objects, helps out with the safe renaming of SQL objects, and conveniently visualizes object interdependencies. To check out this solution, feel free to [download ApexSQL Search](https://www.apexsql.com/Download.aspx).

Feature-by-feature comparison

Now let's proceed to a more detailed comparison. Here we tried to single out and compare both basic search features and some of the more advanced ones, all of which are certainly worth your attention.
| Features | dbForge Search | Redgate SQL Search | ApexSQL Search |
| --- | --- | --- | --- |
| Case-sensitive search | + | + | + |
| Search for an exact word match | + | + | + |
| Search with wildcards | + | + | + |
| Search across multiple databases | + | + | + |
| Search all database objects | + | + | + |
| Search all data | + | – | + |
| Search SQL jobs | – | + | – |
| Navigate to Object Explorer | + | + | + |
| Show matching data for each found object | + | – | – |
| Group search results by object type | + | – | + |
| Copy cells from the grid to clipboard | + | + | + |
| Save search results to XML, CSV, or HTML | – | – | + |
| Show the DDL of each found object | + | – | – |
| Standalone installation | + | + | – |
| Free product | + | + | – |
| Integration with SSMS 2012, 2014, 2016 | + | + | – |
| Integration with SSMS 17, 18, 2016 | + | + | + |

Note: For this comparison, the following product versions were used: dbForge Search v2.6.3, Redgate SQL Search v3.5.5.2703, and ApexSQL Search v2022.01.0182.

Conclusion

Each of the three contenders is definitely strong; however, the two free add-ins have an obvious price advantage, and there is no escaping it. Feature-wise, all of them are close, as you could see, but dbForge Search perhaps takes the cake for sheer quantity. Thus your choice is likely to depend on how convenient you personally find each of these tools.
Speaking of dbForge Search: although you can [download it for free](https://www.devart.com/dbforge/sql/search/download.html) and use it as a standalone add-in, it is typically shipped as part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/), a bundle of 15 essential tools and SSMS add-ins that deliver the following features:

- IntelliSense-like SQL code completion
- Easy formatting with custom profiles
- Smart code refactoring with automatic correction of references to renamed objects
- Debugging of stored procedures, triggers, and functions
- Comparison and synchronization of table data and entire database schemas
- Visual query building
- Generation of meaningful test data
- Data analysis and reporting
- Generation of database documentation
- Database administration
- Server performance and event monitoring
- Integration with DevOps

These are only the main features of dbForge SQL Tools; the full list would be far longer and far more rewarding. Yet we wouldn't want to take up too much of your time, especially since SQL Tools are open for free firsthand exploration. That said, if you are interested in pushing the boundaries and expanding your capabilities in SSMS this far, you might as well check it all out yourself and [download SQL Tools for a free 30-day trial](https://www.devart.com/dbforge/sql/sql-tools/download.html).
"} {"url": "https://blog.devart.com/dbforge-source-contol-for-sql-server-21.html", "product_name": "Unknown", "content_type": "Blog", "content": "New version of dbForge Source Control for SQL Server released By [dbForge Team](https://blog.devart.com/author/dbforge) June 2, 2020

We are excited to announce that a new version of the SSMS add-in for SQL Server database change management, [dbForge Source Control for SQL Server, v2.1](https://www.devart.com/dbforge/sql/source-control/), has just been rolled out! We keep working on making our [database tools](https://www.devart.com/dbforge/) as useful, convenient, and effective as possible. In this release, we have made the notifications about errors that may occur under certain circumstances more informative, so that such errors are much easier to detect and eliminate.

Availability

[Download](https://www.devart.com/dbforge/sql/source-control/download.html) the new version of dbForge Source Control for SQL Server and [share your thoughts](https://www.devart.com/dbforge/sql/source-control/feedback.html) about the product. Your feedback helps us improve the tool according to your needs.
"} {"url": "https://blog.devart.com/dbforge-source-control-versus-visual-studio-version-control-whats-different.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Source Control versus Visual 
Studio Version Control: What’s Different? By [dbForge Team](https://blog.devart.com/author/dbforge) May 26, 2021 Having a database under source control brings huge benefits. That is why we decided to review and compare two popular version control tools, the dbForge Source Control add-in for SSMS and Visual Studio Version Control, to help you pick the tool that best meets your needs. A version control system allows a team to work on the same database objects, share code changes among team members, and track the history of database versions. All of these factors contribute to database development efficiency and provide database stability. However, it is important to choose a tool that satisfies your development needs and meets your company’s requirements. So without further ado, let’s plunge into a comparison of the two tools currently popular on the market. Managing Database Objects First and foremost, let’s mention the most striking difference between [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) and [Visual Studio](https://docs.microsoft.com/en-us/visualstudio/version-control/?view=vs-2019) Version Control: dbForge Source Control lets you work directly with database objects, whereas VS Version Control works with files only. This means that if you want to change a particular database object in Visual Studio, you need to edit the file directly, generate a script with the changes, save the script under the correct name, and also update the dependent objects. Only after that can you version-control your changes, which is a lot of work that can be avoided. With the dbForge Source Control add-in, you can simply modify the objects and let the tool do all the work for you. 
It determines which changes are necessary by comparing the current state of the database with the version in source control. This allows you to see the changes at once and analyze the results. Besides, you don’t have to worry about dependent objects. Therefore, when it comes to managing database objects, dbForge Source Control is a clear winner, as it helps you focus on what’s important and automates routine tasks. Handling Static Data A database is primarily used to store and access data. That is why, when version-controlling a database, it is important to track not only changes in schema objects but also the database’s static data. In that respect, again, dbForge Source Control takes precedence over Visual Studio Version Control: it allows version-controlling static data. Static data, also known as lookup data, is a set of predefined values that rarely change, for instance, country or city names, zip codes, or the names of departments within a company. Version-controlling static data matters because it is critical to proper database processing and plays a key role in most database transactions, so it is important to know who updated it, when, and why. Within dbForge Source Control, you can easily link your static data to version control, commit any changes, and resolve conflicts if necessary. Plus, you can see the changes in a data grid that is somewhat similar to an Excel spreadsheet. 
Integration with Version Control Systems Both tools support the most widely used modern version control system, Git, as well as cloud-based solutions such as GitHub, [GitLab](https://www.devart.com/dbforge/sql/source-control/how-to-set-up-source-control-for-gitlab.html), [Bitbucket](https://www.devart.com/dbforge/sql/source-control/how-to-set-up-source-control-for-bitbucket.html), and [Git in Azure DevOps](https://www.devart.com/dbforge/sql/source-control/version-controlling-git-in-azure-devops.html), in addition to [TFVC in Azure DevOps](https://www.devart.com/dbforge/sql/source-control/version-controlling-tfs-on-azure-devops.html). However, Source Control has broader integration capabilities out of the box. It can also link your database to the following version control systems: Apache Subversion (SVN) Mercurial (Hg) Perforce (P4) SourceGear Vault That said, Visual Studio Version Control provides a rich selection of extensions, including ones for the above-mentioned version control systems. So, in this respect, although Source Control has its obvious benefits, both tools strive to provide a range of integration choices, making it easy to link to any VCS. Working with Git Here, we should talk more about the Git version control system, as it has won the hearts of millions of users. Visual Studio introduced an improved Git experience that incorporates several new features, allowing users to work with Git more simply and effectively. First, it became really easy to initialize a local Git repository and push it to GitHub or Azure Repos. Also, a new Git menu and Git tool window allow users to create and manage branches and commit code changes with just a few clicks. Thus, we may conclude that Visual Studio Version Control has made considerable progress and outperforms its counterpart in this respect. 
Managing a Local Folder It is often necessary to link your database to a local folder that was created with a third-party tool. Let’s see how both tools accomplish this task. dbForge Source Control has a working folder feature. A working folder comprises a set of SQL script files that represent your database. The tool allows you to source-control a SQL database working folder with your version control system in the same way you version-control other files. In the Source Control Manager, you can view the changes on the grid, commit or revert them, and identify the owner. The Manager also shows the differences between the local and remote copies, highlighted in different colors. Visual Studio Version Control doesn’t provide the same feature. However, it has a Workspace Mapping option that allows you to map your entire project to a local folder and then keep track of the changes in this folder. The option is available under the advanced Source Control options. Managing the History of Changes When using a version control system, it’s essential to keep the history of changes and know who introduced them and when. A complete history of changes allows you to return to older versions, compare the differences between versions, and troubleshoot any issues. Both tools let you do the following: View details of each commit, including date, author, and comment Check a list of code changes introduced with each commit Compare the local version with the remote one However, the dbForge tool has a significant advantage because it also allows you to: View the history of changes for the entire database or for separate objects Check the differences in database objects rather than files Identify DDL differences for each database object Hence, dbForge Source Control wins a few extra points in this respect. 
Database Development Models There are two main database development models: shared and dedicated. While version-controlling your database, it is important to be able to choose between the two. Within dbForge Source Control, you can opt for either of the two [database development models](https://blog.devart.com/shared-dedicated-development-models.html). Choosing a development model helps establish the most convenient working environment, one where you can best adjust the database development process to your team’s needs. The dedicated model allows developers to work independently on their local copies and then push changes to a central repository. Using this model, developers can test changes locally and prevent their pushes from breaking other developers’ code. In contrast, the shared model implies that the team works on the same database copy. This model allows the team to see the latest updates as soon as they arrive. However, a developer can easily overwrite changes made by other team members, so one has to be careful. VS Version Control doesn’t provide a similar feature for version-controlling databases. Handling Conflicts When a team of developers works on a project and commits changes, conflicts are unavoidable. Both tools deliver functionality for identifying and resolving conflicts in a convenient way. Visual Studio provides a Merge Editor, a three-way merge tool that displays the incoming changes, your current changes, and the result of the merge. Using the interface, you can accept all changes from one side or the other, or accept individual changes from either side. In dbForge Source Control, conflicts are displayed in the Conflicts section of the Source Control Document. 
To resolve a conflict, you can click Get Local to override remote changes with your local ones, or Get Remote to accept the remote changes. Besides, the Source Control Document displays DDL diffs for the conflicts, so you can clearly see the reason for each conflict. Conclusion To summarize, the two tools under comparison each have their benefits and flaws. However, dbForge Source Control is clearly more convenient when it comes to version-controlling a database and the objects it contains, whereas Visual Studio Version Control is better suited to managing files. Besides, Visual Studio is a great IDE often used for writing and editing code. 
"} {"url": "https://blog.devart.com/dbforge-sql-complete-extends-functionality.html", "product_name": "Unknown", "content_type": "Blog", "content": "SQL Complete extends its functionality By [dbForge Team](https://blog.devart.com/author/dbforge) October 3, 2011 The Devart Team is glad to announce the new release of [SQL Complete v3.1](https://www.devart.com/dbforge/sql/sqlcomplete/). The new version includes improvements in both the Standard and free Express editions to make your work easier, more effective, and more convenient. SQL Complete SSMS add-in New Features Linked servers support The new version of SQL Complete supports IntelliSense when working with linked servers. Now you can type your queries with the kind of IntelliSense you are accustomed to, even when taking full advantage of a linked servers configuration. SQL statements support extended The Standard edition now supports GRANT, REVOKE, DENY, ENABLE/DISABLE TRIGGER, and ALTER SCHEMA statements. 
New Highlight Occurrences option The new option allows you to turn highlighting of occurrences on and off. Besides, all existing options were rearranged to improve the product’s usability. Improvements Code completion usability We revised the rules for showing the suggestion list, especially when [editing existing SQL queries](https://www.devart.com/dbforge/sql/studio/sql-editor.html). Now the suggestion list does not appear for arithmetic operations, numbers, or semicolons. Installation process customization We improved the product’s usability based on users’ requests. Now you can select the development environment in which SQL Complete will function when installing the add-in: just select the needed environment from the list on a special page of the installation wizard. Tracing support We added tracing support to help us find and fix errors in the application. If you encounter any problems with SQL Complete, start tracing and send the created file to Devart. Express Edition Extension SQL statements support extended Support for DROP, EXEC, DECLARE, and SET statements was added. Variables and parameters support Qualify Column Names option is available Availability You can give the updated dbForge SQL Complete a test drive by downloading the 14-day trial of the Standard edition from the product [download page](http://devart.com/dbforge/sql/sqlcomplete/download.html). If you don’t need advanced code completion features but want more than SSMS IntelliSense can give, you are welcome to try the free Express edition. To leave feedback, go to the [SQL Complete feedback page](https://www.devart.com/dbforge/sql/sqlcomplete/feedback.html). The Devart team looks forward to receiving your comments and suggestions. 
"} {"url": "https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html", "product_name": "Unknown", "content_type": "Blog", "content": "SQL ALTER COLUMN Command: Quickly Change Data Type and Size By [Rosemary 
Asufi](https://blog.devart.com/author/rosemarya) May 5, 2025 ALTER COLUMN is one of the simplest SQL commands to run—and one of the fastest ways to break a system. It looks harmless: one line of code, a new data type, hit execute. But that illusion of simplicity is exactly how production goes down. Behind that single command, you’re rewriting a shared dependency. Every view, every stored procedure, every query, every integration—it all ties back to that column. If you don’t know what’s connected to it, you don’t know what you’re breaking. This is why a precise understanding of ALTER COLUMN is non-negotiable—and why serious engineers rely on tools like [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) to surface every dependency before they make a move. This guide shows you how to alter columns without breaking that trust—through precision, control, and an understanding of everything at stake. Let’s dive in! Table of contents What is the SQL ALTER COLUMN command? How to change data type in SQL with ALTER COLUMN ALTER COLUMN in SQL Server (T-SQL) Changing column size with SQL ALTER COLUMN Common errors with SQL ALTER COLUMN Best practices with SQL ALTER COLUMN Enhance your SQL ALTER COLUMN tasks with dbForge Studio for SQL Server Conclusion Frequently Asked Questions (FAQ) What is the SQL ALTER COLUMN command? The [SQL ALTER COLUMN](https://blog.devart.com/clone-colums-data-in-sql-server.html) clause, used within the broader ALTER TABLE statement, is a key part of how you modify columns in SQL, including changing their data type, size, and nullability. It’s a DDL (Data Definition Language) operation that enables schema-level changes without dropping and recreating the table. This makes it essential for preserving data integrity and minimizing disruption in production. 
Its primary purpose is to redefine a column’s attributes, including: Data type changes (e.g., INT to BIGINT, or VARCHAR(100) to VARCHAR(255)) Length or precision updates (e.g., for VARCHAR, DECIMAL, FLOAT) Nullability toggles (NULL / NOT NULL) Collation adjustments for character columns While standard SQL supports a limited set of alterations, Transact-SQL (T-SQL)—SQL Server’s proprietary extension—offers broader support with stricter rules and syntax. -- T-SQL example: Increase column length and enforce NOT NULL \n-- Ensure no NULL values exist in LastName before executing \nALTER TABLE dbo.Employees \nALTER COLUMN LastName VARCHAR(200) NOT NULL; This command increases column size without dropping data. Pro tip : Reducing column size or converting between incompatible types (e.g., VARCHAR to INT) can cause errors and may require manual fixes. Typical use cases Schema evolution : Convert INT to DECIMAL(10,2) for currency precision Normalization : Switch from CHAR to VARCHAR to optimize storage Data integrity : Enforce NOT NULL to validate required fields Performance tuning : Adjust types or lengths to improve indexing and reduce I/O In SQL Server, ALTER COLUMN affects keys, indexes, triggers—everything tied to the schema. Next, we’ll walk through how to make those changes safely and without surprises. How to change data type in SQL with ALTER COLUMN Use ALTER COLUMN to change the data type of a column in SQL without dropping the table. It’s ideal for refining legacy schemas or scaling models, but in production, changes like this require careful planning to avoid data loss or downtime. Step-by-step instructions to change data type To [change data type in SQL](https://blog.devart.com/sql-server-primary-key.html) safely, follow these essential steps. 1. Assess compatibility Check that all existing values can be converted to the target type, and identify any values that cannot be converted. 
-- Check for invalid conversions \nSELECT * FROM [TableName] \nWHERE TRY_CAST([ColumnName] AS [NewDataType]) IS NULL; 2. Modify the column Update the column’s type using ALTER TABLE. SQL Server requires explicit nullability. ALTER TABLE TableName \nALTER COLUMN ColumnName NewDataType [NULL | NOT NULL]; 3. Check constraints and dependencies If the column is part of a key, index, view, trigger, or computed column, you’ll need to drop and recreate those objects in the right sequence. Use system views like sys.sql_expression_dependencies or sys.dm_sql_referenced_entities to identify them. Avoid deprecated tools like sp_depends. 4. Validate in a staging environment Before applying changes to production, test the alteration on a copy of the table to detect any unexpected data issues, performance degradation, or application breakages. 5. Monitor locking and performance In SQL Server, altering large columns, especially on high-volume tables, can cause schema locks and impact performance. Plan these changes during low-traffic windows or use online schema change tools where possible. Pro tip: Schema changes often go beyond a single column. If you’re also planning to expand the table and wondering [how to add multiple columns in SQL](https://www.devart.com/dbforge/sql/studio/add-column-to-table-sql-server.html) , use one ALTER TABLE command with multiple ADD clauses: ALTER TABLE Employees \nADD StartDate DATE, Status VARCHAR(50); This method is more efficient than issuing separate statements and helps maintain a cleaner version history when working in CI/CD environments. Common ALTER COLUMN changes and when to use them Knowing [how to change data type in SQL](https://blog.devart.com/alias-for-columns-in-sql-query.html) is important—especially when tuning performance, improving precision, or adapting to larger data volumes. Below are typical transformations, real SQL examples, and why they matter. 
Common code examples -- INT to BIGINT \nALTER TABLE Orders \nALTER COLUMN OrderID BIGINT NOT NULL; \n \n-- VARCHAR(100) to VARCHAR(255) \nALTER TABLE Customers \nALTER COLUMN EmailAddress VARCHAR(255); \n \n-- FLOAT to DECIMAL \nALTER TABLE Transactions \nALTER COLUMN Amount DECIMAL(12,2); Reference table: When to use each change

| From | To | Use case |
| --- | --- | --- |
| INT | BIGINT | Support larger numeric ranges or ID values |
| CHAR(n) | VARCHAR(n) | Save space with variable-length strings |
| FLOAT | DECIMAL(p,s) | Improve accuracy for currency or scientific values |
| TEXT (legacy) | VARCHAR(MAX) | Replace deprecated types with modern equivalents |
| DATETIME | DATETIME2 | Gain higher precision and extended date ranges |

Pro tip : Changing data types can invalidate indexes and execution plans. Always test performance and update statistics after deployment. ALTER COLUMN in SQL Server (T-SQL) In Microsoft SQL Server, altering a column is done using the ALTER TABLE … ALTER COLUMN command—which is part of T-SQL, SQL Server’s proprietary extension of standard SQL. It’s the standard approach if you want to alter a column’s datatype in SQL Server while preserving the table structure. However, before making changes, it’s important to understand the [ALTER COLUMN SQL Server](https://blog.devart.com/sql-database-design-basics-with-example.html) syntax to avoid unexpected errors or data issues. 
The operation is commonly used to: Expand or reduce the storage size of a column (e.g., VARCHAR(100) → VARCHAR(255)) Change a data type to support different precision (e.g., INT → BIGINT, or FLOAT → DECIMAL) Add or remove NULL constraints T-SQL syntax for ALTER COLUMN ALTER TABLE [schema_name].[table_name] \nALTER COLUMN [column_name] [new_data_type] [NULL | NOT NULL]; Example 1: Increase VARCHAR size ALTER TABLE dbo.Customers \nALTER COLUMN EmailAddress VARCHAR(255) NOT NULL; Example 2: Change numeric precision ALTER TABLE dbo.Payments \nALTER COLUMN Amount DECIMAL(12,2) NOT NULL; Please note: If you omit NULL or NOT NULL in an ALTER COLUMN statement, SQL Server does not preserve the column’s current nullability; it resets it to the database default (typically NULL). Always state nullability explicitly, even if you’re not changing it. Key considerations and restrictions Before using ALTER COLUMN, keep these critical limitations in mind. 1. Column constraints You cannot use ALTER COLUMN to modify certain constraints (e.g., default values, primary keys). These must be dropped and recreated using ALTER TABLE … DROP CONSTRAINT and ADD CONSTRAINT. 2. Indexed columns Altering a column that is part of an index, primary key, or foreign key requires dropping those dependencies first. SQL Server enforces strict validation to prevent structural inconsistencies. 3. Data truncation risk SQL Server does not automatically truncate data when reducing column size. Attempting to reduce a VARCHAR(255) column to VARCHAR(100) with longer existing values will trigger a runtime error. 4. Recompilation and plan caching Schema changes trigger query plan invalidation. After altering a column, SQL Server may recompile affected queries, so performance testing post-deployment is essential. 5. Older compatibility levels Some syntax behaviors may vary based on the database’s compatibility level (e.g., SQL Server 2012 vs. SQL Server 2022). Always validate changes in the context of the actual server version. 
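The drop, alter, recreate sequence from considerations 1 and 2 above can be sketched as follows. This is a hypothetical illustration: the dbo.Orders table and the constraint name DF_Orders_Status are assumptions for the example, not objects from this article.

```sql
-- Hypothetical example: dbo.Orders has a DEFAULT constraint on Status,
-- so ALTER COLUMN alone would be rejected with a dependency error.

-- 1. Drop the dependent constraint first.
ALTER TABLE dbo.Orders DROP CONSTRAINT DF_Orders_Status;

-- 2. Alter the column while nothing depends on it.
ALTER TABLE dbo.Orders
ALTER COLUMN Status VARCHAR(30) NOT NULL;

-- 3. Recreate the constraint against the new column definition.
ALTER TABLE dbo.Orders
ADD CONSTRAINT DF_Orders_Status DEFAULT 'Pending' FOR Status;
```

Running all three steps in a single transaction keeps the table from being left without its default if a later step fails.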
ALTER TABLE MODIFY COLUMN in SQL Server Although ALTER COLUMN is the correct T-SQL syntax, developers coming from MySQL or Oracle backgrounds might expect to use SQL MODIFY COLUMN instead. This confusion often arises during cross-platform migrations or when working in polyglot environments. Knowing the difference between [SQL MODIFY COLUMN](https://blog.devart.com/column-level-sql-server-encryption-example-using-sql-complete.html) in MySQL and the correct ALTER COLUMN syntax in SQL Server can help avoid syntax errors early in the development process. -- Invalid in SQL Server \nALTER TABLE Employees MODIFY ColumnName VARCHAR(100); \n \n-- Correct T-SQL syntax \nALTER TABLE Employees ALTER COLUMN ColumnName VARCHAR(100); Practical example: changing a nullable column to NOT NULL -- Ensure all rows have values before enforcing NOT NULL \nUPDATE dbo.Users \nSET Username = 'N/A' \nWHERE Username IS NULL; \n \n-- Now change the nullability \nALTER TABLE dbo.Users \nALTER COLUMN Username VARCHAR(50) NOT NULL; This sequence is often required in production systems where application logic begins to enforce stricter rules at the database level. Changing column size with SQL ALTER COLUMN Changing a column’s size—like increasing a VARCHAR or DECIMAL—is common, but not risk-free. Without validating existing data, dependencies, or performance impact, these changes can trigger runtime errors and integrity issues. Syntax for adjusting column size ALTER TABLE [schema_name].[table_name] \nALTER COLUMN [column_name] [data_type](new_length_or_precision); Example 1: expanding a VARCHAR column ALTER TABLE dbo.Customers \nALTER COLUMN EmailAddress VARCHAR(255) NOT NULL; This increases the column length—typically from something like VARCHAR(100)—to support longer string values. 
Since you’re increasing capacity, this operation is non-destructive and preserves existing data. Example 2: increasing DECIMAL precision ALTER TABLE dbo.Orders \nALTER COLUMN TotalAmount DECIMAL(18,4); This revision allows for more precise numeric values, which is often critical in financial applications where rounding errors from floating-point types (like FLOAT or REAL) are unacceptable. What to watch for when changing column size Changing column size seems simple, but it can break systems if you miss these checks. Check performance impact before increasing column size Expanding a column’s size—like going from VARCHAR(100) to VARCHAR(255)—doesn’t delete or modify existing data. SQL Server handles this change without complaint. But don’t assume it’s safe. On large tables, this operation can still lock the table or affect indexes. Always run the change in a staging environment first. Validate existing data before shrinking columns If you reduce the size of a column, SQL Server will block the operation if any data exceeds the new limit. You must check and clean up data beforehand. SELECT EmailAddress \nFROM dbo.Customers \nWHERE LEN(EmailAddress) > 100; If rows are returned, resolve them before altering the column. Otherwise, the command will fail. Drop indexes and constraints when required If the column you’re modifying is part of a primary key, foreign key, unique constraint, or index, SQL Server won’t allow the change. You must: Drop the constraint or index. Alter the column. Recreate the constraint or index. There’s no workaround—SQL Server enforces this to protect data integrity. Update application code and integrations When you change a column’s size, update everything that depends on it. 
That includes: Input validations in frontend forms API schemas and client-side models ETL processes and reporting tools Stored procedures and views Failing to update upstream and downstream systems will result in broken forms, rejected requests, and inconsistent data handling. Plan for locks and performance hits on large tables Altering a column on a large table can trigger schema locks and high I/O. SQL Server might rewrite rows or escalate locks depending on the data type and structure. Monitor tempdb and the transaction log closely if the table handles high concurrency. Run the change during a maintenance window, and never push it straight to production without testing. Practical tips for enterprise environments In enterprise systems, column changes ripple far beyond the database. Keep these practices in mind to avoid downstream failures. Review all application-layer constraints (e.g., UI field limits, API contracts) that might assume a fixed column size. Run schema diff checks in CI/CD environments using dbForge DevOps Automation for SQL Server to ensure consistent database changes across different environments. Coordinate with data consumers , including reporting systems, BI pipelines, or external integrations, before deploying any structural change. Common errors with SQL ALTER COLUMN Even experienced developers run into issues when altering columns—especially in production systems where constraints, dependencies, and data volume raise the stakes. Below are the most frequent errors you’ll encounter with ALTER COLUMN in SQL Server, along with guidance on how to avoid or resolve them. 1. Incorrect syntax or missing keywords SQL Server enforces strict syntax rules when altering a column. One of the most frequent mistakes is omitting NULL or NOT NULL, even when you’re only changing the data type. SQL Server won’t infer nullability—you must state it explicitly. 
Example of risky syntax

```sql
-- Risky: nullability omitted
ALTER TABLE Employees
ALTER COLUMN LastName VARCHAR(100);
```

This statement runs, but because the nullability is omitted, the column is reset to the database default (typically nullable), which can silently drop an existing NOT NULL constraint. Specify the nullability explicitly:

Correct syntax

```sql
ALTER TABLE Employees
ALTER COLUMN LastName VARCHAR(100) NOT NULL;
```

Always verify the current column definition before making changes, so you include the correct nullability in the statement.

2. Reducing column size without validating data

One of the riskiest mistakes is shrinking a column without checking whether existing values still fit. SQL Server will block the operation if truncation is possible—but ETL tools or scripts outside SQL Server may still silently cut data.

What happens when you skip validation?

```sql
ALTER TABLE Products
ALTER COLUMN ProductName VARCHAR(50);
```

If any ProductName exceeds 50 characters, SQL Server throws an error. In environments without proper transaction handling, this can result in partial, inconsistent changes.

What to do instead? Run this check before applying the change.

```sql
SELECT ProductName
FROM Products
WHERE LEN(ProductName) > 50;
```

Clean or trim values manually, or reconsider whether downsizing the column is necessary.

3. Ignoring constraints and dependencies

Trying to alter a column involved in a constraint, index, trigger, or computed column will trigger an error. SQL Server blocks the change to preserve structural integrity.

The error when constraints are overlooked

```
Msg 5074, Level 16, State 1
The object 'PK_Orders' is dependent on column 'OrderID'.
```

This often appears when altering a column that’s part of a primary key.

What to do instead?

1. Drop the constraint:

```sql
ALTER TABLE Orders DROP CONSTRAINT PK_Orders;
```

2. Alter the column.
3. Recreate the constraint:

```sql
ALTER TABLE Orders ADD CONSTRAINT PK_Orders PRIMARY KEY (OrderID);
```

Use system views like sys.foreign_keys or tools like SSMS’s View Dependencies to spot conflicts before making changes.

4. Changing data types that aren’t compatible

SQL Server can’t always convert data from one type to another.
For example, trying to change a column from VARCHAR to INT will fail if the column contains any non-numeric characters.

Error caused by invalid data conversion

```sql
ALTER TABLE Sales
ALTER COLUMN Quantity INT;
```

If the Quantity column contains values like ‘ten’ or ‘five’, the command will fail with a conversion error.

What to do instead? Use TRY_CAST() to identify problematic values (it is more reliable than ISNUMERIC(), which accepts symbols such as ‘$’ and ‘,’):

```sql
SELECT Quantity
FROM Sales
WHERE TRY_CAST(Quantity AS INT) IS NULL
  AND Quantity IS NOT NULL;
```

Clean the data or use a staging table to migrate and validate records before applying the type change.

5. Assuming the change is safe because it works in development

Just because a column change worked in a development environment doesn’t mean it’s safe in production. Dev databases often have limited data, no indexes, and fewer constraints. In production, the same change could:

- Lock critical tables
- Violate constraints
- Fail due to data size or precision issues

What to do instead? Always test on production-like data. Use a copy of the production schema, including indexes and constraints, to validate your change under realistic conditions.

Best practices with SQL ALTER COLUMN

Altering a column is not a lightweight operation. It impacts schema integrity, data availability, and downstream systems. Treat it like a production deployment—not a simple tweak. Here’s how to execute it like a professional.

Back up the database every single time

Always ensure you have a restorable backup. If the operation fails or corrupts data, rollback is your only safety net.

```sql
BACKUP DATABASE YourDB TO DISK = 'D:\Backup\YourDB.bak';
```

Test the change against production-grade data

Dev environments are safe—but misleading. If you want real answers, test in a staging environment that mirrors production. Clone the database, load it with real-world data volumes, rebuild indexes, and enforce constraints, then run your change. If it locks tables, breaks views, or tanks performance, you’ll find out now—not after it hits users.
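As a minimal sketch of the backup-first practice above (the database name and file path are illustrative), you can take a checksummed backup and confirm it is readable before touching the schema:

```sql
-- Back up with CHECKSUM so page-level corruption is caught at backup time
BACKUP DATABASE YourDB
TO DISK = 'D:\Backup\YourDB_pre_alter.bak'
WITH CHECKSUM, INIT;

-- Verify the backup set is complete and readable (does not restore any data)
RESTORE VERIFYONLY
FROM DISK = 'D:\Backup\YourDB_pre_alter.bak'
WITH CHECKSUM;
```

Note that RESTORE VERIFYONLY is not a full restore test; a periodic trial restore to a scratch server remains the only definitive proof that the backup is usable.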
Find every dependency before you break something

Your column is probably tied to views, triggers, procedures, constraints, or apps. Assume it is—then confirm it. Use this query:

```sql
SELECT OBJECT_NAME(referencing_id), referenced_entity_name
FROM sys.sql_expression_dependencies
WHERE referenced_entity_name = 'YourColumn';
```

Or query sys.dm_sql_referenced_entities. You’re not just changing a column—you’re changing everything connected to it. Don’t break production because you skipped a dependency check.

Coordinate and automate your changes

Schema changes affect everyone—from BI to DevOps. Communicate the impact early, and script every change into version control. Use dbForge DevOps Automation for SQL Server to automate database deployments and ensure consistent and efficient CI/CD integration.

Monitor for locking and I/O spikes

Schema changes can lock tables, inflate transaction logs, and spike I/O—especially on high-volume systems. Watch:

- sys.dm_exec_requests for blocking
- Transaction log usage
- tempdb pressure
- Execution plans (check for plan invalidation)

Run changes during off-peak hours. Or better: use a controlled deployment window. You’re not just changing a column; you’re affecting everything hitting that table.

Enhance your SQL ALTER COLUMN tasks with dbForge Studio for SQL Server

In enterprise environments, column changes are never casual. They impact data integrity, performance, and uptime. Manual T-SQL may get the job done—but it leaves too much room for error, oversight, and inconsistency. You need a tool like dbForge Studio that eliminates guesswork and puts you in full control.

dbForge Studio is a SQL editor and [SQL Server GUI](https://www.devart.com/dbforge/sql/studio/) tool—built to handle complex schema changes, including ALTER COLUMN operations, with precision, visibility, and rollback safety. With it, you can:

- Edit column types, lengths, and nullability through a guided UI—no syntax mistakes.
- Instantly surface dependencies before making changes.
- Generate auditable migration scripts with rollback plans.
- Analyze change impact before execution, so nothing breaks downstream.
- Synchronize schema changes across environments with precision.
- Apply changes transactionally, with error handling and traceability built in.

[Download the free trial of dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) and upgrade how you manage schema changes—starting today.

Video Tutorial: Changing Column Data Types with dbForge Studio

Want a quick visual guide? Watch this concise tutorial on how to safely perform a data type change in SQL Server using dbForge Studio. The video highlights how the tool makes the process more user-friendly and error-resistant—ideal for handling schema changes with confidence. Click the thumbnail to watch how to use the graphical interface to execute ALTER COLUMN operations safely and efficiently.

Conclusion

Altering a column is not a casual operation. It reshapes your schema, touches your constraints, and impacts every query that depends on it. In production, mistakes here cost more than time—they cost trust. Approach every ALTER COLUMN with the same rigor you apply to code: plan it, test it, control it. Validate your assumptions, track your dependencies, and own the change from start to finish.

Also remember: relying on raw SQL alone invites risk. You need visibility into every change, consistency across environments, and a reliable audit trail to stay in control. That’s where tools like dbForge Studio for SQL Server come in. They don’t replace best practices, but rather enforce them at every stage. By doing so, you’re not just modifying structure; you’re preserving the integrity of everything built on top of it.

Frequently Asked Questions (FAQ)

What is the correct syntax for the ALTER COLUMN command in SQL Server?

Use the ALTER TABLE statement followed by ALTER COLUMN, specifying the column name, data type, and nullability.
For example:

```sql
ALTER TABLE Employees
ALTER COLUMN LastName VARCHAR(100) NOT NULL;
```

State the nullability explicitly even if you are not changing that attribute; if you omit it, SQL Server resets the column to the database default (typically nullable).

How do I safely change a column’s data type without losing data?

First, check that all existing values can be converted to the new data type. Use queries with TRY_CAST() or ISNUMERIC() to find incompatible data. Always back up the database and validate the change in staging before applying it to production.

Can I use ALTER TABLE … MODIFY COLUMN in SQL Server?

No. The MODIFY keyword is not valid in T-SQL. Use ALTER TABLE … ALTER COLUMN instead. MODIFY is used in MySQL and Oracle.

What’s the difference between standard SQL and T-SQL for ALTER COLUMN?

Standard SQL and T-SQL differ in syntax and behavior. For example, T-SQL resets a column’s nullability when it is omitted and enforces stricter rules around dependencies. Always consult the SQL Server documentation when working in a Microsoft environment.

Can I change a column that’s part of a primary key or index?

Yes, but only after dropping the constraint or index. SQL Server will not allow you to alter a column involved in constraints, indexes, or computed columns without removing those dependencies first.

How do I modify a column length in a view after altering the base table?

You’ll need to refresh or recreate the view. SQL Server does not automatically propagate changes to dependent views; run sp_refreshview for a non-schema-bound view, or drop and recreate the view with the updated schema.

Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Rosemary Asufi](https://blog.devart.com/author/rosemarya) As a technical content writer, I bring a unique blend of analytical precision and creativity to every article.
I'm passionate about simplifying complex topics around data, connectivity, and digital solutions, making them accessible and practical for audiences across different industries.

How dbForge SQL Tools Accelerated Creating Complex Queries and Boosted Financial Data Management at TC Transcontinental By [Victoria
Shyrokova](https://blog.devart.com/author/victorias) October 31, 2024

The ability to work with databases with increased efficiency and speed can contribute substantially to a company’s success, optimizing the overall workflow and reducing operational costs, as less time needs to be spent on SQL coding, refactoring, and code optimization. However, it takes effort to find a product that helps with these goals while being reliable and stable and delivering fast results without requiring complex onboarding.

About the client

TC Transcontinental is a leading flexible packaging and retail services company in North America that provides extensive services to businesses looking to attract and retain clients.

Business need

As a Senior Programmer Analyst BI at TC Transcontinental, André St-Onge sought an optimal way to streamline financial data management and speed up complex query creation across different environments. It was therefore important to ensure that the chosen solution could handle the scripting part efficiently and help with database schema and data comparison.

Chosen solution

Among the abundance of products that promise this functionality, dbForge SQL Tools provides a clear and easy-to-use set of SSMS add-ins aimed at assisting one’s work with SQL Server, including:

- dbForge SQL Complete for IntelliSense-like code suggestions, accurate code autocompletion, advanced formatting, and reusable code snippets.
- dbForge Query Builder for visual query construction that doesn’t require manual coding and provides everything needed to query data efficiently.
- dbForge Source Control for the ability to commit and revert changes, efficiently resolve conflicts, and compare database versions.
- dbForge Schema Compare for efficient comparison between database structures when different servers are used.
Check the full list of add-ins and features available within [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/).

Achieved results

Implementation of dbForge SQL Tools helped André St-Onge quickly rewrite SQL statements to ensure queries are executed efficiently and, ultimately, to speed up the workflow. Another essential improvement was the ability to compare database schemas across different servers and use the Source Control functionality for extra safety when modifying code and releasing changes from the development environment to production.

The ultimate benefit of the project was an all-encompassing approach that didn’t require major changes to the existing stack while addressing the challenges at hand. All the features could be used directly within SSMS, boosting the overall experience.

Get dbForge SQL Tools for a free 30-day trial today!

This success story is one of many, and yours can be next. Try dbForge SQL Tools for free to explore the robust functionality and level up your experience with SSMS. You don’t have to choose from 15 tools and add-ins, since you get them all to streamline coding, monitor SQL Server performance, migrate data, perform testing, and develop documentation. All you need to level up your workflow with SSMS is within reach with [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/)!

Tags [SQL Server Tools](https://blog.devart.com/tag/sql-server-tools) [success story](https://blog.devart.com/tag/success-story) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
dbForge Studio Awarded Top Rated Backup Software of 2021 by SoftwareWorld By [dbForge Team](https://blog.devart.com/author/dbforge) August 11, 2021

It didn’t take long since the [dbForge Studio](https://www.devart.com/dbforge/studio/) product line [received its latest award](https://blog.devart.com/dbforge-studio-shortlisted-by-capterra-and-awarded-front-runners-2021-by-software-advice.html)—and now we’ve got a new one to tell you about. Our all-in-one development IDEs for the most popular types of databases—[MySQL](https://www.devart.com/dbforge/mysql/studio/) and [SQL Server](https://www.devart.com/dbforge/sql/studio/)—were awarded by SoftwareWorld, a review platform that helps businesses across industries find optimal software solutions by providing unbiased lists of top products by category.

Top Rated Backup Software of 2021

One of these lists, [Top Rated Backup Software of 2021](https://www.softwareworld.co/top-rated-backup-software/), included dbForge Studio among the Top 5 competitors. It is yet another achievement that we owe to the positive reviews of our users (by the way, this list is also based on reviews aggregated from other independent review platforms). We would love to say a big thank you to SoftwareWorld and to each and every user who became part of this recognition.

About dbForge Studio

dbForge Studio is a multifunctional IDE for database development, management, and administration—and its backup and restore functionality is only a small part of what it has to offer. The Studio comprises tools for database design, work with SQL queries, comparison of database schemas and data, database object management, generation of realistic test data and database documentation, data analysis, reporting, and CLI-based automation.
dbForge Studio was designed to empower users and help them speed up and streamline their daily routine. You can learn more about the results achieved by actual businesses from our [success stories](https://www.devart.com/success-story/), or you can simply [get a 30-day free trial of the required Studio](https://www.devart.com/dbforge/) on our website and evaluate it yourself.

Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)

dbForge Studio for MySQL 4.00 Beta Treats MySQL Data like the Users Want By [dbForge Team](https://blog.devart.com/author/dbforge) December 11, 2009

Devart today previewed dbForge Studio for MySQL 4.00, scheduled for release this December, and announced the immediate availability of a beta release that supplies database professionals with the very data management capabilities they want. Having accumulated extensive knowledge of data management preferences and routines, Devart has created [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) 4.00 Beta, which anticipates and delivers the expected capabilities in every phase of dealing with data. Users can manage their data from soup to nuts in one place and with as little effort as usual.

The highlights of dbForge Studio for MySQL 4.00 Beta include:

Data Import. Guided by a click-and-pick wizard, users can easily import data from seven widely popular formats (CSV, DBF, MS Access, MS Excel, ODBC, Text, XML) with a multitude of settings to meticulously tune the data.

Improved Data Editor.
Up to 40 improvements were implemented to make data management quicker and easier. The Data Editor spares you tens of mouse clicks a day, to say nothing of tens of frowns.

Pivot Grid. Users can convert large amounts of data into compact and informative summaries, i.e., pivot tables. They can rearrange any obscure data with a simple drag of the mouse to get the layout that best reveals data relations and dependencies.

Data Reports. Every user can benefit from a modern way of generating reports. Less time and manual work is required, from getting the required data from a database and analyzing it, to printing it as a smart, clear, and stylish report tuned for each particular case.

Virtual Relations on Database Diagram. Besides physically existing foreign key relations, dbForge Studio for MySQL allows creating virtual relations. They help create relations between tables whose storage engine does not support foreign keys. Virtual relations can later be easily converted into physical foreign keys.

Command Line Launch for Data and Schema Synchronization Tools. This gives more freedom to users who perform repeated synchronizations. It only takes setting a desired time to launch the process and then coming back later to check the result.

Schema Comparison Reports. Keeping records of schema changes is now automated. dbForge Studio for MySQL can generate clear and attractive comparison reports upon comparison completion.

Improvements to Better Serve Specific Users’ Needs. The new version contains many essential improvements that increase the efficiency of working with data. Here are some of them: an active database selector added to the SQL Document, SQL formatting improvements, and a redesign of the stored procedure/function editor and the SQL execution parameter editor.
Download a fully functional 30-day trial version of [dbForge Studio for MySQL 4.00 Beta](https://www.devart.com/dbforge/mysql/studio/download.html) and add its features to your arsenal of tools. Tell us what you think about the new version at the [dbForge Studio for MySQL Feedback Page](https://www.devart.com/dbforge/mysql/studio/feedback.html). We are looking forward to your comments and suggestions.

Tags [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [what's new mysql studio](https://blog.devart.com/tag/whats-new-in-mysql-studio) [dbForge Team](https://blog.devart.com/author/dbforge)

Discover new dbForge Studio for MySQL 9.0 hugely packed with new stuff By [dbForge Team](https://blog.devart.com/author/dbforge) May 15, 2020

We are excited to announce that the new version of our MySQL IDE, [dbForge Studio for MySQL, v9.0](https://www.devart.com/dbforge/mysql/studio/), has just been rolled out! The new version delivers completely new functionality, including Find Invalid MySQL Objects, provides new connectivity options, and contains massive improvements to existing features, including MySQL Data and Schema Compare, Data Import and Export, Code Completion, and much more. So, without further ado, let’s take a look at what version 9.0 has in store!

Find Invalid Objects

This brand-new feature allows searching through database schemas for invalid objects that require recompilation.

Support for New Objects

The updated dbForge Studio for MySQL allows working with the following objects introduced in MariaDB 10.3: Packages and Sequences.

Connectivity Improvements

Connectivity support for MariaDB 10.5 and SkySQL has been implemented to ensure that users of dbForge Studio for MySQL can work with the latest database engines and cloud database solutions. The Property window now displays Server type for MySQL, MariaDB, Percona, Amazon, Alibaba Cloud, and Tencent Cloud.
The Connections section of the System window displays Server Type for MySQL, MariaDB, Percona, Amazon, Alibaba Cloud, and Tencent Cloud. The Database Connection Properties and Test Connection windows have been completely redesigned.

SQL Document Improvements

The --nowarn and --endnowarn tags have been added to the Execution warnings functionality for excluding blocks of code from the check. An option to export execution history in the CSV file format has been added.

Code Completion Improvements

Code completion is now available even in the body of triggers and events.

Data Editor Improvements

Cached Updates mode allows you to control data editing within a single database object. For an object with the mode turned on, updates are stored locally on the client side until you click the Apply Changes button.

Schema Compare Improvements

Scripts Folder Comparison allows comparing MySQL database schemas with the ones stored locally in script folders and vice versa. Object Filter allows filtering objects right in the Comparison Document. Applying advanced filters makes the analysis of schema comparison results more effective, informative, and bespoke. The feature also allows applying multiple filters and creating custom filters with the Filter Editor, which can be saved and used for further comparisons. The Schema Comparison Report window has been completely redesigned, and the report generation options neatly regrouped. In addition, users can select how script diffs will look in their reports. HTML reports got a new smooth design and became more informative: apart from information about objects, the HTML reports now include actual script differences. The Ignore DEFINER and SQL SECURITY clauses option has been split into two options: Ignore DEFINER clauses and Ignore SQL SECURITY clauses. New Ignore row format table option. New Ignore AUTO_INCREMENT option.
Data Compare Improvements

Scripts Folder Comparison allows comparing MySQL data with the data stored locally in script folders and vice versa. Redesigned Data Compare Control: viewing exact data differences has become smoother and clearer; the grid tabs became more informative, with crisper highlighting of data differences. In the redesigned Data Comparison Report window, all report generation options have been neatly regrouped. In addition, users can select how script diffs will look in their reports. Redesigned Data Comparison Report in CSV: dbForge Studio for MySQL now generates several report files in CSV format; one of them contains summary results, and the rest contain specific data diff info. The Data Synchronization Wizard includes an option to add a timestamp to the name of the data synchronization script file. The wizard has also been extended with an option to set default values, which helps synchronize a NULL source column with a NOT NULL target column.

Database Backup Improvements

An option to exclude reference tables from the backup has been added. Since reference data is not modified or updated regularly, it does not require backup as often as the rest of the database objects.

Data Export/Import Improvements

The new Output tab has been added to the Data Import Wizard and Data Export Wizard to specify the path to a generated data file. The Options window has been expanded with the Data Export\CSV tab to set default delimiter options for the CSV file format. All tabs of the Data Export Wizard now show the selected data export format.

Other Improvements

The command-line prompt has been expanded with a full list of available exit codes. Activation of dbForge Studio for MySQL can now be done via the command-line interface.
Availability
[Download](https://www.devart.com/dbforge/mysql/studio/download.html) the new version of dbForge Studio for MySQL and [share your thoughts](https://www.devart.com/dbforge/mysql/studio/feedback.html) about the product. Your feedback helps us to improve the tool according to your needs.
[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) dbForge Studio for MySQL Awarded G2 Summer 2021 Badges By [dbForge Team](https://blog.devart.com/author/dbforge) August 26, 2021
It is a pleasure to announce that G2 has awarded dbForge Studio for MySQL the ‘Leader Summer 2021’ and ‘Momentum Leader Summer 2021’ badges in the [DB Backup](https://www.g2.com/categories/database-backup) category and the ‘Users Love Us’ badge in the [Database Management Systems (DBMS)](https://www.g2.com/categories/database-backup) and [Database Comparison](https://www.g2.com/categories/database-comparison) categories. G2 is a peer-to-peer review website focused on business services and software. Notably, G2 takes a very responsible approach to collecting and processing user reviews: the final results combine reviews from authorized users with additional material from other trusted sources (for example, social media). This allows the platform to build a credible and impartial picture of the products rather than follow subjective opinions. DB Backup: database backup solutions help businesses keep their data safe and sound. Be it user error, corrupt data, or hardware failure, [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) has got your back. Database Management Systems (DBMS) and Database Comparison: G2 users consider [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) one of the most convenient and reliable tools on the market for database management and comparison.
[dbForge Studio](https://www.devart.com/dbforge/studio/) is a product line of advanced solutions with an intuitive graphical user interface, created to serve as assistants in relational database development, design, analysis, automation, administration, and more. If you are interested in checking out Studio for MySQL, a [30-day free trial](https://www.devart.com/dbforge/mysql/studio/download.html) is available for your convenience. To sum up, we would like to point out that these achievements would not have been possible without our loyal users, their feedback, and constructive criticism. These are the things that keep us going and give us the endurance to overcome any challenges we might face on the way.
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) dbForge Studio for MySQL Gains More User Recognition with 3 Prestigious Awards By [dbForge Team](https://blog.devart.com/author/dbforge) December 23, 2020
[dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is one of the most renowned and valuable Devart products: a comprehensive solution with an intuitive interface for MySQL and MariaDB database developers, administrators, and managers. This product is a user favorite when it comes to creating and executing queries, developing and debugging procedures, and managing database objects. The recognition comes with three brand-new awards that we were happy and proud to receive from leading industry platforms.

The Leader Winter 2021 award by G2 Crowd
G2 is the most recognized tech reviewing platform in the world, with more than 4 million visitors and over 1 million original reviews. It is the primary place to go for unbiased information about software products and services. Countless businesses rely on G2 reviews when they search for software solutions to integrate into their operations.
Besides receiving [the Leader Winter 2021 award](https://www.g2.com/products/dbforge-studio-for-mysql-2018-12-04/reviews), dbForge Studio for MySQL has entered the Top 10 in three Database Management Systems (DBMS) Software rankings:
Database Management Systems (DBMS) Software
Easiest to Use Database Management Systems (DBMS) Software
Free Database Management Systems (DBMS) Software
G2 calculated these rankings with a comprehensive algorithm based on the satisfaction of real users. The jury evaluated hundreds of DBMS solutions. Ease of use, setup, and administration were essential factors, as were the satisfaction of requirements, the quality of support, and the overall ease of doing business with the solution provider. All of these criteria were highly rated by the users of dbForge Studio for MySQL.

The Best Database Management Software Companies of 2021 by Digital.com
Digital.com is a software reviewing platform that aims to help small businesses and startups find the most efficient database management software for their needs. To compile the latest [Best Database Management Software Companies](https://digital.com/database-management-software/) list, they examined more than 140 software companies worldwide and surveyed the most vital features of their solutions, such as deployment, backup and recovery functionality, and reporting. The reviewers appreciated the speed, simplicity, and safety of copying and moving data from one server to another. They also rated highly the built-in backup and recovery features and the performance reports that monitor and display the availability and integrity of databases.

The Best Value of Fall 2020 by SoftwareSuggest
SoftwareSuggest is a resource that lists, reviews, compares, and provides consultations on B2B software solutions. It considers the numerous functional options of each product to provide comprehensive information about it. The scope of SoftwareSuggest covers nearly every business domain.
According to the results of their latest in-depth investigations, dbForge Studio for MySQL entered the [Best Value category of the SoftwareSuggest Recognition Awards](https://www.softwaresuggest.com/us/dbforge-studio-mysql). For more than 20 years, Devart has been creating and improving innovative software solutions for more than 500,000 users across 120 countries. We appreciate the professional recognition of our products and are grateful to G2, Digital.com, and SoftwareSuggest for placing dbForge Studio for MySQL among the world's best solutions for database-related tasks. The support of our users inspires our team to develop more helpful [database management tools](https://www.devart.com/dbforge/) for database experts globally! If you are one of those experts who work closely with MySQL, we gladly invite you to [get a free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html) of dbForge Studio and see the value it delivers. Stay tuned for further updates!
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) dbForge Studio for MySQL is a DBTA 2019 Finalist By [dbForge Team](https://blog.devart.com/author/dbforge) October 21, 2019
DBTA (Database Trends and Applications) is a magazine delivering news and analysis on data science, big data, information management, and analytics. Every year, it surveys its readers about various database-related software solutions to choose the best offerings in a variety of categories such as ‘Best BI Solution’, ‘Best Cloud Database’, ‘Best Data Analytics Solution’, and others. One of these categories is ‘Best DBA Solution’. It focuses on tools that help DBAs increase their productivity, automate routine tasks, raise code quality, and resolve performance issues.
We are proud to say that dbForge Studio for MySQL [became a finalist](http://www.dbta.com/Editorial/Trends-and-Applications/21-Best-DBA-Solution-133065.aspx) in this category. As DBTA's readers confirmed, our IDE provides robust functionality for MySQL and MariaDB database development, management, and administration. Test it for yourself and evaluate the tool's capabilities by downloading a [free 30-day trial](https://www.devart.com/dbforge/mysql/studio/download.html). You can also [learn more about dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), its features, and how it can help with all sorts of database-related tasks.
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Studio for MySQL v 3.60 – higher performance and new freedom in working with remote servers By [dbForge Team](https://blog.devart.com/author/dbforge) August 12, 2009
The optimized tool helps database developers and webmasters work with remote databases efficiently, at lower cost and greater speed. Devart today announced the release of [dbForge Studio for MySQL v 3.60](https://www.devart.com/dbforge/mysql/studio/), a cutting-edge tool for the administration and development of MySQL databases. The new upgrade addresses common bottlenecks that arise from employing remote MySQL servers while developing, updating, and using modern large databases. dbForge Studio for MySQL v 3.60 offers new freedom to those who work with large schemas and data volumes, especially on remote MySQL servers, for example those serving websites.

Unlimited database connectivity
This is good news for web developers working with remote databases, as remote connections become as simple to use as local ones. dbForge Studio v 3.60 offers totally redesigned HTTP tunneling: you simply upload the tunneling script to the website, with no multi-step procedures to remember as in other tools.
To provide more flexibility when using SSH connections, public key authentication is supported.

Higher performance on large databases
Having passed multiple performance tests on large databases containing all kinds of database objects, including hundreds of tables and thousands of records with various data types, dbForge Studio for MySQL v 3.60 showed a marked performance improvement over the previous version and a clear lead over competing programs. The test results show that dbForge Studio for MySQL v 3.60 combines high performance with accuracy and correctness to ensure high-quality, efficient database management. For example, dbForge Studio v 3.60 performs data comparison 3 times faster and schema export 4 times faster than the previous versions. Another improvement is asynchronous execution of stored procedures, a great benefit when working with slow procedures.

Optimized database connectivity
Compared to the prior version, dbForge Studio v 3.60 reduces network traffic up to several times for such common tasks as opening connections, managing database objects, performing schema and data export, retrieving data from tables, and executing or stopping procedures. This optimization makes it possible to use dbForge Studio v 3.60 even under traffic limitations. [Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) 3.60 and expand your freedom in working with MySQL databases. Use the [dbForge Studio for MySQL Feedback Page](https://www.devart.com/dbforge/mysql/studio/feedback.html) to tell us what you think about the new version. We are looking forward to your comments and suggestions.
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) More Tools Available: dbForge Studio for MySQL, v4.50! By [dbForge Team](https://blog.devart.com/author/dbforge) June 21, 2010
Devart today releases [dbForge Studio for MySQL, v4.50](https://www.devart.com/dbforge/mysql/studio/), a cutting-edge administration tool and development environment for professionals working with MySQL databases. With dbForge Studio, Devart continues its initiative to produce efficient database experiences for everyone in the MySQL world. New features in dbForge Studio for MySQL, v4.50 include:

More freedom for backing up schemas
The Schema Export wizard has been totally redesigned into [Database Backup](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html), enabling users to back up schemas automatically using the Windows task scheduler, save backup options for future use, and view the automatically compiled log file. Besides, old backup files are automatically removed based on date or quantity.

New tool for database developers – Query Profiler
dbForge Studio presents the results of internal [MySQL tools](https://www.devart.com/dbforge/mysql/) such as SHOW PROFILE and EXPLAIN in a convenient, clear GUI. In addition, STATUS variables for the required query are calculated automatically.
Additional benefits:
Plan of the query displayed in a tree view for easy review
Profiling history that can be saved for further analysis
Capability to compare profiling results in two clicks
Capability to print profiling results

Data comparison and synchronization of any databases
Diverse testing and close interaction with database developers, admins, and casual users resulted in a thoughtful redesign and enhancement of the [Data Compare tool](https://www.devart.com/dbforge/mysql/studio/database-synchronization.html). It now compares and synchronizes databases of any size with significant performance improvements. To customize comparison and synchronization, users can apply new options, change the synchronization direction in one click, and quickly filter tables in comparison results. An additional benefit is the generation of accurate comparison reports in HTML and Excel formats.

Advanced query building
[Query Builder](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html), a powerful tool for visual query creation, is now tailored for building complex conditions in a few clicks. This new power rests on the optimized performance of the Selection tab in the expression editor, visual addition of subqueries to any part of the main query, the new Wrap to Subquery option to wrap tables into a subquery, and optimized navigation in the editor, particularly between subqueries.

Quick generation of template SQL scripts for database objects
This new functionality saves time when working with database objects. For example, you can quickly generate template CREATE, DROP, SELECT, INSERT, UPDATE, or DELETE scripts for tables. The option is available in the context menu of Database Explorer and is called ‘Generate Script As’.
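The Query Profiler described above is built on MySQL's own instrumentation (SHOW PROFILE, EXPLAIN). A minimal manual session using those commands looks like this; the table name and query are illustrative:

```sql
-- Enable per-query profiling for the session (MySQL's SHOW PROFILE
-- facility; deprecated in later MySQL versions in favor of the
-- Performance Schema).
SET profiling = 1;

SELECT * FROM orders WHERE total > 100;  -- the query to inspect

SHOW PROFILES;              -- list profiled queries with their ids
SHOW PROFILE FOR QUERY 1;   -- per-stage timings for query id 1

-- The estimated execution plan, without re-running the workload:
EXPLAIN SELECT * FROM orders WHERE total > 100;
```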
Improved schema compare tool
Extended capabilities of the Schema Comparison wizard
New comparison options to ignore certain table options, DEFINER and SQL SECURITY expressions, and default values for columns
Check the benefits yourself: [download dbForge Studio for MySQL, v4.50 now for free](https://www.devart.com/dbforge/mysql/studio/download.html). Tell us what you think about the new version on the [dbForge Studio feedback page](https://www.devart.com/dbforge/mysql/studio/feedback.html). We are looking forward to your comments and suggestions.
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) Connectivity to MariaDB Keeps Extending in dbForge Studio for MySQL By [dbForge Team](https://blog.devart.com/author/dbforge) January 15, 2019
We are thrilled to inform our MySQL users that the fresh new [dbForge Studio for MySQL, v8.1](https://www.devart.com/dbforge/mysql/studio/) has just been rolled out! To ensure that users of dbForge Studio for MySQL can work with the most up-to-date database engines, we keep expanding the connectivity options of our MySQL management tool. This version extends MariaDB connectivity with support for the latest MariaDB 10.4.

Tell Us What You Think
We welcome you to [try the new version](https://www.devart.com/dbforge/mysql/studio/download.html) of dbForge Studio for MySQL and [share your thoughts](https://www.devart.com/dbforge/mysql/studio/feedback.html) about the release with us. This will help us improve the tool in line with your needs!
[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Discover dbForge Studio for Oracle v4.3 with CLI Activation By [dbForge Team](https://blog.devart.com/author/dbforge) January 27, 2021
We are delighted to inform our Oracle users that the new [dbForge Studio for Oracle, v4.3](https://www.devart.com/dbforge/oracle/studio/) has just been released. The major improvements in the new version include an option to activate the tool via the command-line interface and enhanced comparison and synchronization of virtual columns.

CLI Activation
dbForge Studio for Oracle, v4.3 can be activated and deactivated via the command-line interface using the /activate and /deactivate command-line switches.

Improved Comparison and Sync of Virtual Columns
In the new version of dbForge Studio for Oracle, virtual column values are treated as objects that can be easily synchronized, thanks to the object dependencies generated during the schema synchronization process.

Availability
[Download](https://www.devart.com/dbforge/oracle/studio/download.html) the new version of dbForge Studio for Oracle and [share your thoughts](https://www.devart.com/dbforge/oracle/studio/feedback.html) about the product. Your feedback helps us find the right direction for future updates and improve the tools according to the needs of our users.
[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) New Features in dbForge Studio for PostgreSQL v2.3 By [dbForge Team](https://blog.devart.com/author/dbforge) August 12, 2020
We are glad to announce that the new version of [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) has just been released! Here’s what you can find in our updated IDE:

Script Generator Improvements

New options for script generation were added:
- Generate Script as DROP and CREATE SEQUENCE
- Generate Script as DROP and CREATE TABLE
- Generate Script as DROP and CREATE for all Source Objects
- Generate Script as CREATE INDEX

SQL Formatter Improvements

In the new version of dbForge Studio for PostgreSQL, the SQL Formatter was updated to work with the following statements:
- CREATE TRIGGER
- CREATE INDEX
- CREATE SEQUENCE
- CREATE TABLE
- CREATE VIEW
- CREATE MATERIALIZED VIEW
- PROCEDURE/FUNCTION

Query Profiler Improvements

You can now get the plan of any query without actually running it. This can help with graphically analyzing and then improving high-cost queries.

Availability

[Download](https://www.devart.com/dbforge/postgresql/studio/download.html) the new version of dbForge Studio for PostgreSQL and [share your thoughts](https://www.devart.com/dbforge/postgresql/studio/feedback.html) about the product. Your feedback helps us find the right direction for future updates and improve the tools according to the needs of our users.
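Obtaining a plan without executing the query corresponds to PostgreSQL’s plain EXPLAIN (as opposed to EXPLAIN ANALYZE, which actually runs the statement); a quick sketch with an illustrative table name:

```sql
-- Plain EXPLAIN estimates the plan without touching the data.
EXPLAIN
SELECT o.customer_id, sum(o.total)
FROM   orders o              -- illustrative table
GROUP  BY o.customer_id;

-- EXPLAIN ANALYZE, by contrast, executes the query and reports
-- real timings, so it is unsuitable for costly statements.
```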
[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge Studio for SQL Server Awarded G2 Summer 2021 Badges By [dbForge Team](https://blog.devart.com/author/dbforge) July 20,
2021 We are honored and thrilled to announce that dbForge Studio for SQL Server has been recognized as “High Performer” and “Easiest to Do Business With” in the G2 Summer 2021 Reports. G2 (formerly known as G2 Crowd) is one of the most trusted peer-to-peer review platforms, letting businesses obtain reliable information about software solutions in order to make smart, balanced purchasing decisions. After receiving the [G2 High Performer Spring 2021 award](https://blog.devart.com/dbforge-studio-for-sql-server-has-been-awarded-g2-high-performer-spring-2021.html), we have achieved another milestone in the G2 Summer 2021 Reports: dbForge Studio for SQL Server has been named “High Performer” and “Easiest to Do Business With”. These awards show high customer satisfaction with the Studio and the effectiveness of Devart’s business communications. G2 rankings are based exclusively on real customer ratings, and the G2 team verifies each review to make sure it is authentic and unbiased. We at the dbForge team are excited about this positive and encouraging feedback from our customers. This recognition means a lot to us. We are truly grateful to our users for the trust they place in us and our product. This ongoing support inspires us to develop dbForge Studio further and add new features so that it remains the perfect fit for the user. About dbForge Studio for SQL Server dbForge Studio for SQL Server is a feature-rich IDE for SQL Server development, administration, and maintenance. It incorporates a set of tools tailored to suit the specific needs of database developers, DBAs, and data analysts.
We invite you to visit our [G2 Crowd listing](https://www.g2.com/products/dbforge-studio-for-sql-server-2018-12-04/reviews) and check the reviews or download a [free 30-day trial of dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html) and see for yourself how our solution can contribute to driving your teams’ productivity and efficiency.
[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge Studio for SQL Server Earns Crozdesk Happiest Users Award By [dbForge Team](https://blog.devart.com/author/dbforge) December 9, 2021 Hardly had the winter started when [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) received yet another award. According to Crozdesk users, the IDE has everything you might need to set up your SQL development environment and to manage and develop SQL Server databases. About dbForge Studio for SQL Server dbForge Studio for SQL Server is a powerful GUI tool that makes SQL Server administration, development, and management routines easier and a lot more pleasant. The tool can be used by SQL developers and database administrators to perform complex database tasks and speed up almost any database experience. Designing databases, writing SQL code, comparing databases, synchronizing schemas and data, generating meaningful test data: all this is brought together in one IDE for your convenience. Crozdesk Happiest Users Award Crozdesk is a web service that connects buyers and sellers of business software and offers an algorithm to rate different platforms. dbForge Studio currently scores 90/100 in the [Backend/Database category](https://crozdesk.com/software/dbforge-studio-for-sql-server). This score is based on user satisfaction (91/100), press buzz (50/100), recent user trends, and other relevant information on the product gathered from around the web.
dbForge Studio Accomplishments As you might remember, this is not Devart’s first (and, we hope, not last) Crozdesk award. Earlier this year, dbForge Studio for SQL Server earned two other valuable badges: Quality Choice and Trusted Vendor. You will find more information about that achievement in [this blog post](https://blog.devart.com/dbforge-studio-for-sql-server-earns-two-weighty-crozdesk-awards.html). Users’ Feedback We would like to express our sincerest gratitude to our users. Devart wouldn’t be where it is now without your constructive criticism and detailed feedback. If you have something to say about our products, you can easily write and publish a review on platforms such as: [G2](https://www.g2.com/products/dbforge-studio-for-sql-server-2018-12-04/reviews) [TrustRadius](https://www.trustradius.com/products/dbforge-studio-for-sql-server/reviews) [Capterra](https://www.capterra.com/p/196325/dbForge-Studio/) [Software Suggest](https://www.softwaresuggest.com/dbforge-studio-sql-server) In case you have not tried dbForge Studio for SQL Server yet, you are welcome to test a fully functional [30-day trial version](https://www.devart.com/dbforge/sql/studio/download.html). Devart also offers IDEs for [MySQL](https://www.devart.com/dbforge/mysql/studio/), [Oracle](https://www.devart.com/dbforge/oracle/studio/), and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/).
[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge Studio for SQL Server Earns Two Weighty Crozdesk Awards By [dbForge Team](https://blog.devart.com/author/dbforge) September 9, 2021 Another award, another milestone for dbForge Studio for SQL Server! We are thrilled to announce that our flagship solution has received the “Quality Choice” and “Trusted Vendor” badges from Crozdesk. Earlier this year, dbForge Studio for SQL Server achieved an array of well-deserved awards: [Top Rated Backup Software of 2021](https://blog.devart.com/dbforge-studio-awarded-top-rated-backup-software-of-2021-by-softwareworld.html) from SoftwareWorld, [Front Runners 2021 and Top Database Tools of 2021](https://blog.devart.com/dbforge-studio-shortlisted-by-capterra-and-awarded-front-runners-2021-by-software-advice.html) from Capterra, and [High Performer and Easiest to Do Business With](https://blog.devart.com/dbforge-studio-for-sql-server-awarded-g2-summer-2021-badges.html) from G2. And today we are happy to receive recognition from another reputable platform: Crozdesk. About Crozdesk Crozdesk is a well-known, trustworthy business software review and discovery platform connecting software buyers with B2B software products. It helps businesses make balanced decisions and facilitates the search for software. Crozdesk’s Software Awards rank the very best software solutions per category once a year. Our Crozdesk Awards dbForge Studio for SQL Server [earned two badges from Crozdesk](https://crozdesk.com/software/dbforge-studio-for-sql-server): the Quality Choice award for setting itself apart from competing products on the market, and the Trusted Vendor award for high estimated market presence.
Achievements like Crozdesk’s badges show Devart’s dedication to providing high-quality software and services that take the user experience to a whole new level. About dbForge Studio for SQL Server dbForge Studio for SQL Server by Devart is a powerful IDE for Microsoft SQL Server management, administration, development, data reporting, analysis, and a lot more. As a premium all-in-one database development solution, it encompasses all the features required for effective database development and eliminates the need to switch between different tools for different tasks. To Our Users We’d like to take a moment to thank our incredible users for their amazing feedback. We are tirelessly honing our product to perfection, and it’s exciting to see our efforts reflected in your comments, reviews, and votes. We are immensely proud to receive this kind of recognition from you. How to Test-Drive the Studio If you want to learn more about our award-winning product and see how it can help your teams accelerate database development, [download a free fully functional 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) of dbForge Studio for SQL Server from our website.
[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) dbForge Studio for SQL Server Has Been Awarded G2 High Performer Spring 2021 By [dbForge Team](https://blog.devart.com/author/dbforge) April 5, 2021 [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is an all-encompassing IDE designed for efficient development, management, and administration of SQL Server databases. It comprises a set of tools that accelerate and automate a number of related tasks, including data and schema comparison, analysis, and reporting. Today we add another valuable entry to the list of its achievements: [the G2 High Performer Spring 2021 award](https://www.g2.com/products/dbforge-studio-for-sql-server-2018-12-04/reviews). G2 is known as one of the most recognized tech review platforms in the world. More than 4 million visitors use it as a source of unbiased information regarding software products and services, amounting to 1 million original reviews. The G2 rankings are based on real user satisfaction. The evaluated factors include the capabilities and performance of the tools; the simplicity of usage, setup, and administration; and the quality of support and communication with the vendor. The fact that all of these criteria reflect the high opinions of our customers makes us proud of what we do and inspires us to do even more. We are thankful for this ongoing support.
And if you work closely with SQL Server but haven’t tried any of our [database development tools](https://www.devart.com/dbforge/) yet, we gladly offer you a [free 30-day trial of dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html) – just give it a spin and see all of its capabilities for yourself.
[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Studio for SQL Server Is Ready for DevOps Automation By [dbForge Team](https://blog.devart.com/author/dbforge) February 23, 2021 We’re pleased to present dbForge Studio for SQL Server v6.0, packed with new features and major improvements. Database DevOps integration is one of the features most requested by our users, and today we are making it happen.

New and Extended Features
- DevOps Automation – NEW!
- Code Completion
- SQL Formatter
- Execution History
- Data Editor
- Data Export/Import
- Schema Compare
- Data Compare
- Data Generator
- Documenter
- Execution Warnings – NEW!
- Find Invalid Objects – NEW!
- Run Scripts on Multiple Targets – NEW!
- Other minor features and improvements

Database continuous integration has already become an industry-standard practice in database development. It implies the quick integration of database schema, data, and logic changes into application development and provides immediate feedback to developers on any issues that may arise. The integration of database changes used to be an impediment to the full adoption of DevOps in application development, while the existing tools didn’t provide an integrated solution for SQL Server database continuous integration and delivery.
Meeting these industry challenges, the dbForge team rolls out a shiny new version of its flagship database development solution. The release of dbForge Studio for SQL Server v6.0 primarily addresses DevOps automation and is bound to help teams adopt DevOps practices in the best possible way. DevOps Automation Implement database continuous integration and improve performance by automating build and deployment processes with the newly released dbForge Studio for SQL Server v6.0. Code Completion Column Sensitivity Classification Information in the Completion List and Quick Info Column sensitivity classification information is now available in the completion list and quick info, enabling better data safeguarding and visibility. Jump Between CASE and END The CASE expressions used in statements can be quite long, and navigating between their beginnings and ends can be a daunting task. To solve this problem, we have introduced jumping between CASE and END in the new version of dbForge Studio for SQL Server. Jump Between BEGIN TRY/END TRY and BEGIN CATCH/END CATCH When working with large scripts, it is important to be able to quickly navigate between paired keywords in a SQL statement. With dbForge Studio v6.0, you can now jump between BEGIN TRY/END TRY and BEGIN CATCH/END CATCH quickly and easily. Insert Highlight Occurrences Extended to Show a Popup with the Column Name in the VALUES Area In the updated dbForge Studio for SQL Server, you will be prompted with the column name after opening the bracket in the VALUES clause. As you enter values, a hint appears displaying the column name that corresponds to the cursor position in the VALUES clause. This will help you quickly orient yourself when entering column values.
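The kind of paired-keyword nesting this navigation targets looks like the following (illustrative T-SQL; table and column names are made up for the example):

```sql
BEGIN TRY
    SELECT OrderID,            -- illustrative columns
           CASE
               WHEN Status = 1 THEN 'Open'
               WHEN Status = 2 THEN 'Shipped'
               ELSE 'Unknown'
           END AS StatusName   -- jumping pairs this END with its CASE
    FROM dbo.Orders;           -- illustrative table
END TRY
BEGIN CATCH
    -- ...and pairs BEGIN TRY/END TRY with BEGIN CATCH/END CATCH
    THROW;
END CATCH;
```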
Suggest Properties for Built-in Metadata Functions In dbForge Studio for SQL Server v6.0, properties for built-in metadata functions (SERVERPROPERTY, FILEPROPERTY, DATABASEPROPERTYEX, etc.) are now prompted, allowing for full use of the system catalog to find out more about a database.

Support for the following functions is added:
- NEWSEQUENTIALID
- CERTPRIVATEKEY
- CERTENCODED
- PWDENCRYPT
- PWDCOMPARE

Support for the following SQL statements is added:
- GRANT ALTER ANY SECURITY POLICY
- GRANT ALTER ANY SENSITIVITY CLASSIFICATION
- GRANT ALTER ANY DATABASE SCOPED CONFIGURATION
- GRANT ALTER ANY COLUMN MASTER KEY
- GRANT ALTER ANY COLUMN ENCRYPTION KEY
- GRANT EXECUTE ANY EXTERNAL SCRIPT
- GRANT ALTER ANY EXTERNAL DATA SOURCE
- GRANT ALTER ANY EXTERNAL FILE FORMAT
- GRANT ALTER ANY EXTERNAL LANGUAGE
- GRANT ALTER ANY EXTERNAL LIBRARY
- GRANT ADMINISTER DATABASE BULK OPERATIONS

Expanded Support for the PREDICT Function for SQL Server 2019 The Studio now supports the PREDICT T-SQL function, allowing you to generate predicted values or scores based on a stored model.

Expanded support for the following statements is added:
- CREATE USER for Azure SQL Database
- CREATE INDEX for SQL Server 2019
- CREATE EXTERNAL LIBRARY for SQL Server 2019
- ALTER DATABASE for SQL Server 2019
- ALTER TABLE for SQL Server 2019
- ALTER EXTERNAL LIBRARY for SQL Server 2019
- ALTER AVAILABILITY GROUP for SQL Server 2019
- DROP EXTERNAL RESOURCE POOL for SQL Server 2019
- DROP EXTERNAL LIBRARY for SQL Server 2019
- ACCELERATED_DATABASE_RECOVERY in ALTER DATABASE for SQL Server 2019

Temporary Tables Suggestion Is Introduced In the new dbForge Studio for SQL Server, the suggestion list shows temporary tables, their columns, and table variables. OPENJSON Objects Suggestion Is Introduced The Studio can now suggest SQL Server objects when working with the OPENJSON table-valued function.
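For reference, the metadata functions mentioned above take a property name as an argument, which is exactly what the completion list now prompts; a small sketch using standard SQL Server property names:

```sql
-- SERVERPROPERTY and DATABASEPROPERTYEX query the system catalog
-- for instance-level and database-level properties.
SELECT SERVERPROPERTY('ProductVersion')           AS EngineVersion,
       SERVERPROPERTY('Edition')                  AS Edition,
       DATABASEPROPERTYEX(DB_NAME(), 'Recovery')  AS RecoveryModel,
       DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS DbCollation;
```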
Prompting Hint Names for the USE HINT Option The USE HINT query hint argument provides a way to alter the behavior of a single query, letting you guide the query optimizer without elevated credentials and without being a member of the sysadmin server role. Prompting Time Zones in AT TIME ZONE AT TIME ZONE converts an inputdate to the corresponding datetimeoffset value in the target time zone. The dbForge Studio for SQL Server functionality has been extended to suggest time zones when writing SQL queries, helping you handle time zone conversions in your SQL instances. Support for the MIN_ACTIVE_ROWVERSION Function Is Added MIN_ACTIVE_ROWVERSION is a nondeterministic function that returns the lowest active rowversion value in the current database. With dbForge Studio for SQL Server 6.0, you can now benefit from the MIN_ACTIVE_ROWVERSION function. Displaying MS_Description for Azure Objects The MS_Description extended property stores a basic description of an object. It is widely used for documentation and content purposes. Support for MS_Description for Azure objects was one of our clients’ most eagerly awaited features. Prompting Objects in the Context of DBCC SHOW_STATISTICS DBCC SHOW_STATISTICS displays current query optimization statistics for a table or indexed view. This statement is one of the most common database scripts, and the updated dbForge Studio for SQL Server now suggests the objects in the query, substantially accelerating database development. The CREATE/ALTER/DROP EXTERNAL LANGUAGE Statements Are Supported for SQL Server 2019 New Options for the ALTER DATABASE SCOPED CONFIGURATION Statement Are Supported Convert EXEC to Script This feature simplifies debugging by replacing the call to a stored procedure with the stored procedure body: it takes the contents of the stored procedure and replaces the call in your query with them.
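As a reminder of the AT TIME ZONE syntax that the completion now covers (the zone names below are standard Windows time zone IDs, which is what SQL Server expects):

```sql
-- Interpret a naive datetime as UTC, then convert it
-- to a datetimeoffset in Central European time.
SELECT CAST('2021-02-23 12:00:00' AS datetime2)
           AT TIME ZONE 'UTC'                            -- attach the UTC offset
           AT TIME ZONE 'Central European Standard Time' -- convert to the target zone
       AS local_time;
```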
SQL Formatter New Formatting Profiles Added At our users’ request, Devart introduces brand-new formatting profiles that let you tune your SQL code like never before. Now you can not only format your code exactly as you want but also quickly switch to an alternative style or apply formatting only to certain parts of your SQL script if required. Formatting Profile Quick Select The Formatting Profile Quick Select feature allows users to quickly switch between active formatting profiles, as well as edit profiles by opening the Formatting Profiles window. Execution History Export of Execution History to the CSV File Format Is Added With the updated dbForge Studio for SQL Server, you can now export the Query Execution History to a .csv file. Data Editor The Keyboard Shortcut for the Apply Changes Command Is Added To quickly apply changes when working in the Data Editor, press Ctrl + S. The Cached Updates Mode Is Introduced The Cached Updates mode allows you to control data editing within a single database object. For an object with this mode turned on, updates are stored locally on the client side until you click the Apply Changes button. If you do not need the changes to be applied, click the Cancel Changes button. When you close a document in the Cached Updates mode, you cancel all the changes made since the last Apply command. Unified Display of DATE in the Results Grid and Data Viewer Results Grid Data Visualizers Are Added The Data Visualizers feature, tailored to suit the needs of the most demanding SQL developers, allows viewing data in 10 common data formats: Hexadecimal, Text, XML, HTML, Web, Rich Text, PDF, JSON, Image, and Spatial. Export Data from a Grid to the CSV, XML, HTML, and JSON Formats The Copy Data As Feature Is Introduced Data from a cell or an entire table can now be copied from the results grid to a file or the clipboard in any of the available formats (CSV, XML, HTML, JSON).
The Ability to Customize Colors for the Data Viewer JSON View in the Dark Skin Is Added CSV Export Settings Following our users’ requests, we’ve added the possibility to configure data export to CSV files. Now you can tailor the CSV export options to suit your needs; most notably, you can select the delimiter that separates data values and specify the characters that will surround data values. Web View Is Added Switching to the Web View will display the cell contents as a web page in a flash. Data Export/Import The Information About the Export Format Is Added to the Data Export Wizard Header The Output Settings Tab Is Added to the Data Export Wizard To facilitate the data export process and make it more comprehensive, we have added the Output settings tab to the Data Export Wizard. On this tab, you can specify the file location for the data to be exported to. Data Export and Import to Google Sheets Remember the Save an Export Project Checkbox Value Import Wizard Pages Refactoring The Export Settings for the CSV File Format Are Added The same configurable CSV settings, including the delimiter and the characters that surround data values, are now also available in the Data Export Wizard. Schema Compare Text Compare Control Options In the updated dbForge Studio for SQL Server, you can customize the colors of the comparison results output. Append Timestamp to the File Name Is Added A New Progress Window for Schema Comparison and Synchronization Pre- and Post-Script Execution In the updated dbForge Studio for SQL Server, you can add scripts to be executed before or after schema synchronization. You can either import the necessary scripts or enter them directly in the Schema Synchronization Wizard.
Data Compare

A New Progress Window for Data Comparison and Synchronization

Pre- and Post-Script Execution

The newly released dbForge Studio for SQL Server allows adding scripts to be executed before or after data synchronization. You can either import the necessary scripts or enter them directly in the Data Synchronization Wizard. Such scripts serve a number of purposes: copying data from a table that is about to be changed into a temporary table, adding reference data, enabling or disabling triggers, and so on.

Exclude Newly Added Objects

When this new option is enabled, dbForge Studio for SQL Server deploys only those tables and views that are explicitly included in the saved comparison project (*.dcomp). This ensures that objects added to a database after the comparison project was created do not get into synchronization.

Ignore Whitespaces Option

Tolerance Interval Option

Data Generator

A New Progress Window for Data Generation

Documenter

Auto Line Break for Long Headings

In the updated dbForge Studio for SQL Server, long headings now wrap automatically, so the whole name of the document is displayed, giving you more clarity when working with database documentation.

Execution Warnings

The icing on the cake of this release is the newly introduced Execution Warnings functionality. The Studio analyzes potentially dangerous statements (DELETE, DROP, TRUNCATE, and UPDATE) and shows a pop-up alert if a user is about to execute a statement that may cause data loss.

Find Invalid Objects

Another new feature helps you quickly detect and fix invalid objects (for example, objects that reference already dropped objects), which are quite common in database development.
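The idea behind a warning pass like Execution Warnings can be approximated in a few lines: scan a script for statement types that may cause data loss and collect an alert for each. This is a toy keyword scan for illustration only, not the Studio's actual analyzer:

```python
# Statement keywords the release notes list as potentially dangerous.
DANGEROUS = ("DELETE", "DROP", "TRUNCATE", "UPDATE")

def execution_warnings(script):
    """Return a warning for every potentially destructive statement
    in a SQL script (a naive keyword scan, not a real SQL parser)."""
    warnings = []
    for statement in script.split(";"):
        words = statement.strip().split(None, 1)
        if words and words[0].upper() in DANGEROUS:
            warnings.append(f"Possible data loss: {words[0].upper()} statement")
    return warnings

script = "SELECT * FROM t; DELETE FROM t; DROP TABLE t"
for warning in execution_warnings(script):
    print(warning)
```

A real implementation would parse the batch properly (to catch, say, a DELETE with no WHERE clause), but the flow is the same: inspect before execute, then prompt the user.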
Run Scripts on Multiple Targets

The capability to run scripts on multiple targets is one more striking feature of this feature-packed release. Just select databases on the current server and execute a script against them from one query window. You can also choose the mode in which the script is executed for the selected databases: parallel or sequential.

Hungry for More Features and Improvements?

We have been working hard on this version of dbForge Studio for SQL Server so that you can enjoy the improved performance of our product. Here are some more features shipping in this release:

- Generate Script As for DML commands
- Access to SQL Designer from the context menu
- Search for options
- Heroku cloud connectivity support. With the Heroku cloud platform, you get a rich ecosystem of pre-integrated extensions and fully managed services. It lets you roll your code into production quickly and focus on product development rather than managing infrastructure.
- Remember the size of the Column Properties window
- Display the SQL Server version when testing a connection
- New toolbar buttons. To ensure a smooth user experience and ease access to popular functionality, we are adding new buttons to the toolbar: Begin Transaction, Commit, and Rollback.
- DevExpress updated to v20
- New vector HiDPI skins
- A unified product installer

Unlock your DevOps potential with the newly released dbForge Studio for SQL Server! [Download](https://www.devart.com/dbforge/sql/studio/download.html) Studio 6.0 and enjoy a free trial for a whole month.
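The parallel-versus-sequential choice for running one script against several databases can be sketched as follows. This is a generic illustration built around a stand-in `run_script` function (a hypothetical placeholder, not the Studio's API): sequential mode runs targets one after another, while parallel mode uses one worker thread per target.

```python
from concurrent.futures import ThreadPoolExecutor

def run_script(database, script):
    """Stand-in for executing a script against one database; a real
    runner would open a connection and submit the batch instead."""
    return f"{database}: executed {len(script)} chars"

def run_on_targets(databases, script, parallel=False):
    """Run one script against every selected database, either
    sequentially or concurrently, preserving the target order."""
    if parallel:
        with ThreadPoolExecutor(max_workers=len(databases)) as pool:
            return list(pool.map(lambda db: run_script(db, script), databases))
    return [run_script(db, script) for db in databases]

targets = ["Sales", "HR", "Staging"]
print(run_on_targets(targets, "UPDATE ...", parallel=True))
```

Parallel mode finishes faster when targets are independent; sequential mode is safer when a failure on one database should stop the rollout to the rest.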
dbForge Studio for SQL Server is Ready for Work!

By [dbForge Team](https://blog.devart.com/author/dbforge), September 14, 2012

Devart proudly presents [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), a new set of SQL Server tools for database developers and administrators. All users of our previously released solutions (such as SQL Complete, Schema Compare, Data Compare, and Query Builder for SQL Server) can now benefit from having them at hand in a single SQL Server integrated development environment.

Product Editions

Studio for SQL Server comes in four editions, so each user can decide which set of [database tools](https://www.devart.com/dbforge/) meets their requirements best:

- The free Express edition provides basic functionality for database development.
- The Standard edition extends the Express edition's functionality with [T-SQL Debugger](https://www.devart.com/dbforge/sql/studio/tsql-debugger.html), [Query Profiler](https://www.devart.com/dbforge/sql/studio/sql-query-profiler.html), advanced [SQL Editor](https://www.devart.com/dbforge/sql/studio/sql-editor.html) features, and more.
- The Professional edition is a fully featured edition that provides additional functionality.
- The Enterprise edition includes all the functionality of the discontinued Studio for SQL Server.

Not only have we integrated the existing products into one, but we have also added a number of new features.
Major New Features

[Table Designer](https://www.devart.com/dbforge/sql/studio/table-designer.html) that allows you to:

- Set table properties in the visual editors
- Edit the script that creates the table
- Rebuild tables when complex changes are introduced
- Preview changes before modifying a database object

[Database Diagram](https://www.devart.com/dbforge/sql/studio/database-diagram.html) for when you need a quick look at the database structure. It provides:

- Visual editing, easy manipulation, and scaling
- Containers for grouping objects
- Printing of large diagrams
- Virtual connections

T-SQL Debugger, a must-have tool for building server-side logic. Integrated into the stored procedure editor, it lets you start debugging right from the Database Explorer tree.

Query Profiler to locate bottlenecks and optimize slow queries, with many advanced options unavailable in a standard tool.

Security Manager that incorporates visual editors for logins, users, and roles; batch object editing; and more.

Availability

You can give dbForge Studio for SQL Server 3.0 a test drive by downloading the 30-day trial of the Professional edition on the product [download page](https://www.devart.com/dbforge/sql/studio/download.html). Tell us what you think about the new release on the [dbForge Studio for SQL Server feedback page](https://www.devart.com/dbforge/sql/studio/feedback.html). We are looking forward to your comments and suggestions!
dbForge Studio for SQL Server Scored 3 Crozdesk Awards

By
[dbForge Team](https://blog.devart.com/author/dbforge), May 11, 2022

Today we would like to tell you more about the three awards that [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) has recently scored on [Crozdesk](https://crozdesk.com/software/dbforge-studio-for-sql-server), one of the world's top platforms helping companies find and purchase the best business software for project and customer management, marketing, and accounting. The names of these awards are rather indicative of what all dbForge products stand for: Trusted Vendor, Happiest Users, and Quality Choice. But what exactly do these badges mean, and how are they awarded? Let's have a brief look at each.

Trusted Vendor is a badge awarded to vendors with a considerable market presence or market share. This metric is based on an algorithmic estimate of the number of users, calculated by Crozdesk's AI-driven ranking algorithm. No more than 20% of all software solutions can earn this badge.

Happiest Users is a badge that goes hand in hand with consistently high product ratings and reviews. A software product that aspires to this badge must hold an average user rating of at least 4.5/5.0 across a large number of ratings. No more than 10% of all solutions on Crozdesk achieve this.

Quality Choice is a badge earned by achieving a Crozscore (the platform's internal scoring system) of 80/100 or higher. Roughly one third of all software products on Crozdesk reach this level.

But regardless of what other people may say, there is nothing quite like firsthand experience. That's why we gladly invite you to [get your free 30-day trial of dbForge Studio today](https://www.devart.com/dbforge/sql/studio/download.html) and evaluate its capabilities yourself.
dbForge Studio for MySQL on Linux Operating Systems

By [dbForge Team](https://blog.devart.com/author/dbforge), June 22, 2011

Although the dbForge [database tools](https://www.devart.com/dbforge/) were developed only for Windows platforms, our active users (thanks to Tsvetkov) have found a way to use some features of dbForge Studio for MySQL on Linux family operating systems. Tests were run under .NET Framework 2.0 on the Wine emulator.

The following command-line functionality works with no visible issues:

- /backup – backs up a database
- /restore – restores a database
- /datacompare – launches a data comparison
- /datacompare /sync – launches a data synchronization
- /schemacompare – launches a schema comparison
- /schemacompare /sync – launches a schema synchronization
- /dataexport (starting from version 5.0) – exports data
- /dataimport (starting from version 5.0) – imports data
- /execute – executes a script

Testing of the following GUI functionality completed successfully:

- Import Wizard
- Export Wizard
- Backup Wizard
- Restore Wizard
- SQL Formatter Wizard
- Data comparison. We were able to perform the data comparison operation with some limitations: there is no way to view differences between the records of the relevant objects, and the Data Synchronization Wizard doesn't work at all.
- Schema comparison. The Schema Comparison Wizard works properly; the Schema Synchronization Wizard is supported except for the ability to view the synchronization script for an object. However, you can save documents and use them as command-line arguments.
- Stored procedure debugger. Works properly if you disregard some visual artifacts.
The rest of the functionality is associated with tool windows or documents and is mostly blocked because of rendering problems. To avoid issues with tool windows, switch the window status from Docked to Floating and vice versa.

If you are interested in using [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) on Linux, you can follow these installation recommendations:

1. Build Wine with the following options:
   app-emulation/wine-1.3.3 USE="X alsa dbus gecko jpeg ncurses opengl perl png ssl threads truetype win32 xinerama xml (-capi) -cups -custom-cflags (-esd) -fontconfig -gnutls (-gphoto2) -gsm (-hal) -jack -lcms -ldap -mp3 -nas -openal -oss -pulseaudio -samba (-scanner) -test -win64 -xcomposite"
2. Install the .wine files to a new folder (optional, but preferable).
3. Run: sh winetricks gdiplus
   Note: do not use gdiplus from Wine (dbForge crashes with an error on startup).
4. Install dotnet20 (in Fedora 13 the installation is interrupted with an error): sh winetricks dotnet20
5. Run: wine ./dbforgemysqlru.exe (installs with no visible issues).

To further enhance your MySQL experience on Linux, we have also included step-by-step instructions on [how to install MySQL on Linux](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-linux/).
6 COMMENTS

Rui Marques, August 30, 2011, at 11:40 am
Great. Waiting for a fully functional version on Linux. With the current Mono evolution, why not a full port to Mono, enabling Mac, Windows, and Linux users to enjoy this great tool?
Dejan Lekic, April 29, 2014, at 4:27 pm
This is a major stopper for me to buy this product, FYI. As Rui wrote above, I honestly see no reason not to simply port to Mono and provide support for all platforms Mono works on…

frozenjim, August 9, 2015, at 11:48 pm
Given the high percentage of developers who use Linux and Mac as their preferred platform, it seems less profitable to write this type of tool for only the Windows environment.

Andrey Langovoy, August 10, 2015, at 3:47 pm
Hi! Currently we do not have any plans to develop native applications for Mac and Linux. However, you can use dbForge Studio for MySQL on a Mac; for this you need Parallels Desktop for Mac. If you have additional questions, please let us know. Thank you!

M. Weineman, February 23, 2016, at 2:46 pm
It did not work with current Wine versions, which prevents me from buying. I really like dbForge, but I won't pay for a Windows-exclusive program, since I mostly work on my Ubuntu notebook. Sad, but I have to evaluate some other Linux-capable SQL studio.

Ian, April 29, 2018, at 12:28 am
Work requires that I use Linux instead of Windows. dbForge was by far my favorite MySQL editor when I was using Windows, but the inability to run it natively on Linux means I'll need to find a replacement for dbForge Studio. I don't trust the integrity of running very important queries through Wine. Why Windows only?
It seems that a large portion of your target demographic are going to be Linux users…

Comments are closed.

dbForge Studio Shortlisted by Capterra and Awarded Front Runners 2021 by Software Advice

By [dbForge Team](https://blog.devart.com/author/dbforge), July 27, 2021

[dbForge Studio](https://www.devart.com/dbforge/studio/) is a product line designed to encompass every aspect of database development and administration, providing versatile functionality in a single IDE. It is available for all major database management systems: [SQL Server](https://www.devart.com/dbforge/sql/studio/), [MySQL](https://www.devart.com/dbforge/mysql/studio/), [Oracle](https://www.devart.com/dbforge/oracle/studio/), and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/). One way to evaluate its hard-earned reputation is to get acquainted with its accolades, and two of them are the subject of today's story.

Front Runners 2021 by Software Advice

Let's start with Software Advice, a platform that continuously highlights the top-rated database management tools in North America based on real user reviews.
Their list of [Front Runners 2021](https://www.softwareadvice.com/database-management-systems/#top-products) included dbForge Studio, which achieved a noteworthy balance of excellent usability and high customer satisfaction. Top Database Tools of 2021 by Capterra The second award is the inclusion of dbForge Studio in the [2021 Capterra Shortlist](https://www.capterra.com/database-management-software/#shortlist) . After a thorough analysis of all solutions in Capterra’s database software directory, they eliminated those that failed to meet their requirements for functionality and reviews. Finally, they made a shortlist of the most popular and highest-rated solutions — and dbForge Studio was included as one of the Emerging Favorites. This category includes products that were rated highly in terms of user satisfaction, but are not as well-known as the Top Performers. We are grateful to Capterra, Software Advice, and all of our users for this trust and recognition. It is a great incentive to aspire to new heights and keep our products ever relevant. About dbForge Studio dbForge Studio is a comprehensive IDE for database development and administration. It offers a set of varied tools for database developers, DBAs, and data analysts. The most notable functionality includes database design, management of database objects, tools for effective work with SQL queries and smart code autocompletion, test data generation, development and debugging of stored procedures, data analysis, reporting, comparison of database schemas and data, and CLI-based automation of most operations. Read our [success stories](https://www.devart.com/success-story/) to learn more about the experience of our customers with dbForge Studio — or simply [get a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and see its value for yourself. 
dbForge Studios Add Support for a Slew of Cloud Services and Databases

By [dbForge Team](https://blog.devart.com/author/dbforge), January 27, 2023

We've got some good news for you today! The extensive compatibility of our database tools has become even broader with a good number of cloud services that you can now seamlessly use with dbForge Studio for PostgreSQL (one of them is also supported by dbForge Studio for MySQL). So be sure to check the list below; perhaps you're in for a nice surprise.

DigitalOcean Managed Database

First off, both [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) and [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) support DigitalOcean Managed Database, a powerful database cluster service that offers multi-region, scalable managed databases as an alternative to installing, configuring, maintaining, and securing databases by hand.

All of the remaining entries on this list are supported by [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), and we'll start with YugabyteDB.

YugabyteDB

YugabyteDB is a high-performance transactional SQL database designed for cloud-native applications.
It is a cloud-native, distributed SQL database that aims to support all PostgreSQL features.

AWS Babelfish

AWS Babelfish is a solution that enables PostgreSQL to understand queries from applications written for Microsoft SQL Server; namely, it understands the SQL Server wire protocol and T-SQL, Microsoft SQL Server's proprietary SQL dialect. With the help of Babelfish, applications currently running on SQL Server can run directly on PostgreSQL with fewer code changes, which makes migration faster and more cost-effective.

CitusData

Our next stop is Citus, a free and open-source extension of PostgreSQL that enables flexible node-based scaling by distributing data and queries. Additionally, it delivers parallelized performance that speeds up queries, allows using a single database for both transactional and analytical workloads, and can be used to run applications in the cloud with Azure Cosmos DB for PostgreSQL.

Neon Tech

Neon Tech delivers multi-cloud serverless PostgreSQL with on-demand scalability, simple data branching, auto-backups, and bottomless storage, built from the ground up as a fault-tolerant system for the cloud.

ApsaraDB RDS

ApsaraDB RDS is a stable and scalable online relational database service based on the Apsara Distributed File System and high-performance SSDs. It supports the most popular database engines (including PostgreSQL, MySQL, and SQL Server) and offers features such as instance and database management, performance optimization, account management, backup and recovery, monitoring and alerting, and data migration to facilitate database O&M.

PolarDB

Another database now supported by dbForge Studio is PolarDB, a cloud-native relational database compatible with PostgreSQL, MySQL, and Oracle. It combines the performance and availability of traditional enterprise databases with the flexibility and cost-effectiveness of open-source databases.
Generally, PolarDB is designed for business-critical database applications that require good performance, high concurrency, and autoscaling.

AnalyticDB

One more service we would like to mention is AnalyticDB, a real-time data warehousing solution that can process data with high concurrency and low latency. It is designed to perform instant multidimensional analysis and business exploration over huge amounts of data.

bit.io

Last but not least, we have bit.io, another database service that delivers easily shareable and scalable serverless PostgreSQL. The key notion here is simplicity: bit.io aims to help users set up their databases and upload data as quickly as possible and without much effort.

Download dbForge Studio for a Free Trial Today!

Now all users of dbForge Studio for PostgreSQL can check these compatibility options in action. And if you are not using it yet, today is a great day to start! Just [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html) to get acquainted with its capabilities and see how it meets your database development and management needs.

All these platforms and databases are also supported by [dbForge Edge](https://www.devart.com/dbforge/edge/), Devart's newest product designed to handle database-related tasks on four major database management systems. With its support for SQL Server, MySQL, Oracle, and PostgreSQL databases, Edge provides its users with a unified set of software tools to perform all the tasks they need in a single solution. To try out the software, get the [free trial of dbForge Edge](https://www.devart.com/dbforge/edge/download.html) for 30 days and test the full range of features, with no limitations on workload.
Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [MySQL](https://blog.devart.com/tag/mysql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dbforge-team-releases-improved-transaction-log-v2-1.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Team Releases Improved Transaction Log v2.1 By [dbForge Team](https://blog.devart.com/author/dbforge) June 24, 2021

We are pleased to present Transaction Log 2.1, the new version of our advanced tool for reading transaction logs and analyzing the history of data changes.

What’s new in Transaction Log 2.1

Execute Large Script functionality

dbForge Transaction Log 2.1 now allows you to execute large scripts without opening SQL Editor or loading the whole script into memory. When you try to open a large script, you will be prompted to execute it with the help of the Execute Script Wizard. And to make things even better, we are adding support for script execution from the command line.

Command-line activation

With this release, the dbForge team introduces the ability to activate the Transaction Log tool from the command line.

Import and Export Settings feature

To offer you a better user experience, we are adding the Import and Export Settings Wizard, which allows you to export, import, or reset specific categories of the Transaction Log settings.

Start protecting your data with dbForge Transaction Log

[Download](https://www.devart.com/dbforge/sql/transaction-log/download.html) the latest version of the tool and enjoy the new functionality free of charge during a 30-day trial period.
[dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dbforge-tools-for-oracle-6-0-a-myriad-new-options-and-enhancements-to-make-your-daily-work-faster-and-easier.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [Product Release](https://blog.devart.com/category/product-release) [What’s New](https://blog.devart.com/category/whats-new) dbForge Tools for Oracle 6.0: A Myriad New Options and Enhancements to Make Your Daily Work
Faster and Easier By [dbForge Team](https://blog.devart.com/author/dbforge) March 13, 2025

We’ve got another big update coming your way. This time it’s all about [dbForge tools for Oracle](https://www.devart.com/dbforge/oracle/), which have received a huge number of new options, functional enhancements, and subtle design tweaks, all of which work together to make your daily work with Oracle Database exceptionally easy and convenient. Without further ado, let’s see what we’ve got for you today.

Contents
- SQL Document
- Code Completion
- Database Explorer
- Data Editor
- Data Compare
- Schema Compare
- Database Diagrams
- Data Generator
- Pivot Tables & Data Reports
- Script Generation
- Miscellaneous Enhancements
- Devart Academy

SQL Document

We’ll start with a set of new features that will speed up your work with queries in SQL documents. First off, you can now duplicate, remove, and join the current line using handy shortcuts. Next, you can use a new shortcut, Ctrl+/, to comment a selection of code inside a line. You can select the entire current line with a triple click. Three more shortcuts, Ctrl+C, Ctrl+V, and Ctrl+X, will help you copy, paste, and cut the entire current line via the clipboard, respectively; you can do it without selecting any particular text. You can also quickly navigate between matching brackets using Ctrl+F12. Next, we have tweaked highlighting within INSERT statements. Whenever you add a new value to the VALUES clause, the corresponding column that the value must be inserted into is pinpointed and highlighted. And if, for instance, you add an extra value that has no matching column, there will be no highlighting. Next, we have improved the general behavior of syntax check: it now detects errors even more precisely.
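The INSERT-statement highlighting described above is essentially positional matching of VALUES entries to column names; here is a simplified model in Python (illustrative only, not the Studio's actual code):

```python
def match_values_to_columns(columns, values):
    """Pair each VALUES entry with its target column by position.
    Entries without a matching column map to None (no highlight)."""
    pairs = []
    for i, value in enumerate(values):
        column = columns[i] if i < len(columns) else None
        pairs.append((value, column))
    return pairs

# INSERT INTO emp (id, name) VALUES (1, 'Kate', 'extra')
pairs = match_values_to_columns(["id", "name"], ["1", "'Kate'", "'extra'"])
# The extra value pairs with None, so the editor highlights nothing for it.
```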
Finally, we have added support for highlighting and quick navigation between syntax pairs, which include IF … ELSE, BEGIN … END, BEGIN TRY … END TRY, BEGIN CATCH … END CATCH, and CASE WHEN … ELSE … END.

Code Completion

Now let’s proceed to code completion, where we have improved the parsing of UNION and expressions. Next, we have implemented suggestions of non-aggregated columns in GROUP BY statements. With their help, you can quickly add non-aggregated columns after the GROUP BY keyword via a dropdown list, skipping the routine of seeking them out and inserting them manually. Instead, you can either add all the suggested columns from the SELECT list with a single click or add them one by one in the preferred order. We have added a new snippet, ssf, which expands to a SELECT * FROM block. Meanwhile, the equivalent sel snippet remains available, so you can use whichever you like better. Next, for your convenience, we have replaced the Refresh Local Cache and Reset Suggestion Cache buttons with a single Refresh Suggestions button. We’ll conclude this section with newly added suggestions that cover the JSON data type for Oracle 21c as well as the Boolean and Vector data types for Oracle 23ai.

Database Explorer

You might have faced a situation where you have lots of nodes and tables in Database Explorer and need to filter out the stuff you don’t need, keeping only what you’re currently working on. You could always configure filter settings to do that. But now things are even better: you can save these filter settings to a file and reuse them at any given moment by simply loading this file. We have also improved the behavior of quick data retrieval. For instance, you can get data from tables, views, and materialized views by right-clicking them in Database Explorer and selecting Open Data in Editor from the context menu. To configure the behavior of this feature, go to Options > Database Explorer > General > Table and View Default Action.
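The GROUP BY suggestions rest on a simple rule: any SELECT-list item not wrapped in an aggregate function belongs in GROUP BY. A toy, regex-based sketch of that rule (far cruder than the Studio's real SQL parser):

```python
import re

AGGREGATES = ("SUM", "COUNT", "AVG", "MIN", "MAX")

def group_by_candidates(select_list):
    """Return SELECT-list items that are not wrapped in an
    aggregate function and therefore belong in GROUP BY."""
    candidates = []
    for item in (s.strip() for s in select_list.split(",")):
        func = re.match(r"(\w+)\s*\(", item)
        if not (func and func.group(1).upper() in AGGREGATES):
            candidates.append(item)
    return candidates

# SELECT deptno, job, COUNT(*) FROM emp GROUP BY ...
print(group_by_candidates("deptno, job, COUNT(*)"))  # ['deptno', 'job']
```

A real implementation works on a parse tree rather than comma-splitting, but the suggestion logic it feeds is the same.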
Data Editor

To make filtering in Data Editor easier and more flexible, we have replaced the Custom AutoFilter window with a far more advanced and convenient Filter Editor window. You also have the option to set the value of a cell to a unique identifier by selecting Set Value To > Unique Identifier from the context menu.

Data Compare

In the New Data Comparison wizard, we have added a new option to ignore computed columns, available in the Auto Mapping section of the Options page. Next, we have made automated generation of comparison reports much more convenient with the familiar Save Command Line button, accessible directly in the Comparison Report wizard. And if you take a look at the Command line execution file settings dialog, you will see plenty of new customization options; you can scrutinize them in the following screenshot. In the data diff grid, you can quickly hide empty columns (both source and target ones) with a handy button. The Data Synchronization Wizard now includes a new page called Issues; on this page, you can configure the default behavior in case of potential NULL/NOT NULL conflicts that may occur during the synchronization. Finally, there is a new option called Ignore internal spaces. When this option is selected, spaces, tabs, and other non-printable characters in the middle of a string are ignored during the comparison. The option can be applied to (N)CHAR, (N)VARCHAR2, (N)CLOB, and LONG columns.

Schema Compare

Schema Compare also delivers a few useful enhancements. First, we’ve added an option called Show Ignored Differences, which can be enabled by selecting the corresponding checkbox in Options > Schema Comparison > General. Once you enable this option, the application will highlight possible differences in DDL. As in Data Compare, you can use the Save Command Line button in the Comparison Report wizard to create a command-line script for recurring reporting operations.
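One way to picture the Ignore internal spaces option is as a comparison that first discards whitespace inside the compared strings; a rough Python model (an approximation of the behavior, not dbForge's code; the actual option also covers other non-printable characters):

```python
import re

def equal_ignoring_internal_spaces(a, b):
    """Model of the option: runs of whitespace (spaces, tabs, etc.)
    inside a string do not affect the comparison result."""
    strip_ws = lambda s: re.sub(r"\s+", "", s)
    return strip_ws(a) == strip_ws(b)

equal_ignoring_internal_spaces("John  Smith", "John\tSmith")  # True
equal_ignoring_internal_spaces("John Smith", "Jane Smith")    # False
```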
The Command line execution file settings dialog has plenty of new options to offer here as well. Other notable additions include a highly convenient visual Object Filter, which allows excluding objects from the synchronization according to custom criteria. You can save custom filters to files for further convenience; moreover, you can apply filtering from the command line. Finally, when checking differences in a schema comparison document, you can quickly proceed to each subsequent difference with a handy shortcut, Alt+↓.

Database Diagrams

We have updated the design of database diagrams and implemented a few new tricks to make your work with them easier. For instance, table and view blocks are now semi-transparent to help you better see all the relationships between them. The same goes for containers; what’s more, you can specify the required opacity from the context menu. Next, you get a new option called Select All Relations, which does what it says: it selects all relations on your diagram, including virtual relations and foreign keys. Another new option is Clear Waypoints, which eliminates all waypoints that have been manually created for a selected relation. Finally, the update introduces a linear zoom factor and overhauled diagram skins for further convenience.

Data Generator

We’ve got a few new tricks in Data Generator as well. For instance, you have a brand-new JSON generator. Another new feature was based on an idea put forth by one of our users: we have expanded the list of masks by implementing support for FName and LName (the first name and last name, respectively).
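The new FName and LName masks draw values from name dictionaries; a minimal Python stand-in (the sample name lists and function names are placeholders, not Devart's built-in dictionaries):

```python
import random

FIRST_NAMES = ["Alice", "Bob", "Carol", "David"]
LAST_NAMES = ["Johnson", "Lee", "Martinez", "Nguyen"]

MASKS = {
    "FName": lambda rng: rng.choice(FIRST_NAMES),
    "LName": lambda rng: rng.choice(LAST_NAMES),
}

def generate_rows(mask_names, count, seed=0):
    """Generate `count` rows, one value per requested mask."""
    rng = random.Random(seed)  # seeded for reproducible test data
    return [tuple(MASKS[m](rng) for m in mask_names) for _ in range(count)]

rows = generate_rows(["FName", "LName"], 3)
```

Seeding the generator makes the output reproducible, which is handy when the same test dataset must be regenerated later.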
Next, when you need to filter out tables and columns during population, you can use the standard set of shortcuts:
- Ctrl+A – select all
- Ctrl+← – move back word by word
- Ctrl+→ – move forward word by word
- Ctrl+Shift+← – select the entered text word by word (back)
- Ctrl+Shift+→ – select the entered text word by word (forward)
- Ctrl+Backspace – delete an entire word

Finally, in Data Population Wizard, you get an option to append a timestamp to the name of the file that you save your data population script to.

Pivot Tables & Data Reports

Following another user request, we have upgraded Chart Designer to a new version that is more advanced and easy on the eye. Additionally, we have adapted Data Report Wizard for 4K resolution.

Script Generation

Now, a bit more on script generation. First off, we’ve implemented a new option to enclose identifiers within square brackets. To enable it, go to Options > Generate Scripts > General and find it in the Common group. Additionally, you have a new option to include DML triggers in your scripts via Options > Generate Schema Script > General.

Miscellaneous Enhancements

Our last stop for today is a multitude of miscellaneous enhancements that will make your experience with our tools as smooth as ever. First comes the newly implemented, more advanced algorithm for creating your database projects. Next, if you go to Options > Environment > Tabs and Windows, you will encounter a few new customization options, which include Tab layout, Show tabs in multiple rows, and Close tabs with middle-click. From the same Tabs and Windows page, you can specify the order of opening new tabs: if you would prefer to insert new tabs to the right of the existing ones, simply select the corresponding option checkbox. In Query Profiler, we’ve improved the display of estimated and actual query execution plans to help you analyze and work with them more effectively. Next, we have redesigned Query History.
It delivers a new toolbar that includes an updated range selection and a handy Clear button that clears the history right away. Note that you can also export history to a CSV file directly from the toolbar. Whatever [dbForge tool for Oracle](https://www.devart.com/dbforge/oracle/) you’re using, you can export data directly from the grid to CSV: just select the required range in the grid and proceed to Copy Data As > CSV from the context menu. Now all commercial editions of dbForge tools for Oracle feature the comprehensive, unlimited functionality of the integrated SQL Formatter. Next, you get an improved search-and-filtering algorithm for the options available in formatting profiles. We have also implemented search for option names in Tools > Options (as opposed to the previously available search for sections only). Another useful option allows you to drop the destination object during object duplication. Next, you can easily sort your snippets by shortcut name in Snippets Manager. We have also optimized the installation process: to install 32-bit executable modules alongside the Studio, you only need to select the corresponding checkbox during the installation. And finally, from now on, you can work with database snapshots in the Express Edition of dbForge Studio for Oracle.

Devart Academy

We will conclude our journey with a link to [Reinventing Oracle Database Management With dbForge Studio](https://www.devart.com/academy/oracle-studio/), a Devart Academy course that will help you master the subtleties of Oracle Database by means of the Studio. You can now access it directly from the Studio via Help > Demos and Video Tutorials.

Get the updated dbForge tools for Oracle 6.0 right now!

When it comes to developing and managing Oracle databases, our key product is [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), which basically combines everything you might need for your daily work.
You can give the updated Studio a go right away by [downloading it for a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html). And if you are already using the Studio, the update is already waiting for you. Once you try it, please let us know what you think; we’d be glad to hear your feedback. Also note that the updated dbForge Studio for Oracle is available as part of [dbForge Edge](https://www.devart.com/dbforge/edge/), our multidatabase solution that covers a number of other database systems (including SQL Server, MySQL, MariaDB, and PostgreSQL) alongside a rich variety of cloud services. dbForge Edge consists of four feature-rich IDEs that are a perfect fit for beginners and power users alike. And you don’t need to take our word for it: simply [download dbForge Edge for a free 30-day trial](https://www.devart.com/dbforge/edge/download.html) and see for yourself.
Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbforge studio](https://blog.devart.com/tag/dbforge-studio) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dbforge-tools-for-postgresql-got-a-new-update.html", "product_name": "Unknown", "content_type": "Blog", "content": "[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Tools for PostgreSQL Got a New Update!
By [dbForge Team](https://blog.devart.com/author/dbforge) May 24, 2023

Coming on the heels of our [recent big update of dbForge tools for MySQL](https://blog.devart.com/watch-out-for-a-big-new-update-of-dbforge-tools-for-mysql.html), we’d love to share some more good news, and this time it’s all about our product line for PostgreSQL databases, which comprises three tools: [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/), and [dbForge Schema Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/schemacompare/). Today’s update introduces several enhancements that will surely come in handy in your daily work.

CONTENTS
- Newly added support for Aiven Data Cloud
- Newly added support for EDB BigAnimal
- Newly added support for SHA-256/512
- Newly added support for Transport Layer Security (TLS) 1.3
- An updated collection of predefined code snippets
- Automatic highlighting of syntax pairs
- Saved filters for Database Explorer folders
- User-defined column layout in Query History
- A handful of new options in ‘Tabs and Windows’
- Newly added support for the UNLOGGED keyword in the CREATE UNLOGGED SEQUENCE command
- Optimized row count retrieval in Data Generator
- Improved behavior of the ‘Disable DML trigger’ option in Data Population Wizard

Newly added support for Aiven Data Cloud

First of all, let’s talk about expanded compatibility, with one more cloud data platform added to the list. It’s called Aiven Data Cloud, and it provides a comprehensive set of popular open-source database and streaming solutions as a managed service. And now you can access it using our tools.
Newly added support for EDB BigAnimal

We have also covered EDB BigAnimal, a fully managed PostgreSQL cloud database that helps businesses get rid of database management overhead, rapidly migrate from Oracle to PostgreSQL (if necessary), and deploy their databases on multiple clouds across multiple regions.

Newly added support for SHA-256/512

Next, we’ve upgraded the built-in SSH compatibility of our PostgreSQL tools and added support for SHA-256/512. From now on, these stronger, more modern hash algorithms are at your service.

Newly added support for Transport Layer Security (TLS) 1.3

We’ve also added support for the latest version of the Transport Layer Security cryptographic protocol, version 1.3. You can now specify it on the Security tab of the Database Connection Properties window when connecting to the required database with our tools.

An updated collection of predefined code snippets

Now let’s move on to functional enhancements. First of all, we have expanded the integrated collection of predefined code snippets. You now have the 60 most useful snippets always at hand, plus you can create your own custom snippets and add them to your collection.

Automatic highlighting of syntax pairs

Who doesn’t love small yet highly useful enhancements that make daily work easier? We’ve got a few of those for you today. First off, from now on, as you type SQL code, syntax pairs are highlighted automatically.

Saved filters for Database Explorer folders

Also, for your convenience, we’ve added the ability to save your filters for Database Explorer folders.

User-defined column layout in Query History

The following enhancement was suggested by one of our users, and we thought you all would find it useful: the opportunity to keep the user-defined column layout in Query History. This includes the actual column layout, grouped and sorted items, and applied filters.
That said, you can now configure it all just once, and it will stay intact each time you go to Query History. Big thanks for the idea!

A handful of new options in ‘Tabs and Windows’

If you go to Options > Environment > Tabs and Windows, you will encounter a few new options, which include Tab layout, Show tabs in multiple rows, and Close tabs with middle-click.

Newly added support for the UNLOGGED keyword in the CREATE UNLOGGED SEQUENCE command

The title of this new feature basically says it all, so we’ll just illustrate it with the screenshot below.

Optimized row count retrieval in Data Generator

Now let’s move on to the integrated Data Generator, where we have optimized the row count retrieval that occurs while describing the data to be generated. Retrieval is now performed only by the generators that require it.

Improved behavior of the ‘Disable DML trigger’ option in Data Population Wizard

Our last stop is Data Population Wizard, where you can configure population options for newly generated data. Previously, the Disable DML trigger option was available only for triggers created on tables. Now, for your convenience, it is available for triggers on views as well.

Get the updated dbForge tools today!

If you are an active user of our [PostgreSQL tools](https://www.devart.com/dbforge/postgresql/), you can update them, as usual, via Help > Check for Updates. And if you are here for the first time, we gladly invite you to join in and see all of our PostgreSQL tools in action during a free 30-day trial, which is a nice way to learn all about their capabilities and overall performance.
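Returning to the TLS 1.3 support mentioned earlier: in a generic client, pinning a connection to that protocol version looks roughly like this (a sketch using Python's standard ssl module, unrelated to dbForge's own networking code):

```python
import ssl

# Build a client-side context that refuses anything below TLS 1.3.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

# Any socket wrapped with this context will negotiate TLS 1.3
# or fail the handshake, which is what selecting version 1.3
# on a connection's Security tab amounts to.
```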
And in case you are looking for an integrated solution that encompasses maximum features across a variety of database management systems and cloud services, including Microsoft SQL Server, MySQL, MariaDB, Oracle, Amazon Redshift, and PostgreSQL, we suggest you try out [dbForge Edge](https://www.devart.com/dbforge/edge/), a bundle of four IDEs that will satisfy even the most demanding functional requirements and make you productive from day one.
Tags [dbForge Data Compare for PostgreSQL](https://blog.devart.com/tag/dbforge-data-compare-for-postgresql) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [postgresql tools](https://blog.devart.com/tag/postgresql-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dbforge-tools-for-sql-server-7-0-big-release-overview.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Product Release](https://blog.devart.com/category/product-release) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Tools for SQL Server 7.0: Big Release Overview By [dbForge Team](https://blog.devart.com/author/dbforge) September 6, 2024

Are you a user of [dbForge tools for SQL Server](https://www.devart.com/dbforge/sql/)? If you are, then we’ve got a great slice of news for you: our entire product line for SQL Server has just been updated, and you’re welcome to get your update right now. If you are not, you may still want to take a look at all the goodies that come with this release; our tools come with a hefty free trial, and we gladly invite you to give them a go. Enough with the intros, let’s get started!

Contents
- T-SQL Code Analyzer
- Connectivity
- DevOps & CLI Automation
- Code Completion
- Schema Compare
- Source Control
- Database Diagrams
- SQL Query History
- Documents
- Database Explorer
- Data Generator
- Script Generation
- Data Editor
- Pivot Tables
- Find Invalid Objects
- Search
- Index Manager
- Application Startup Time

T-SQL Code Analyzer

First and foremost, we’d love to tell you about T-SQL Code Analyzer, a comprehensive tool that helps developers and DBAs scrutinize and optimize T-SQL scripts, making sure they conform to precisely defined rules, guidelines, and best practices.
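Conceptually, a static analyzer of this kind runs rule functions over the script text and collects warnings; a toy rule in Python (an illustration of the idea, not one of the Analyzer's actual rules) might flag SELECT *:

```python
import re

def check_select_star(script):
    """Toy static-analysis rule: flag SELECT * usage, which hides
    schema changes and often retrieves unneeded columns."""
    warnings = []
    for lineno, line in enumerate(script.splitlines(), start=1):
        if re.search(r"\bSELECT\s+\*", line, re.IGNORECASE):
            warnings.append((lineno, "Avoid SELECT *; list columns explicitly"))
    return warnings

script = "SELECT id FROM t1;\nselect * from t2;"
print(check_select_star(script))  # [(2, 'Avoid SELECT *; list columns explicitly')]
```

A production analyzer parses the T-SQL into a syntax tree instead of scanning lines, but the rule-plus-report structure is the same.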
T-SQL Code Analyzer is available in [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) (updated) and [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) (newly introduced). You should definitely check it out if you are seeking to improve your T-SQL coding practices and optimize database performance. With the help of the Analyzer, you will easily identify potential issues and performance bottlenecks in your T-SQL scripts; things like inefficient queries and missing indexes will no longer stand in your way.

Connectivity

As you might know, Azure Active Directory has been renamed Microsoft Entra ID. This change is now fully reflected in the interface of our tools, namely in the Database Connection Properties dialog and in Options. We’ve also expanded the compatibility of our products to include native support for ApsaraDB, so there will be no more workarounds or challenges when integrating dbForge tools into your work with databases hosted on Alibaba Cloud.

DevOps & CLI Automation

Next, we have good news for Atlassian Bamboo users. If you have migrated from Bamboo Server to Bamboo Data Center, or if you’re right in the midst of this process, you ought to know that [dbForge DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) (available as part of the [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle) is now fully compatible with Data Center, and you can safely build your CI/CD pipelines around it. We’ve also got a few new automation features worth your attention. The first of these, available in [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), is Find Invalid Objects, which does exactly what it says on the tin: it helps identify and manage invalid database objects in the most efficient way. Now it’s been powered up with CLI automation to reduce manual intervention and thus help you get your job done even faster.
Simply run the script against the required database and get the list of invalid objects that hinder its performance. If you’re a user of [dbForge Index Manager](https://www.devart.com/dbforge/sql/index-manager/), you’ll be glad to know that we’ve reduced your routine index defragmentation to just a few clicks. You only need to configure it in Options once, autogenerate a reusable script via Save Command Line, and simply run it from the command line whenever you need to take care of your indexes. Check the following screenshot to see how fast and easy it is. Last but not least, you can now generate comparison reports from the command line in [dbForge Data Compare](https://www.devart.com/dbforge/sql/datacompare/), [dbForge Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/), and [dbForge Studio](https://www.devart.com/dbforge/sql/studio/). Similarly to the previous case, you can configure them, generate a comparison script, and run it whenever you want.

Code Completion

Our next stop is a set of handy code prompting enhancements. The first one comprises the newly supported suggestions of non-aggregated columns in GROUP BY statements. With their help, you can quickly add non-aggregated columns after the GROUP BY keyword via a dropdown list, skipping the routine of seeking them out and inserting them manually. Instead, you can either add all the suggested columns from the SELECT list with a single click or add them one by one in the preferred order, as fast and convenient as it can get. Next, you can view the MS_Description property for the database you’re working with. What if you are using graph databases? Then we’ve got a way to simplify your work by adding support for the entire variety of T-SQL graph functions:
- EDGE_ID_FROM_PARTS
- GRAPH_ID_FROM_EDGE_ID
- GRAPH_ID_FROM_NODE_ID
- NODE_ID_FROM_PARTS
- OBJECT_ID_FROM_EDGE_ID
- OBJECT_ID_FROM_NODE_ID

This is what it looks like. Next, we have added support for the COLUMN MASTER KEY server object.
There is a handful of other newly supported items, including:

- The TERTIARY_WEIGHTS function
- The RSA_OAEP algorithm in CREATE COLUMN ENCRYPTION KEY statements
- The PERSISTENT_LOG_BUFFER construct in CREATE DATABASE statements
- The AVAILABILITY GROUP construct (extended support)

Finally, we have implemented suggestions for implicit procedure execution, that is, stored procedure calls that omit the EXEC or EXECUTE keyword.

Schema Compare

Now let's proceed to the new features and enhancements awaiting you in Schema Compare. First, we have added an option called Show Ignored Differences, which can be enabled by selecting the corresponding checkbox in Options > Schema Comparison > General. Once enabled, the application highlights possible differences in DDL. When reviewing differences in a schema comparison document, you can now jump to each subsequent difference with a handy shortcut, ALT+↓. Next, we have introduced support for the ADD SENSITIVITY CLASSIFICATION command, which lets you tag columns based on data sensitivity levels and information types. That is good news for data security and compliance; you no longer need to detect and classify sensitive data manually. Both the Schema and Data Synchronization Wizards now include a new page called Issues, where you can configure the default behavior for potential NULL/NOT NULL conflicts that may occur during synchronization. Other new features related to Schema Compare include:

- A new index option, STATISTICS_INCREMENTAL
- A new group of options called Sequences, comprising Ignore START WITH in sequences and Ignore MIN VALUE in sequences
- New comparison options: Ignore MIN VALUE, Ignore START WITH, Ignore CYCLE, Ignore INCREMENT BY, and Ignore CACHE

Source Control

Let's move on to a few useful enhancements we have prepared for you in Source Control. First, we have implemented support for XML and HASH indexes for Azure.
Second, you can now freely use the PERSISTED construct for table variables. Third, you have a handy Hide empty columns button in the static data diff grid.

Database Diagrams

If you are an avid user of database diagrams, you will be pleasantly surprised by their updated design, featuring new element skins, opacity controls, and a linear zoom factor. Your diagrams will now be more visually engaging and much easier to examine. For your convenience, we have also added handy options like Select All Relations, which selects all relations on your diagram, and Clear Waypoints, which removes all waypoints that were manually created for a selected relation.

SQL Query History

Now a few words about SQL Query History, where we have tweaked the interface a bit: the toolbar has been redesigned with an updated date range selection and a handy new Clear button that frees your storage by clearing the history. Additionally, following user requests, a reopened Studio now keeps the user-defined column layout in SQL Query History.

Documents

Your work with documents in the Studio should be just as convenient, so we have added some options to help you arrange your documents to your preferences. If you go to Options > Environment > Tabs and Windows, you will find a few new customization options, including Tab layout, Show tabs in multiple rows, and Close tabs with middle-click. Next, if you right-click a table and select Open Data in Editor from the shortcut menu, the contents of the table open in a document tab named after the table. Finally, there is a handy new shortcut, CTRL+SHIFT+/, to instantly comment a selection of code within a line.

Database Explorer

What do we have for you in Database Explorer?
Well, first of all, suppose you have lots of nodes and tables in the Explorer and need to filter out what you don't need, keeping only what you are working on. You could always configure filter settings for that; now things are even better, as you can save these filter settings to a file and reuse them at any moment by loading that file. Also, when you duplicate a database object, you will see a new option called Drop destination object. When it is turned on, an object with the same name in the target schema is dropped before duplication. Next, since Search Property Lists are now fully supported by Azure SQL Database, you have them in Database Explorer as well.

Data Generator

Let's see what's new in Data Generator. First off, the Data Population Wizard has a new option that appends a timestamp to the name of the file you save your data population script to. Second, when you look for the required tables and columns via the search box, you can use the following shortcuts:

- CTRL+A – select all text in the search box
- CTRL+← – go to the previous word
- CTRL+→ – go to the next word
- CTRL+SHIFT+← – select the previous word
- CTRL+SHIFT+→ – select the next word
- CTRL+BACKSPACE – delete the previous word

Script Generation

Now, a bit more on script generation. First off, the Studio automatically wraps CRUD templates into named regions. Next, you have a new option to include DML triggers in your scripts via Options > Generate Scripts > General. Similarly, you can select a checkbox on the same list to include security permissions. Finally, we have added a third option to include authorization in your scripts.

Data Editor

In Data Editor, you have a new option to quickly set a cell value to a unique identifier from the shortcut menu.
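In T-SQL terms, the new Data Editor option corresponds to assigning a freshly generated GUID to a UNIQUEIDENTIFIER column. A hypothetical equivalent of what the shortcut menu does for a single cell might look like this (the table and column names are made up for illustration):

```sql
-- Hypothetical table/column: set one cell to a newly generated
-- unique identifier, as the Data Editor shortcut menu option does.
UPDATE dbo.Orders
SET TrackingGuid = NEWID()
WHERE OrderID = 42;
```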
Pivot Tables

Following user requests, we have upgraded our Chart Designer to a newer, more advanced version with an improved appearance.

Find Invalid Objects

When browsing databases in Find Invalid Objects, you can quickly Check All or Uncheck All databases on the list with the corresponding new buttons.

Search

In Search, we have added a button to export your search results to CSV.

Index Manager

In Index Manager, we have implemented our standard shortcut menu that helps you quickly arrange, sort, group, and filter entries in the grid.

Application Startup Time

The final new feature for today: besides the startup timestamp, you can now check the actual time it took the Studio to start.

Get the updated dbForge Studio for SQL Server, dbForge SQL Tools, and dbForge Edge today!

Although this post mostly focuses on [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) as our flagship all-in-one IDE, some of the described new features and enhancements also appear in the corresponding apps and SSMS add-ins from our [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle. So if you are looking to augment SSMS without switching to an alternative product, the updated SQL Tools might be just what you need, with a 30-day trial to get firsthand experience of their power. And if you are already using SQL Tools, the update is already waiting for you. We will be glad to hear your feedback and feature suggestions. One last thing worth mentioning: you can get the updated dbForge Studio for SQL Server as part of [dbForge Edge](https://www.devart.com/dbforge/edge/), our multidatabase bundle that covers SQL Server, MySQL, MariaDB, Oracle, PostgreSQL, and a slew of other databases and cloud services with four definitive Studios that make your daily database management easy and comfortable.
Don't take our word for it: [get dbForge Edge for a free 30-day trial](https://www.devart.com/dbforge/edge/download.html) and see for yourself. "}

{"url": "https://blog.devart.com/dbforge-tools-for-sql-server-get-even-more-powerful-with-a-new-update.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Tools for SQL Server Get Even More Powerful With a New Update By [dbForge Team](https://blog.devart.com/author/dbforge) December 30, 2022 The dbForge team is thrilled to announce the latest release of its dbForge tools for SQL Server product line. The New Year is almost here, and we have a present for you: the entire dbForge for SQL Server product line has been updated to meet the latest market requirements. The most noteworthy things about this release are that all the add-ins now support SSMS 19 Preview 4, and the Studio can now be run not only on Windows but also on Linux and macOS. But let's start from the beginning!

What's new in dbForge SQL Complete v6.14

Support for SSMS 19 Preview 4

At Devart, we always try to keep a finger on the pulse and deliver tools that meet users' needs and expectations. SQL Complete now supports the latest preview release of SSMS, SSMS 19 Preview 4, which was rolled out on December 15, 2022.

Column suggestions in the ORDER clause of the CREATE CLUSTERED INDEX statement

We have added suggestions of columns in the ORDER clause for the table specified after the ON operator in the CREATE CLUSTERED INDEX statement.

What's new in dbForge Studio for SQL Server v6.4

Support for macOS & Linux via CrossOver

With the latest release, dbForge Studio for SQL Server got a significant update in terms of OS compatibility: the application can now be installed and run on macOS and Linux via CrossOver.
This long-awaited enhancement allows those who prefer Linux and macOS to unlock the full potential of the Studio. We sincerely hope the journey will be pleasant.

Improved code completion for MFA connections

We enhanced the Studio's code completion behavior for Multi-Factor Authentication connections, so our users can be even more efficient and productive.

Optimized system resource consumption when preparing Quick Info and Code Completion hints

We reduced the system resources required to deliver the Quick Info and Code Completion hints, making your work with the Studio's Text Editor more cost-effective.

Ctrl+C/Ctrl+X now copies the entire line when nothing is selected

To make the user experience with our application even smoother and the transition from SSMS to the Studio easier, we introduced this SSMS-like feature. Now, if you press Ctrl+C or Ctrl+X in the Text Editor and no text is selected, the entire line is copied to the clipboard.

Column suggestions in the ORDER clause of the CREATE CLUSTERED INDEX statement

Now, in the ORDER clause, the Studio suggests columns for the table specified after the ON operator in the CREATE CLUSTERED INDEX statement.

How to get the latest release

We worked hard to make these updates available to you this year. You can get the newly released tools in the following packages. [Download a 14-day free trial of dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and get a powerful code completion and code formatting add-in for SSMS and VS. [Download a 30-day free trial of dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and get an all-in-one IDE for all possible SQL Server-related tasks.
[Download a 30-day free trial of dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/download.html) and get two robust tools for comparing SQL Server databases: Schema Compare and Data Compare. [Download a 30-day free trial of dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) and get the ultimate SQL Server developer toolkit that contains 15 essential tools. Those who already use our tools can upgrade by clicking Help and then selecting Check for Updates on the menu. "}

{"url": "https://blog.devart.com/dbforge-tools-for-sql-server-scored-three-more-g2-badges.html", "product_name": "Unknown", "content_type": "Blog", "content": "dbForge Tools for SQL Server Scored Three More G2 Badges By [dbForge Team](https://blog.devart.com/author/dbforge) March 28, 2022 We have got yet another achievement to tell you about! [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) have been awarded the High Performer Spring 2022 badges on G2, the world's largest tech marketplace for business software. Additionally, dbForge Studio for SQL Server has been included by G2 in their Small-Business Grid Report as a High Performer. It is very encouraging to know that users rank dbForge tools among the best they have ever used; here is some evidence to back it up, several random excerpts from reviews of SQL Complete:

“This is a must-have tool for SQL Server developers. It covers all the bases from automatically suggesting objects to code suggestions, making it easier to code your linked servers.
I particularly like the ability to see object info at a glance.”

“The tool is so well thought of and highly engineered, it’s a pleasure to use, and it improves your productivity by a significant margin.”

“The features of SQL Complete completely changed my workflow. The formatting, snippets, and color coding saves me hours every week.”

“Love using the execution history to find recent SQL statements executed for customer support or system installations. With this feature, I can easily continue where I left and don’t have to rewrite stuff.”

“Thoughtfully designed and easy to use, yet highly sophisticated in its capabilities.”

And now a few excerpts about dbForge Studio:

“By far the best tool for querying and working with results. Database import/export and comparison has save hundreds of hours of work.”

“I primarily need dbForge Studio to help migrate data between environments and it is fantastic for this purpose. I have tried other tools and this is by far the best that I’ve found for this.”

“Firstly, I like its code completion and navigation features. Very handy. Initially, you took them for granted, but all the difference is felt only when you return to SSMS. Secondly, the solution has almost everything I need, like a Swiss knife – schema and data synchronization, database comparison, backup and restoration. And thirdly, the tool supports many data formats, for example, you can format cell content as JSON, XML, HTML, and if it contains an image, you can see the image in the data viewer window. Again, very handy.”

“The query profiling is also great as it keeps track of changes and previous runs ensuring you can visually see changes as you optimize or alter a given query, which is simply fantastic.”

“It’s a complete solution. I don’t need anything else.”

A few words about dbForge SQL Complete

Aren’t you using [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) yet? Then let us give you a brief overview.
This is an advanced SSMS add-in that boosts your coding speed, helps you beautify and refactor your SQL code, and delivers productivity enhancements and tools for data analysis. We gladly invite you to try it. [Get your free 14-day trial of SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and see it in action!

A few words about dbForge Studio

[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is an integrated development environment for SQL databases. Think SSMS, but with far more features at your service: IntelliSense-like code completion, easy formatting, smart code refactoring, comparison and synchronization of table data and entire schemas, test data generation, database design, visual query building, data analysis, database administration, and many more. [Get your free 30-day trial of dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html) and you'll see why users can't live without it!
"} {"url": "https://blog.devart.com/dbforge-tools-v7-1-released.html",
"product_name": "Unknown", "content_type": "Blog", "content": "[Product Release](https://blog.devart.com/category/product-release) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) dbForge Tools for SQL Server v7.1 Released: Extended Connectivity and Compatibility! By [dbForge Team](https://blog.devart.com/author/dbforge) April 7, 2025 [0](https://blog.devart.com/dbforge-tools-v7-1-released.html#respond) 861 We are excited to announce the release of our ultimate tools for database management, administration, development, and deployment – [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) . The new version, 7.1, mainly focuses on extended connectivity and compatibility. It allows SQL developers, database and system administrators to stay tuned and highly productive with the latest server features and quickly connect to databases without handling compatibility or connectivity issues. Let’s look at what’s been introduced in this update. SQL Server 2025 connectivity dbForge tools now support connectivity to SQL Server 2025, allowing you to perform database operations on the latest version with the same smooth compatibility and enhanced productivity you have experienced before. SSMS 21 Preview 5 integration dbForge tools are fully integrated into the SSMS 21 Preview 5, providing access to all the features and capabilities, including SQL Complete’s context-aware code completion, from a single environment for a better user experience. Windows Server 2025 compatibility With dbForge tools for SQL Server, you can take full advantage of managing databases in a stable environment on the Windows Server 2025 platform without worrying about compatibility issues. Availability Want to stay up to date? 
Download a free 30-day trial of the newly updated [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) from our website, give them a go in your daily work, and share your feedback. And if you are already using our tools, simply get the update the usual way and let us know what you think. In addition, we have updated dbForge Studio for SQL Server as part of [dbForge Edge](https://www.devart.com/dbforge/edge/), a multidatabase bundle that lets you perform a wide range of tasks on different database systems, including Microsoft SQL Server, MySQL, MariaDB, Oracle, PostgreSQL, and Amazon Redshift. With seamless cross-database support, an intuitive UI, and advanced features such as data generators and visual editors, dbForge Edge simplifies daily database tasks, reduces context switching, and improves overall development efficiency. "}

{"url": "https://blog.devart.com/dbfsmysql-nominated-devproconnections-as-best-ide-product-2011.html", "product_name": "Unknown", "content_type": "Blog", "content": "2011 DevProConnections nominates dbForge Studio for MySQL as Best IDE Product! By [dbForge Team](https://blog.devart.com/author/dbforge) August 16, 2011 dbForge Studio for MySQL – Best IDE Product. Support your favorite Devart products in the 2011 DevProConnections voting! We would like to invite you to take part in the 2011 DevProConnections voting. The Community Choice Awards, as presented by Windows IT Pro, SQL Server Magazine, and DevProConnections, let you decide which IT products get chosen for acclaim and recognition. This year, Devart [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is nominated in the following category.
2011 DevProConnections Community Choice Awards: Category 14 (page 2): Best IDE Product – dbForge Studio for MySQL "}

{"url": "https://blog.devart.com/debug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html", "product_name": "Unknown", "content_type": "Blog", "content":
"[Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Debug Easily in Visual Studio Using dbForge Fusion Plugin Debugger By [dbForge Team](https://blog.devart.com/author/dbforge) June 25, 2021 [0](https://blog.devart.com/debug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html#respond) 2690 The article presents a detailed overview of the Code Debugger functionality that comes with dbForge Fusion for MySQL – a powerful MySQL development add-in for Visual Studio. During the development of database elements such as procedures, functions, triggers, and mere SQL scripts, there is a high probability of errors. The more complex your code is, the more difficult it is to debug it. Thus, sometimes a significant part of the development time is spent on finding and fixing errors. What is debugging in Visual Studio? Visual Studio has an in-built debugger. It is supported in all versions of the Studio and the manufacturer claims it is the core functionality of the IDE. Like other debugging tools, Visual Studio debugger provides capabilities for main debugging tasks: live debugging, inserting breakpoints, stepping into individual statements (Step Into, Step Out, Step Over), evaluating and changing variable values, etc. Visual Studio debugger freezes and failures However, lots of users claim to be experiencing various issues when actively debugging – the debugger in Visual Studio simply isn’t working properly. Those problems vary from “Visual Studio cannot start debugging” to “the IDE freezes on debug mode”. That forces developers to look for alternatives. And one of those is the dbForge Fusion for MySQL plugin. Visual Studio Debugger in dbForge Fusion  for MySQL plugin dbForge Fusion for MySQL is designed to simplify MariaDB and MySQL database development and optimize data management. With the plugin, you can perform all database development and administration tasks directly from Visual Studio. 
Lightweight and flexible, dbForge Fusion for MySQL provides a convenient way to explore and maintain existing databases, design SQL statements and queries, and manipulate data in various ways. It contains a bunch of useful [MySQL tools for Visual Studio](https://www.devart.com/dbforge/mysql/fusion/features.html), the most valuable of which is the Code Debugger. The Code Debugger functionality that comes with dbForge Fusion for MySQL is bound to reduce the time spent searching for bugs and investigating why code fails.

Compatibility:

- Visual Studio 2019
- Visual Studio 2017
- Visual Studio 2015
- Visual Studio 2013

Debugging doesn't have to be painful, and we will prove it.

How to debug a stored procedure

Step 1. Compile a stored procedure for debugging

Before you start debugging a procedure, you need to compile it for debugging. To do so, right-click the procedure in Database Explorer, point to Debug, and then select Compile for Debugging, as shown in the screenshot below.

Step 2. Set breakpoints

Breakpoints mark the places in your code where program execution pauses, letting you examine debugging data and program behavior. Breakpoints can be simple (for example, pausing the program on reaching a specific line of code) or involve more complex logic (checking additional conditions, creating log messages, etc.). You can insert a breakpoint in one of the following ways:

- right-click a line of executable code where you want to set a breakpoint and select Insert Breakpoint
- place the cursor where you want to set a breakpoint and press F9
- click in the far left margin next to the line of code where you want to set a breakpoint

Step 3. Start debugging

Press F11 to start debugging. This is the Step Into command; it makes the app execute one statement at a time. If the line contains a call to a stored database object, you can go to the definition of that object.
All you need to do is place the cursor on that object and press F12, or right-click the object and click Go to Definition . In our case, the script calls a stored function. Step 4. Add variables You can trace the changes to variables on the Autos tab. The tool also allows you to add a variable or cursor to a separate watchlist and see how its values change as it is manipulated. If something goes wrong, you will immediately understand where the error occurs. You can also instantly see how variables change by hovering the cursor over them in the script. Step 5. Analyze the results Upon the successful completion of the debugging operation, you will see a window with the value returned by the procedure. Conclusion Debugging is definitely one of the most tiresome parts of any database development process. Writing code can be exciting, but combing through scripts in an attempt to find bottlenecks is far from that. That’s where a robust debugging tool is of great help. dbForge Fusion for MySQL is a feature-rich [MySQL Visual Studio plugin](https://www.devart.com/dbforge/mysql/fusion/) offering powerful debugging capabilities. Try it out and bring your MySQL and MariaDB database development to a whole new level. Stop compromising on code quality right now! 
[dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdebug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html) [Twitter](https://twitter.com/intent/tweet?text=Debug+Easily+in+Visual+Studio+Using+dbForge+Fusion+Plugin+Debugger&url=https%3A%2F%2Fblog.devart.com%2Fdebug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/debug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html&title=Debug+Easily+in+Visual+Studio+Using+dbForge+Fusion+Plugin+Debugger) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/debug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html&title=Debug+Easily+in+Visual+Studio+Using+dbForge+Fusion+Plugin+Debugger) [Copy URL](https://blog.devart.com/debug-easily-in-visual-studio-using-dbforge-fusion-plugin-debugger.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025"} {"url": "https://blog.devart.com/dedicated-link-source-control-repository.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Dedicated Link Source Control Repository By [dbForge 
Team](https://blog.devart.com/author/dbforge) May 6, 2021 [0](https://blog.devart.com/dedicated-link-source-control-repository.html#respond) 2604 This is the fifth part of an extensive article dedicated to database versioning. Before you proceed with it, we suggest that you read the previous parts: [Part I: Version Control with Git](https://blog.devart.com/version-control-system-version-control-with-git.html) [Part II: Database Versioning with Examples](https://blog.devart.com/database-versioning-with-examples.html) [Parts III & IV: Database Version Control Using Source Control for SQL Server](https://blog.devart.com/database-version-control-using-source-control-for-sql-server.html) In this final part, let’s review the Dedicated mode for repository connection. It implies that every developer works with their own local copy of the database. By the way, if you are still not familiar with dbForge Source Control, feel free to watch this [introductory video](https://youtu.be/YhyqFYy5_XI) . First, let’s unlink the JobEmplDB database from the source control repository: right-click the database, point to [Source Control](https://www.devart.com/dbforge/sql/source-control/), and then click Unlink Database from Source Control: Picture 1. Unlinking the database from the source control repository A new window will appear asking you to confirm the operation. Click OK: Picture 2. Confirm unlinking the database from the source control repository We have unlinked the JobEmplDB database. Now, let’s link it again to the source control repository: right-click the JobEmplDB database, point to Source Control, and then click Link Database to Source Control in the context menu: Picture 3. Linking the database to the source control repository In the window that appears, configure the necessary parameters and select the Dedicated mode. Click Link: Picture 4. 
Configure the database linking to the source control repository Also, if you want to know how to collaborate on the same database project using Devart’s Source Control for SQL Server, feel free to watch [this video](https://youtu.be/2lc4W2yiUf0) . Now, let’s change the Company table by adding the Source column. Use the following syntax: ALTER TABLE [dbo].[Company] ADD [Source] NVARCHAR(255); To commit the changes, right-click the JobEmplDB database again, point to Source Control, and then click Commit: Picture 5. Viewing changes for commit In the window that appears, we can see all the changes already selected. We only need to add a comment and click Commit to send them to the repository: Picture 6. Selected changes for commit When the commit process is over, we’ll get a new window informing us that the changes have been committed successfully. Click OK: Picture 7. Changes committed successfully Now, let’s check that the Company table definition has been changed in the repository at [GitHub](https://github.com/jobgemws/JobEmplDB/blob/master/Tables/dbo.Company.sql) : Picture 8. The Company table definition in the GitHub repository More information on the development models is available at [Shared vs Dedicated Development Models: Key Differences](https://blog.devart.com/shared-dedicated-development-models.html) . Conclusion In this article, we reviewed the Dedicated mode for repository connection using [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) . It can be integrated into [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15) to help the user version-control databases. Source Control is part of [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/download.html) developed by [Devart](https://www.devart.com/) . 
Tags [database versioning](https://blog.devart.com/tag/database-versioning) [dedicated database development model](https://blog.devart.com/tag/dedicated-database-development-model) [shared database development model](https://blog.devart.com/tag/shared-database-development-model) [source control](https://blog.devart.com/tag/source-control) [version control](https://blog.devart.com/tag/version-control) [versioning](https://blog.devart.com/tag/versioning) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/delete-duplicate-rows-in-mysql.html", "product_name": "Unknown", "content_type": "Blog", 
"content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Remove Duplicate Rows in MySQL By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) March 28, 2024 [0](https://blog.devart.com/delete-duplicate-rows-in-mysql.html#respond) 1936 Can the data stored in databases include identical records? Yes, it happens frequently. However, duplicate records in databases are a scenario that should be avoided: they pose an ongoing risk to data consistency and overall database efficiency. Database administrators (DBAs) spend a significant portion of their time identifying and removing these duplicates. This article explores the issue of duplicate records, including their origins, the effects they have on databases, and strategies for swiftly detecting and permanently removing them. Table of contents Duplicate records in MySQL: Causes and consequences Preparing the test table and the environment Method 1: Using DELETE JOIN Method 2: Using temporary tables Method 3: Using GROUP BY with aggregation Method 4: Using ROW_NUMBER() window function How visual database managers transform data handling Conclusion Duplicate records in MySQL: Causes and consequences Duplicate records in MySQL are identical rows within a specific SQL table. Before we explore methods to remove duplicate records, we need to understand their origins – early detection and prevention of duplicates is the most effective approach. Factors leading to duplicates include: Lack of unique identifiers: Fields that should be unique (such as user IDs, SSNs, email addresses, etc.) are crucial. The system should verify the uniqueness of each entry against existing records. Without this mechanism, duplicates are likely to occur. 
Insufficient validation checks: Unique identifiers alone don’t guarantee the absence of duplicates unless they are backed by strict requirements and integrity constraints. Data entry errors: Even with proper identifiers and validation checks in place, mistakes during data entry can still lead to duplicates. Ideally, each database record should be unique, representing a distinct entity. When records get duplicated, it leads to data redundancy and inconsistency: Data redundancy: This issue arises when the same data is stored multiple times, wasting storage space and causing confusion. Data inconsistency: Duplicates can corrupt the results of data retrieval operations. Unfortunately, no single method can completely prevent duplicate records. The focus is on reducing their occurrence and manually addressing them when they arise. Consequently, DBAs face a dual challenge: identify and eliminate duplicate records, and mitigate their effects on the dataset. Let’s examine some practical examples of detecting and deleting duplicates. Preparing the test table and the environment We will use the standard MySQL test database sakila . To demonstrate various methods of deleting duplicate records, we also apply [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) , a powerful, multi-featured integrated development environment (IDE) designed to handle all database tasks in MySQL and MariaDB. In the sakila test database, we have created a new test table called customer_movie_rentals . It has duplicate records: We can also check for duplicates using the below script – the output will inform us how many duplicate rows the table has. 
This option is convenient when dealing with large tables:

SELECT
  customer_id,
  first_name,
  last_name,
  COUNT(customer_id)
FROM customer_movie_rentals
GROUP BY customer_id
HAVING COUNT(customer_id) > 1;

Another popular method of searching for duplicates involves the EXISTS statement:

-- EXISTS
SELECT t1.duplicate_column
FROM table_name t1
WHERE EXISTS (
  SELECT 1
  FROM table_name t2
  WHERE t1.duplicate_column = t2.duplicate_column
    AND t1.unique_column < t2.unique_column);

In our case, we modify this query to target the customer_movie_rentals table. The EXISTS query will find records with the same customer_id , film_id , and rental_date values but with a different rental_id (assuming rental_id is the unique identifier for each rental):

SELECT t1.customer_id, t1.first_name, t1.last_name, t1.film_id, t1.title, t1.rental_date
FROM customer_movie_rentals t1
WHERE EXISTS (
  SELECT 1
  FROM customer_movie_rentals t2
  WHERE t1.customer_id = t2.customer_id
    AND t1.film_id = t2.film_id
    AND t1.rental_date = t2.rental_date
    AND t1.rental_id != t2.rental_id);

The output shows us all duplicate records in the customer_movie_rentals table: Now we need to get rid of those duplicates. Several methods are available in MySQL: DELETE JOIN, temporary tables, GROUP BY with aggregation, and the ROW_NUMBER() window function. Let us try these methods and consider the best scenarios for each of them. Method 1: Using DELETE JOIN One of the most common methods of deleting duplicate records from MySQL tables is DELETE JOIN – the INNER JOIN clause in the DELETE statement. The DELETE JOIN method allows for deleting rows matching other rows from the same table. The basic syntax of this query is as follows:

DELETE t1
FROM table_name AS t1
INNER JOIN table_name AS t2
WHERE t1.unique_column < t2.unique_column
  AND t1.duplicate_column = t2.duplicate_column;

Parameters: t1 and t2 are aliases for the table containing duplicate rows. 
These aliases are necessary to represent two logical instances of the same table to allow the SELF JOIN operation. INNER JOIN is necessary to find duplicate rows in the table in question. The WHERE clause is the condition specifying which rows should be deleted. In this case, the query should keep the first record and delete the subsequent duplicate records. Now, our goal is to delete the duplicates from our customer_movie_rentals table using the DELETE JOIN statement. The query is as follows:

DELETE cr1
FROM customer_movie_rentals cr1
INNER JOIN customer_movie_rentals cr2
  ON cr1.customer_id = cr2.customer_id
  AND cr1.rental_id > cr2.rental_id;

In that query, cr1 and cr2 are aliases for the customer_movie_rentals table, and the condition cr1.rental_id > cr2.rental_id serves to retain only the row with the lowest rental_id value for each group of duplicates. Here is the output of the query: The duplicates for the customer_id values 333, 222, 126, 142, and 319 have been successfully deleted. Using DELETE JOIN is most suitable for removing duplicates based on a specific join condition, especially when it is clear how tables (or data portions within one table) relate to each other. This method is also useful for complex queries, as it lets you control precisely which records to delete. Learn [how to restore a MySQL database using CLI](https://blog.devart.com/how-to-restore-mysql-database-from-backup.html) to revert the changes. Method 2: Using temporary tables [Temporary tables in MySQL](https://www.devart.com/dbforge/mysql/studio/mysql-temporary-table.html) are a good choice when you need to retrieve, manipulate, and store some portion of data temporarily without cluttering the database. They can also help us detect and remove duplicates. As a rule, the process of deleting duplicate records with the help of temporary tables is as follows: Create a new temporary table with the same structure as the original table (the one with duplicates to delete). 
Insert unique rows from the original table into the temporary table. Truncate the original table and insert the unique rows back into it from the temporary table. This method is helpful when the duplicate rows are identical in all column values. Then we can use the SELECT DISTINCT command to copy only the unique rows into the temp table. See the test table in the screenshot below. As you can see, some rows have identical values in all columns. First, we create a temp table and copy only the unique rows from the customer_movie_rentals table into it:

-- 1. Create a temporary table
CREATE TEMPORARY TABLE temp_customer_movie_rentals LIKE customer_movie_rentals;

-- 2. Insert distinct rows into the temporary table
INSERT INTO temp_customer_movie_rentals
SELECT DISTINCT *
FROM customer_movie_rentals;

-- 3. View the temp table
SELECT * FROM temp_customer_movie_rentals;

We can see that the temp table has only 15 rows, and all of them are unique. Now we can transfer this dataset back into the original table.

-- 1. Truncate the original table
TRUNCATE TABLE customer_movie_rentals;

-- 2. Insert records back into the original table
INSERT INTO customer_movie_rentals
SELECT * FROM temp_customer_movie_rentals;

-- 3. View the original table
SELECT * FROM customer_movie_rentals;

As you can see, the original customer_movie_rentals table now has 15 rows instead of 20 and no longer contains duplicates. The advantage of using a temporary table to delete duplicate rows is better data safety: you copy the unique records into a new temp table without affecting the original table. You can then view the records in the temp table separately, compare them with the original table, and make sure that you have all the necessary data. Method 3: Using GROUP BY with aggregation The purpose of the [GROUP BY clause in MySQL](https://blog.devart.com/mysql-group-by-tutorial.html) is to group table rows by one or several columns. 
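As a quick, standalone illustration of how grouping surfaces duplicates, here is a minimal sketch using Python's built-in sqlite3 module and an invented rentals table (the table and column names are ours for demonstration, not the article's sakila setup):

```python
import sqlite3

# In-memory database with a toy table containing deliberate duplicates.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rentals (
        rental_id   INTEGER PRIMARY KEY,
        customer_id INTEGER,
        film_id     INTEGER
    )
""")
conn.executemany(
    "INSERT INTO rentals (rental_id, customer_id, film_id) VALUES (?, ?, ?)",
    [(1, 100, 5), (2, 100, 5), (3, 200, 7), (4, 300, 9), (5, 300, 9)],
)

# GROUP BY + HAVING flags every (customer_id, film_id) pair
# that occurs more than once, along with its occurrence count.
dupes = conn.execute("""
    SELECT customer_id, film_id, COUNT(*) AS cnt
    FROM rentals
    GROUP BY customer_id, film_id
    HAVING COUNT(*) > 1
    ORDER BY customer_id
""").fetchall()

print(dupes)  # [(100, 5, 2), (300, 9, 2)]
```

The same GROUP BY/HAVING shape works unchanged in MySQL; only the connection setup differs.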
When dealing with duplicate rows, we can apply GROUP BY to identify the duplicates and then construct a query to delete them. This procedure involves several steps: Define the criteria for identifying duplicate rows. For instance, when you have a set of rows with identical values in all relevant columns, these rows are duplicates. Define the criteria for the row to keep when deleting duplicates. Usually, it is the smallest or the largest unique identifier. Delete all rows except for those matching the specified criteria. The below script uses the aggregate MIN() function to determine the smallest unique identifier for each group of duplicates, treats those rental_id values as the unique rows to keep, and deletes all remaining rows from the customer_movie_rentals table:

DELETE FROM customer_movie_rentals
WHERE rental_id NOT IN (
  SELECT rental_id
  FROM (
    SELECT MIN(rental_id) AS rental_id
    FROM customer_movie_rentals
    GROUP BY CONCAT(customer_id, first_name, last_name)
  ) AS duplicate_customer_ids);

Using GROUP BY with aggregate functions is suitable when you want to identify duplicates based on a specific combination of columns and the criteria for duplicates are clear. However, it involves subqueries, which is not the most efficient option, especially when dealing with large tables. Therefore, it is not as straightforward as using temporary tables or DELETE JOIN in MySQL. Method 4: Using ROW_NUMBER() window function The ROW_NUMBER() function is one of the basic [MySQL window functions](https://www.devart.com/dbforge/mysql/studio/mysql-window-functions.html) that assigns a sequential number to each row within a partition, starting from 1. If a group contains duplicates, every row after the first receives a number greater than 1, which lets us quickly identify duplicate records. 
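This ROW_NUMBER() pattern can be exercised end-to-end in a small sketch. The example below uses Python's sqlite3 module (SQLite has supported window functions since version 3.25) and an invented rentals table; it keeps the lowest rental_id per customer and deletes the rest:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE rentals (rental_id INTEGER PRIMARY KEY, customer_id INTEGER)"
)
conn.executemany(
    "INSERT INTO rentals VALUES (?, ?)",
    [(1, 100), (2, 100), (3, 200), (4, 300), (5, 300), (6, 300)],
)

# Number each row within its customer_id partition; row_num > 1 marks
# a duplicate, so the DELETE keeps only the first row of each group.
conn.execute("""
    DELETE FROM rentals
    WHERE rental_id IN (
        SELECT rental_id FROM (
            SELECT rental_id,
                   ROW_NUMBER() OVER (
                       PARTITION BY customer_id
                       ORDER BY rental_id
                   ) AS row_num
            FROM rentals
        )
        WHERE row_num > 1
    )
""")

survivors = [r[0] for r in conn.execute(
    "SELECT rental_id FROM rentals ORDER BY rental_id")]
print(survivors)  # [1, 3, 4]
```

Ordering the window by rental_id (rather than customer_id) makes "keep the lowest rental_id" explicit; the MySQL query in the article achieves the same effect on its own data.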
Let us apply ROW_NUMBER() to our customer_movie_rentals table:

SELECT
  rental_id,
  ROW_NUMBER() OVER (
    PARTITION BY customer_id
    ORDER BY customer_id
  ) AS row_num
FROM customer_movie_rentals;

The output shows us the list of all rental_id values, and we can quickly identify the duplicate ones by simply sorting the results by the row_num column: Now we can remove the duplicates using the DELETE command with a subquery:

DELETE FROM customer_movie_rentals
WHERE rental_id IN (
  SELECT rental_id
  FROM (
    SELECT
      rental_id,
      ROW_NUMBER() OVER (
        PARTITION BY customer_id
        ORDER BY customer_id
      ) AS row_num
    FROM customer_movie_rentals cmr
  ) t
  WHERE row_num > 1);

The output shows us no duplicates in the table. The ROW_NUMBER() window function lets us identify duplicate rows by assigning them sequential numbers within each partition of the result set. This method is applicable when you have a complex set of duplicate criteria. Again, using the DELETE command with subqueries can take more time to construct and more resources to execute. How visual database managers transform data handling We used dbForge Studio for MySQL to demonstrate how to remove duplicate rows, and this is just one of many tasks the tool handles. dbForge Studio is a comprehensive integrated development environment (IDE) that is [more advanced than the default MySQL Workbench](https://www.devart.com/dbforge/mysql/studio/alternative-to-mysql-workbench.html) . It provides users with a wide range of tools for managing MySQL and MariaDB databases, from SQL coding to version control, and more. In our work, we primarily applied the coding assistance module of dbForge Studio, which offers extensive features to accelerate coding tasks, such as phrase auto-completion, context-sensitive suggestions, syntax checking, code formatting, and more. The functionality available in this module aims to enhance efficiency and ensure high-quality results. 
The software’s graphical user interface presents query results in an easily interpretable format. Specifically, it enabled us to visually identify duplicate rows and formulate queries with greater precision to eliminate them. While this example highlights just one aspect of dbForge Studio for MySQL, the IDE has consistently demonstrated its value to database developers, managers, and administrators globally, streamlining their tasks, optimizing workflows, and reducing manual labor. Conclusion While it is crucial to implement efficient data validation and cleaning processes to minimize the occurrence of duplicate records, some level of data entry errors causing these duplicates may still occur. There are multiple approaches to detecting and removing duplicate rows in MySQL tables, and we have reviewed them in this article. The choice depends on your unique conditions, work requirements, and preferences. As eliminating duplicates is necessary to maintain data integrity, modern tools like dbForge Studio for MySQL can simplify this process, making it quicker and more efficient. This Studio is a comprehensive solution for managing database tasks, and you can assess its effectiveness in your work environment with a [30-day free trial](https://www.devart.com/dbforge/mysql/studio/download.html) . This trial provides full access to advanced features and professional support from the vendor, so you can integrate this tool effectively into your workflows. Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms. 
"} {"url": "https://blog.devart.com/delphi-component-for-dbf.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) How to Access DBF Databases in Delphi By [DAC Team](https://blog.devart.com/author/dac) October 30, 2019 [0](https://blog.devart.com/delphi-component-for-dbf.html#respond) 5056 xBase is the generic term for all databases deriving from the original dBase database format (.dbf). The list includes Visual FoxPro, Clipper, dBase III, dBase IV, and many others. These databases are informally known as dBase clones because they are either direct descendants of dBase or mimic it. 
xBase databases store large amounts of formatted data in a structured form in .dbf data files. In dBase-like databases, a .dbf file stores a single table where the table description, field descriptors, and records are kept. Modern dBase-like databases also have files for large text fields (memos), commands, procedures, backups, etc. There are various database engines that can read and manipulate data in DBF files, but none of them understands all formats of xBase databases – most of these database engines can interact with one or two dialects of the xBase family. Unlike other existing solutions, the Delphi data access provider TDBFUniProvider in [UniDAC](https://www.devart.com/unidac/) provides an engine that understands DBF files across many dBase-like databases. TDBFUniProvider provides direct access to xBase databases and supports all dBase native data types (character, numeric, logical, date, memo). It serves as a SQL engine that executes your commands against database files. The Delphi code in your project is compiled into an executable file that doesn’t need any other external files to access and manipulate data in DBF files. The [UniDAC Delphi component](https://www.devart.com/unidac/) for accessing xBase database files supports multiple database file formats: dBase III, dBase IV, dBase V, dBase VII, FoxPro2, Visual FoxPro, Clipper, CodeBase, HiPer-Six. Besides allowing a developer to use the standard SQL-92 syntax, it offers a fast way to rebuild a table and remove deleted records to reduce the database file size. Creating a Sample Delphi App to Access xBase Databases We’ll create a simple Delphi application that connects to a database in Visual FoxPro format and displays records from a table when you click the Display button. 1. Find the TUniConnection, TUniQuery, TUniDataSource, TDBFUniProvider, TDBGrid and TButton components in the Tool Palette and drop them on the form. 2. Double-click the UniConnection1 component on the form. 
Switch to the Options tab and set Direct to True. 3. Go back to the Connect tab, choose DBF as Provider, and enter the path to the Visual FoxPro (or any other xBase) database on your machine. Click Connect. If everything goes well, the red circle will turn green. 4. In UniDataSource1, set the DataSet property to UniQuery1. 5. Select DBGrid1 and set the DataSource property to UniDataSource1. 6. Choose the UniQuery1 component and set the Connection property to UniConnection1, then double-click the component and enter your SQL statement. Click OK. 7. Change the Caption property of the button to Display in the Object Inspector. Double-click the button, switch to the Code tab, and add UniQuery1.Open; to the OnClick event handler code. 8. Press F9 to compile and run your application. If the program compiles without errors, you should see the compiled form application. Click Display to fetch and view the data in your Visual FoxPro table. Retrieval of Corrupted Data and Metadata xBase dialects have a long history, and situations where fields in .dbf files contain data of an unsupported type are not so rare. To resolve any issues with unsupported data types, UniDAC provides two options: IgnoreDataErrors and IgnoreMetaDataErrors . The former forces UniDAC to ignore corrupted data errors when opening a DBF table, and the latter forces UniDAC to ignore metadata errors. When both options are set to True, corrupted data will be skipped, and the remaining data will be properly retrieved. Another feature is the ability to automatically determine the dialect of an xBase database when you don’t know the exact format of your DBF files – the dfAuto value in the DBFFormat option. 
Supported Target Platforms UniDAC supports multiple target platforms: you can create an application that accesses DBF databases on Windows (32-bit and 64-bit), macOS (32-bit and 64-bit), iOS (32-bit and 64-bit), Android (32-bit and 64-bit), and Linux (32-bit and 64-bit). Tags [clipper](https://blog.devart.com/tag/clipper) [dac](https://blog.devart.com/tag/dac) [dbase](https://blog.devart.com/tag/dbase) [dbf](https://blog.devart.com/tag/dbf) [delphi](https://blog.devart.com/tag/delphi) [foxpro](https://blog.devart.com/tag/foxpro) [visual foxpro](https://blog.devart.com/tag/visual-foxpro) [xbase](https://blog.devart.com/tag/xbase) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/delphi-dac-for-firebird-4.html", 
"product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) Delphi DAC for Firebird 4: Using the new INT128, DECFLOAT, and large NUMERIC types By [DAC Team](https://blog.devart.com/author/dac) October 5, 2021 [0](https://blog.devart.com/delphi-dac-for-firebird-4.html#respond) 2812 This article describes how to work with Firebird 4 in Delphi data access components. The latest versions of [IBDAC](https://www.devart.com/ibdac/) and [UniDAC](https://www.devart.com/unidac/) support Firebird 4 and the new data types: INT128, DECFLOAT, and NUMERIC. INT128 is a 128-bit integer data type that isn’t specified in the SQL standard. It’s worth mentioning that unsigned integer data types aren’t supported by Firebird. DECFLOAT is a data type based on the SQL standard that stores decimal floating-point values precisely. It should be used when you need to calculate and store numbers exactly, which makes DECFLOAT a great option for financial applications, for example, where the precision of calculations is a key requirement. In accordance with IEEE standards, Firebird implements 16-digit and 34-digit DECFLOAT encodings; however, all interim calculations are performed on 34-digit values. NUMERIC is a fixed-point number stored in 16, 32, 64, or 128 bits, depending on the precision. NUMERIC limits are the following: precision must be from 1 to 38 and defines the number of digits to store; scale must be from 0 to 38 and sets the number of digits after the decimal point. There is one important condition for scale: it must be less than or equal to the precision. The new data types work as follows. By default, INT128, DECFLOAT, and NUMERIC with PRECISION 19 or more are returned as strings and are represented in the DataSet as a TStringField, since TFMTBCDField fields are not created if the EnableFMTBCD option is not enabled. 
The goal is to maintain backward compatibility with previous versions of Firebird and with UniDAC- and IBDAC-based applications that are being migrated to the new version of the components. To represent INT128, DECFLOAT, and NUMERIC with precision 19 or more as TFMTBCDField in a DataSet, and to have parameters of these types described as ftFMTBCD, set the following connection option:

for IBDAC:

```pascal
IBCConnection.Options.EnableFMTBCD := True;
```

for UniDAC:

```pascal
UniConnection.Options.EnableFMTBCD := True;
```

You can also use the data type mapping feature to fine-tune the mapping. For example, if you want to represent INT128 as ftLargeint, DECFLOAT as ftFloat, and NUMERIC with precision 19 or more as ftFMTBCD, set the following mapping rules:

for IBDAC:

```pascal
// INT128
IBCConnection.DataTypeMap.AddDBTypeRule(ibcInt128, ftLargeint);
// DECFLOAT
IBCConnection.DataTypeMap.AddDBTypeRule(ibcDecFloat16, ftFloat);
IBCConnection.DataTypeMap.AddDBTypeRule(ibcDecFloat34, ftFloat);
// DECIMAL and NUMERIC with PRECISION 19 and more
IBCConnection.DataTypeMap.AddDBTypeRule(ibcDecimal, 19, 38, ftFMTBcd);
IBCConnection.DataTypeMap.AddDBTypeRule(ibcNumeric, 19, 38, ftFMTBcd);
```

for UniDAC:

```pascal
// INT128
UniConnection.DataTypeMap.AddDBTypeRule(ibcInt128, ftLargeint);
// DECFLOAT
UniConnection.DataTypeMap.AddDBTypeRule(ibcDecFloat16, ftFloat);
UniConnection.DataTypeMap.AddDBTypeRule(ibcDecFloat34, ftFloat);
// DECIMAL and NUMERIC with PRECISION 19 and more
UniConnection.DataTypeMap.AddDBTypeRule(ibcDecimal, 19, 38, ftFMTBcd);
UniConnection.DataTypeMap.AddDBTypeRule(ibcNumeric, 19, 38, ftFMTBcd);
```

For more information about data type mapping, see [Data Type Mapping in Delphi DAC](https://blog.devart.com/data-type-mapping-in-delphi-data-access-components.html). The data mapping rules can also be set up at design time.
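As an illustration of reading one of these types once the option is set, here is a minimal sketch. The ACCOUNTS table and its NUMERIC(38, 2) BALANCE column are hypothetical, not from the article:

```pascal
// Hypothetical example: assumes IBCConnection.Options.EnableFMTBCD = True
// and a table ACCOUNTS with a NUMERIC(38, 2) column BALANCE.
// uses FMTBcd, Dialogs;
procedure ShowBigNumeric(Query: TIBCQuery);
begin
  Query.SQL.Text := 'SELECT BALANCE FROM ACCOUNTS';
  Query.Open;
  // With EnableFMTBCD enabled the column arrives as an ftFMTBcd field,
  // so the full-precision value can be read losslessly as a TBcd
  ShowMessage(BcdToStr(Query.FieldByName('BALANCE').AsBCD));
  Query.Close;
end;
```

Without the option, the same FieldByName call would return a TStringField, and the value would have to be parsed from its string representation.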
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in Delphi DAC: Support for RAD Studio 10.4.2 and Multiple Performance Improvements By [DAC Team](https://blog.devart.com/author/dac) March 3, 2021

Following last week's release of RAD Studio 10.4.2 Sydney, we are excited to announce support for the new versions of the Delphi and C++Builder IDEs in our data access [components](https://www.devart.com/dac.html). DAC products are now also compatible with macOS Big Sur, iOS 14, and Android 11. Additionally, PgDAC now supports PostgreSQL 13.

Data access speed with default settings was significantly increased in LiteDAC and the SQLite provider. The performance of batch operations was improved in all DAC products: during batch processing, SQL statements are grouped into a single unit of work, known as a batch, and submitted to the database server in a single call, thereby reducing network latency. We also reduced memory consumption in batch operations for InterBase and Firebird.

LOB read/write speed was improved for Oracle, SQL Server, DBF files, and ODBC drivers. Data fetch speed was also improved for Oracle and ODBC drivers. The PrefetchRows property, which lets you set the number of rows to be prefetched during query execution, is now supported in the Direct mode (previously it was available only in the OCI mode). The FindFirst, FindNext, FindLast, and FindPrior methods for searching records in a dataset using filters now work much faster in all DACs.

The Over-the-Wire (OTW) encryption feature of InterBase was supported in IBDAC, letting you secure your data in transit with SSL/TLS encryption. Automatic detection of computed fields when generating update statements was also improved in IBDAC.
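The batch pattern described above can be sketched as follows. The BATCH_TEST table, the row count, and the data are hypothetical; the Params.ValueCount/Execute pattern follows Devart's documented batch interface:

```pascal
// Sketch of a DAC batch INSERT (UniDAC shown); one parameterized statement
// is bound to arrays of values and sent to the server in a single call.
// uses SysUtils, Uni;
procedure BatchInsertDemo(Con: TUniConnection);
var
  Q: TUniQuery;
  i: Integer;
begin
  Q := TUniQuery.Create(nil);
  try
    Q.Connection := Con;
    Q.SQL.Text := 'INSERT INTO BATCH_TEST (ID, NAME) VALUES (:ID, :NAME)';
    Q.Params.ValueCount := 1000;          // allocate 1000 sets of parameter values
    for i := 0 to 999 do
    begin
      Q.Params[0][i].AsInteger := i;
      Q.Params[1][i].AsString := 'Name ' + IntToStr(i);
    end;
    Q.Execute(1000);                      // one round trip for all 1000 rows
  finally
    Q.Free;
  end;
end;
```

Compared with executing the INSERT 1000 times in a loop, this pays the network latency once instead of per row, which is where the speedup comes from.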
In PgDAC, a single TPgConnection object can now be used in multiple threads, and the new MultipleConnections property allows or denies the creation of additional internal connections. Three new properties were added in LiteDAC: JournalMode, LockingMode, and Synchronous. The TVirtualQuery component now supports the Unicode standard. The LastWarningCount property and the OnWarning event were added in MyDAC to get the number of warnings received from the MySQL server and to define an event handler method.

Devart Delphi Data Access Components are suites of components for direct access to common databases and cloud services that allow creating multi-platform database applications in Embarcadero RAD Studio, Delphi, C++Builder, Lazarus, and Free Pascal for Windows, Linux, macOS, iOS, and Android, both 32-bit and 64-bit.

To try out these features, you are welcome to download the new versions of our data access components:

- [UniDAC 8.4](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
- [ODAC 11.4](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
- [SDAC 9.4](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
- [MyDAC 10.4](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
- [IBDAC 7.4](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
- [PgDAC 6.4](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
- [LiteDAC 4.4](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
- [VirtualDAC 11.4](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]

Join our [forum](https://forums.devart.com/viewforum.php?f=42) to discuss database application development.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in Delphi DAC: Support for the Latest Versions of Lazarus and FPC By [DAC Team](https://blog.devart.com/author/dac) August 28, 2020

We are excited to announce that new versions of Delphi data access components are now generally available. All DAC products received support for the latest versions of Lazarus (2.0.10) and the Free Pascal Compiler (3.2.0). Industry-standard security features were added to ODAC: SSL/TLS, SSH, and HTTP/HTTPS tunneling. ODAC relies on components of our other product, SecureBridge, to establish secure connections to Oracle Database. We significantly improved the performance of Insert, Update, and Delete operations in all products. The latest versions of Oracle (20c) and SQL Server (2019) are supported in this release. We also implemented support for native dBase functions in SQL statements in the DBF provider of UniDAC.
Devart Delphi Data Access Components are suites of components for direct access to multiple databases and cloud services, which allow developing multi-platform database applications in Embarcadero RAD Studio, Delphi, C++Builder, Lazarus, and Free Pascal for Windows, Linux, macOS, iOS, and Android, both 32-bit and 64-bit. You are welcome to download and try the new versions.

- [UniDAC 8.3](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
- [ODAC 11.3](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
- [SDAC 9.3](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
- [MyDAC 10.3](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
- [IBDAC 7.3](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
- [PgDAC 6.3](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
- [LiteDAC 4.3](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
- [VirtualDAC 11.3](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]

Join our [forum](https://forums.devart.com/viewforum.php?f=42) to discuss database application development.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) Delphi Data Access Components for RAD Studio 11 Alexandria with support for Firebird 4 and Apple M1 By [DAC Team](https://blog.devart.com/author/dac) September 13, 2021

Only a few days after the release of RAD Studio 11 Alexandria, we are ready to share the availability of Delphi Data Access Components for the new RAD Studio, with support for Firebird 4 and the ARM platform (Apple M1). Our components fully support all the new features introduced in RAD Studio Alexandria. Devart is the first vendor of Delphi components to support Firebird 4, which was released on June 1, 2021. Firebird 4 introduces new data types and many improvements, such as logical replication, longer metadata identifiers, timeouts for connections and statements, and more.

Other important updates include new data providers for BigQuery and HubSpot in UniDAC. You will be able to connect to these data sources from your Delphi or C++Builder applications, provided that the respective drivers are installed on the computer. The latest version of NexusDB, 4.50.27, is now supported in UniDAC. The IntegerAsLargeInt connection option, which maps SQLite INTEGER columns to fields of type ftLargeInt, was added in LiteDAC and UniDAC. Also, a new demo project for FastReport FMX was added to all DAC products.
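Enabling the IntegerAsLargeInt option follows the usual pattern for connection options; the exact property paths below are assumptions, so check the LiteDAC and UniDAC documentation for your version:

```pascal
// LiteDAC: option exposed on the connection object (assumed path)
LiteConnection1.Options.IntegerAsLargeInt := True;

// UniDAC: SQLite provider options go through SpecificOptions (assumed key)
UniConnection1.SpecificOptions.Values['SQLite.IntegerAsLargeInt'] := 'True';
```

With the option enabled, SQLite INTEGER columns are delivered as ftLargeInt fields instead of ftInteger, which avoids overflow for 64-bit values.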
You are welcome to download the new versions:

- [UniDAC 9.0](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
- [ODAC 12.0](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
- [SDAC 10.0](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
- [MyDAC 11.0](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
- [IBDAC 8.0](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
- [PgDAC 7.0](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
- [LiteDAC 5.0](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
- [VirtualDAC 12.0](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in Delphi DAC: Support for RAD Studio 10.4 By [DAC Team](https://blog.devart.com/author/dac) June 2, 2020

We are very excited to announce the release of our Delphi DAC products with support for the newly released RAD Studio 10.4 Sydney. We also have exciting news for users of Lazarus: our Delphi data access components now officially support macOS 64-bit in the latest version of Lazarus, 2.0.8.

Other updates in this release include:

- support for the Pipe, Secure Pipe, and Secure TCP protocols in NexusDB;
- support for the Line geometric type in PgDAC;
- a new AllFieldsAsNullable option in DBF.

You are welcome to download and try the new versions.

- [UniDAC 8.2](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
- [ODAC 11.2](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
- [SDAC 9.2](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
- [MyDAC 10.2](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
- [IBDAC 7.2](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
- [PgDAC 6.2](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
- [LiteDAC 4.2](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
- [VirtualDAC 11.2](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]

You are welcome to join our [forum](https://forums.devart.com/viewforum.php?f=42) to discuss database application development.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) Delphi InterBase: Comprehensive Guide to Features, Installation, and Usage By [Victoria Shyrokova](https://blog.devart.com/author/victorias) December 27, 2024

Delphi InterBase is a well-known SQL database for developers who want to embed a lightweight database management system into their cross-platform applications. Its popularity comes from its high performance, fast data processing, and seamless integration with Delphi, offering a hassle-free setup without the administrative overhead of other systems. And the whole process is even easier if you use the right tools.

In this article, we'll walk you through everything you need to know about Delphi InterBase, including the step-by-step integration process. To help you get your projects up and running faster, we'll use InterBase Data Access Components (IBDAC) for the integration. It's a comprehensive library that connects your IBDAC-based apps straight to the InterBase server via the native client, with no need for extra layers or additional data access solutions.
Table of contents

- Key features of InterBase
- Editions and licensing options
- Getting started with InterBase 2020
- Using InterBase with Delphi
- Conclusion

Key features of InterBase

InterBase is built on the following elements:

- Native SQL capabilities: InterBase supports SQL standards, including complex joins and Unicode, along with advanced features like stored procedures and triggers.
- High database performance: Its multigenerational architecture allows multiple users to access data simultaneously without locking, which is crucial for high-traffic applications.
- Fast speed: InterBase uses modern multicore CPUs for quick data access and updates while maintaining a small footprint, with an option to run entirely in memory.
- Flexible data accessibility: It allows cross-platform deployment with a consistent on-disk format, plus offline capabilities and easy synchronization.
- Admin-free management: It handles thousands of transactions per second without constant backups, featuring quick crash recovery and incremental backups for data safety.
- Robust security: It provides AES 256-bit encryption for data in transit and at rest, along with a separate security login system for easy user access control.
- Reliable disaster recovery: It lets you perform live backups without interrupting users, allowing quick data dumps and multithreaded restores to ensure swift recovery from failures.
- Efficient replication: InterBase's Change Views feature allows you to track changes at the field level easily, creating subscriptions for specific tables and columns to ensure secure updates.

Editions and licensing options

InterBase offers [four editions](https://www.embarcadero.com/products/interbase/product-editions) to choose from. Take a look at the table below to see the pricing and licensing options available:

| InterBase Edition | Pricing | Licensing |
| --- | --- | --- |
| Developer | Free | Up to 20 users and 8 CPUs. |
| IBLite | Free | 1 user with a maximum of 1 CPU. |
| IBToGo | $64 per new user; $32 per upgrade user. | 1 user and up to 4 CPUs. |
| Desktop | $64 per new user; $32 per upgrade user. | 1 user and up to 4 CPUs; limited to local database connections. |
| Server | Custom | Depends on purchase, up to 32 CPUs. |

Getting started with InterBase 2020

InterBase 2020 doesn't require extensive configuration. If you choose the Desktop or Server edition, the installation process is typically a simple wizard-driven setup. You can also integrate it into your custom installer. Things are even easier if you go for the ToGo or IBLite editions: you just have to link to the InterBase libraries.

Installation requirements

Before going all in, double-check the installation requirements for your InterBase edition. While all versions come with RAD Studio support, there are some differences in terms of what operating systems and hardware they work with.

InterBase Developer, Desktop, and Server editions:

- A Windows (from 2008 to 11) or Linux (RHEL 8, SuSE 11.3, or Ubuntu 18 or 20) system with an Intel x86/x86-64 processor
- About 250 MB of free space on your hard disk

InterBase ToGo edition:

- Works on the same Windows and Linux versions, plus macOS (10.15 to 12), Android (8 to 13), and iOS (14 to 17)
- If you're using a Java-based app, install the Java Runtime Environment (JRE) library

To learn more about the specific installation requirements for your edition, check Embarcadero's [InterBase documentation](https://docwiki.embarcadero.com/InterBase/2020/en/Installation,_Registration,_and_Licensing_Information#System_Requirements.2FPrerequisites).

Installation

Follow the steps below to install InterBase through the installer:

- Download your InterBase edition from the official site. Keep your serial number and Embarcadero Developer Network username and password for later.
- Double-click install_windows.exe (Windows) or ib_install.exe (macOS).
- Click Install InterBase 2020 once the installer launches. On Unix, you must first provide your administrator password.
- Follow the on-screen instructions.
- In the Choose Install Set panel, select whether you want a Server and Client or Client only install. Then you'll be asked if you want to set up multiple instances; choose Yes or No and click Next.
- In the next panel, you can either change the installation location or keep the default.
- After that, review your choices in the Pre-Installation Summary. If everything looks good, click Install.
- Once the installation is complete, the InterBase License Registration window pops up. Add your serial number, username, and password, then click Finish.

If you want to do a silent install or need help troubleshooting any issues that come up during the process, read [InterBase's Installation guide](https://docwiki.embarcadero.com/InterBase/2020/en/Installation,_Registration,_and_Licensing_Information#Installation_and_Registration).

Using InterBase with Delphi

Delphi integration with InterBase can be clunky if you use dbExpress, the BDE, or InterBase Express: these usually complicate your setup with additional data access layers and software. In contrast, [IBDAC](https://www.devart.com/ibdac/) connects directly to the InterBase server through the client software, which simplifies deployment and lets you write more streamlined applications. It supports the full range of InterBase features and comes with optimized data access, so you can create apps with faster connection reopening, better query execution, and smooth updates. If your project uses the BDE or IBX, you can migrate using the Migration Wizard and automate the process to save time.

Other key features include integration with RAD Studio and Delphi 6+ among other IDEs, offline capabilities, automatic data updates, and support for building client/server applications using Delphi or C++Builder Professional. Plus, IBDAC has extensive documentation, quick technical support, and plenty of demo projects to help you get started.
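To give a sense of the direct-connection approach, here is a minimal runtime connection sketch. Every server detail is a placeholder, and the property names are taken from IBDAC's TIBCConnection as an assumption, so verify them against the IBDAC documentation for your version:

```pascal
// Minimal runtime connection; all values below are placeholders.
// uses Dialogs, IBC;
var
  Con: TIBCConnection;
begin
  Con := TIBCConnection.Create(nil);
  try
    Con.Server := 'localhost';
    Con.Port := 3050;                    // default InterBase port
    Con.Database := 'C:\data\demo.ib';
    Con.Username := 'SYSDBA';
    Con.Password := 'masterkey';
    Con.Connect;                         // equivalent to Connected := True
    ShowMessage('Connected to ' + Con.Database);
  finally
    Con.Free;
  end;
end;
```

The same properties can be filled in visually in the Object Inspector instead, which is the design-time route described below in the article.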
Integrating InterBase in Delphi applications

With IBDAC’s built-in Delphi support and easy database integration, you can speed up application development and connect your project to InterBase in minutes. Here’s an overview of the process:

Requirements

First, you need to install IBDAC. Simply download the version compatible with your Delphi IDE, run the installer, and follow the prompts. Also, make sure the InterBase server is running and that you know the server address, port, database file path, username, and password. You’ll need these to create a connection.

Create a connection

Now, use the server’s details to set up the TIBCConnection component, or define the parameters in the ConnectString property. Once you’ve set them up, you can connect either at design-time or at runtime. At design-time, you can visually configure the TIBCConnection component in the Delphi IDE using either the TIBCConnection Editor or the Object Inspector. If you opt for runtime connections, you’ll set these properties programmatically in your code before calling the Open method to establish the connection.

Open a connection

After configuring the connection, the next step is to open it so you can start interacting with the database. You have a couple of options: set the Connected property to True in the Object Inspector at design-time, or do it in your code at runtime. You can also simply call the Open method in your code.

Insert data

Let’s say you have a products table defined as follows:

ProductID | ProductName       | Category    | Price
1         | Bluetooth Speaker | Electronics | 49.99
2         | Coffee Maker      | Appliances  | 89.99
3         | Yoga Mat          | Fitness     | 19.99

If you want to add a new product at design-time, you can write your own SQL statement and execute it using the TIBCQuery. Just add the TIBCQuery object to your form. If this is your first one, it’ll automatically be named IBCQuery1.
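The runtime connection route described above might look like the following minimal sketch (the unit name and all server details below are illustrative placeholders, not values from this article; substitute your own):

```pascal
uses
  IBC; // IBDAC unit that declares TIBCConnection

var
  Conn: TIBCConnection;
begin
  Conn := TIBCConnection.Create(nil);
  try
    // Hypothetical connection details; replace with your server's values.
    Conn.Server   := 'localhost';
    Conn.Port     := 3050;
    Conn.Database := 'C:\data\shop.ib';
    Conn.Username := 'SYSDBA';
    Conn.Password := 'masterkey';
    Conn.Open; // same effect as setting Connected := True
    // ... work with the database here ...
    Conn.Close;
  finally
    Conn.Free;
  end;
end;
```

At design-time, the same properties are filled in through the Object Inspector instead.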
Double-clicking the TIBCQuery object opens the SQL editor, where you can write:

```sql
INSERT INTO products VALUES (4, 'Electric Kettle', 'Appliances', 39.99);
```

If you prefer to insert data at runtime, use the Insert, Append, and Post methods of the TIBCQuery and TIBCTable components. Below we’ve used the Append method:

```pascal
var
  IBCQuery1: TIBCQuery;
begin
  IBCQuery1 := TIBCQuery.Create(nil);
  try
    IBCQuery1.Connection := IBCConnection1;
    IBCQuery1.SQL.Text := 'SELECT * FROM products';
    IBCQuery1.Open;
    IBCQuery1.Append;
    IBCQuery1.FieldByName('ProductID').AsInteger := 4;
    IBCQuery1.FieldByName('ProductName').AsString := 'Electric Kettle';
    IBCQuery1.FieldByName('Category').AsString := 'Appliances';
    IBCQuery1.FieldByName('Price').AsFloat := 39.99;
    IBCQuery1.Post;
  finally
    IBCQuery1.Free;
  end;
end;
```

Retrieve data

If you need to retrieve data from a table, the TIBCQuery and TIBCTable components are also your go-to. For example, to fetch all the records from the products table using TIBCQuery, write:

```pascal
IBCQuery1.Connection := IBCConnection1;
IBCQuery1.SQL.Text := 'SELECT * FROM products';
IBCQuery1.Open;
ShowMessage(IntToStr(IBCQuery1.RecordCount));
```

Modify data

Now, suppose you want to update a product from the table.
You can either write a DML statement like:

```pascal
IBCQuery1.SQL.Add('UPDATE products SET ProductName = :ProductName, Price = :Price WHERE ProductID = :ProductID;');
IBCQuery1.ParamByName('ProductID').AsInteger := 4;
IBCQuery1.ParamByName('ProductName').AsString := 'Toaster';
IBCQuery1.ParamByName('Price').AsFloat := 34.55;
IBCQuery1.Execute;
```

Or use the Edit and Post methods of the TIBCQuery and TIBCTable components to skip writing SQL:

```pascal
IBCQuery1.FindKey([4]);
IBCQuery1.Edit;
IBCQuery1.FieldByName('ProductName').AsString := 'Toaster';
IBCQuery1.FieldByName('Price').AsFloat := 34.55;
IBCQuery1.Post;
```

Check [IBDAC’s comprehensive documentation](https://docs.devart.com/ibdac/using_ibdac.htm) to learn about other use cases and get detailed guidance on its functionality.

Conclusion

InterBase provides all the essentials to develop Delphi applications with an embedded database that scales and runs across multiple platforms without hogging your system resources. IBDAC simplifies database integration by giving you direct access to InterBase servers, which lets you create flexible applications faster. [Try IBDAC for free](https://www.devart.com/ibdac/) to skip the headaches of complex configurations and cumbersome Delphi InterBase integrations.

Tags [Delphi InterBase](https://blog.devart.com/tag/delphi-interbase) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
"} {"url": "https://blog.devart.com/delphi-programming-software-libraries-and-components-to-use.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi
DAC](https://blog.devart.com/category/products/delphi-dac) Top 23 Delphi Programming Software, Libraries, and Components to Use in 2025 By [Victoria Shyrokova](https://blog.devart.com/author/victorias) November 6, 2024 [0](https://blog.devart.com/delphi-programming-software-libraries-and-components-to-use.html#respond) 752

You need robust, stable, up-to-date Delphi programming software, libraries, and tools to make your development journey smoother. Without them, you’ll struggle to integrate your native apps with cloud services, design modern UI frameworks, and work with the latest databases. Besides, their wide range of pre-built functionality saves you a lot of time. You can use multiple tools to quickly create user-friendly interfaces that look great and perform well, bind data, and sync your app, and the right DAC components will let you manage your database connections in just a few clicks, without the hassle of a DB client library. To help you pick the best tools for your projects in 2025, we break down the top trending Delphi tools, libraries, and GUI frameworks below.

Table of contents
- Top GUI frameworks and components for Delphi in 2025
- Libraries for Delphi developers
- Components for database connectivity (Delphi Data Access Components)
- Tools to enhance Delphi development
- Visual component libraries
- Debugging and testing
- Utilities for code management
- Third-party integrations
- Open-source tools and libraries
- New and noteworthy in 2025
- Conclusion

Top GUI frameworks and components for Delphi in 2025

Delphi’s RAD environment makes building GUIs pretty easy. It features two primary frameworks: the Visual Component Library (VCL) for Windows applications and FireMonkey (FMX) for cross-platform development. Both include many built-in components, like buttons, text boxes, and tables, so you don’t need to write a lot of code to make advanced interfaces.
It also has a drag-and-drop component palette and visual designer that let you create responsive GUIs for any device in no time. You can use tools like the Object Inspector to handle events and customize component properties. If you end up with a bunch of data-driven components, the Data Modules feature helps you group them quickly to make them easier to find.

FireMonkey (FMX) in RAD Studio

FireMonkey was included in RAD Studio in version XE2, and it added what the environment was clearly missing: cross-platform capabilities. It lets you create user interfaces that run on Windows, macOS, iOS, and Android. There’s also a third-party library called FMX Linux, available for the Enterprise and Architect editions of RAD Studio, that allows your apps to run on Linux platforms. Whether you’re a solo developer or part of a large team, you can use FMX to create native themes and responsive layouts with GPU-accelerated graphics for fluid animations. It also provides easy integration with native platform features, such as sensors, cameras, and push notifications.

DevExpress VCL

DevExpress VCL is a suite of GUI controls for C++Builder and Delphi environments. These controls speed up your workflow and help you build high-performing, Windows-only applications, even if you’re handling complex data sets. A very helpful control in DevExpress VCL is the Data Grid. It lets you display and organize data in all sorts of ways, from filtering and sorting to grouping. Other widely used tools include the Ribbon Control, which you can use to give your applications Microsoft Office-style GUIs, the Chart Control for data visualization via many chart types, and the Scheduler for easy appointment and event management. It also has specialized components like the Pivot Grid for summarizing data and the Skin Library, which comes with tons of built-in themes.
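As a quick illustration of how little code an FMX interface needs, here is a minimal runtime sketch (the form class, control text, and coordinates are hypothetical, not from the article):

```pascal
uses
  FMX.Forms, FMX.StdCtrls, FMX.Types;

procedure TForm1.FormCreate(Sender: TObject);
var
  Btn: TButton;
begin
  // The same TButton compiles unchanged for Windows, macOS, iOS, and Android.
  Btn := TButton.Create(Self);
  Btn.Parent := Self;            // parent the control to the form
  Btn.Text := 'Cross-platform';  // FMX controls use Text rather than Caption
  Btn.Position.X := 16;
  Btn.Position.Y := 16;
end;
```

In everyday work you would drag the same control from the component palette and tweak it in the Object Inspector instead.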
Libraries for Delphi developers

There is a wide range of Delphi libraries that can boost your productivity and make your life easier, whether you need help with data manipulation, user interface design, or network communication. We discuss some of the most relevant ones for 2025 below, but there are lots out there, both open-source and commercial. There are even libraries originally designed for other languages, like C and Python, that can be adapted for use with Delphi.

Jedi Code Library (JCL)

The JEDI Code Library is a set of reliable and well-documented code for Delphi and C++Builder apps. It can be used freely in any open-source or commercial project. The library includes lots of reusable code for text processing, file management, and network connectivity. Basically, it provides a large collection of ready-to-use, tested, and documented code components and tools that you can quickly integrate into your applications, saving you time and work.

Spring4D

Tired of wrestling with spaghetti code that is impossible to test in Delphi? Spring4D is an open-source code library, targeting Delphi XE and above, whose purpose is to keep your code more organized, maintainable, and easier to test. It provides a solid base class library with common types, interface-based collection types, and extensions for reflection. One of its standout features is the dependency injection framework. Instead of hard-coding dependencies between objects, Spring4D lets you define how they work together, making your code cleaner. It also gives you a richer set of collections for dealing with data than the built-in ones in Delphi’s run-time library (RTL). For instance, IEnumerable enables more expressive, functional-style programming; TOrderedDictionary maintains predictable iteration order, which traditional dictionaries don’t guarantee; and TMultiMap allows multiple values for a single key, which isn’t available in Delphi’s RTL.
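To make the collection types above concrete, here is a minimal sketch (the factory calls and the Spring.Collections unit name follow Spring4D’s API; treat the exact overloads and all sample data as assumptions):

```pascal
uses
  Spring.Collections;

var
  Tags: IMultiMap<string, string>;
  Cheap: IEnumerable<Integer>;
begin
  // A multimap keeps every value added under the same key.
  Tags := TCollections.CreateMultiMap<string, string>;
  Tags.Add('fitness', 'Yoga Mat');
  Tags.Add('fitness', 'Dumbbells'); // both values are retained under 'fitness'

  // Functional-style filtering over an IEnumerable<T>.
  Cheap := TCollections.CreateList<Integer>([50, 90, 20])
    .Where(
      function(const Price: Integer): Boolean
      begin
        Result := Price < 60; // lazily keeps only the matching elements
      end);
end;
```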
MARS REST library

If you want to create RESTful web services in Delphi and find solutions like DataSnap limiting or downright impractical, the MARS REST library is worth checking out. It’s compatible with Delphi versions from XE to 10 Seattle, but some features may require FireDAC, so if you’re using the Professional edition, you might miss out on some functionality. MARS makes creating APIs really simple and intuitive. It works with OpenAPI 3 and lets developers quickly define endpoints and handle requests without extensive boilerplate code. It also supports automatic serialization and deserialization of data, making it easier to work with JSON and other data formats. Best of all, it is optimized for performance, enabling rapid request handling and response generation, which is crucial for high-traffic applications. Additionally, it’s lightweight, so it won’t slow down your machine, whether you’re using it for a small app or a big enterprise solution.

Components for database connectivity (Delphi Data Access Components)

Delphi programming software comes with built-in database connectivity components, but they aren’t perfect, especially if your project involves complex database operations. For instance, with versions prior to XE7, you won’t be able to connect directly to modern databases without adding extra drivers. If you’re working with legacy technologies like Delphi’s BDE (Borland Database Engine), which is now deprecated and doesn’t support Unicode, you’ll have to deal with performance issues. Transaction management can also be a bit tricky: if you’re trying to process multiple related database updates without a reliable DAC, you may face errors with rollbacks and auto-commit settings. [Delphi Data Access Components](https://www.devart.com/dac.html) is an easy-to-use solution that connects your cross-platform apps to most popular databases and cloud services.
It comes with a number of specialized libraries: ODAC for native connectivity to Oracle, EntityDAC for quick object-relational mapping, and VirtualDAC for creating in-memory data storage capable of executing SQL queries. It also includes UniDAC, which lets you connect to multiple databases through one unified interface. These components provide essential functionality like advanced data access methods and support for complex data types without needing fine-tuning. In addition, DAC is regularly updated to work with the latest IDE versions and editions, including Community Edition: Embarcadero RAD Studio 12 Athens, Delphi 12 Athens, C++Builder 12 Athens, and Lazarus (and FPC). It is also compatible with previous IDE versions going back to Delphi 6 and C++Builder 6.

UniDAC

UniDAC is a versatile library that lets you switch between different database systems without having to rewrite significant portions of your code; all you need to do is change one connection option. It’s a high-performance solution with low memory consumption, especially [compared to its alternatives](https://blog.devart.com/unidac-vs-firedac-performance-and-memory-consumption-comparison.html). It works with more than 25 databases and over 13 major cloud services, including Oracle, MySQL, PostgreSQL, InterBase, and Firebird, as well as Salesforce, SugarCRM, Microsoft SQL Azure, and Adobe Commerce (Magento). You can connect directly to these platforms from a single interface, using TCP/IP or static linking. If you’re curious about how UniDAC can simplify your work, you can try it for free. You’ll get 60 days to explore its features and see how it can improve your project’s performance and scalability.

FireDAC

FireDAC prioritizes access to various widely used databases, like InterBase, PostgreSQL, and DB2. Its architecture is designed to deliver connections using native drivers.
While UniDAC’s Direct Mode guarantees fast access to all databases, with FireDAC you’ll have to deal with more steps and configuration; for example, you can’t connect your app to Oracle without installing the Oracle Client first. Plus, it has a narrower range of databases, cloud services, and platform compatibility. That being said, it’s still a good option for solo developers who want reliable performance and efficient data handling, especially if you don’t need flexibility in your database connections.

Tools to enhance Delphi development

GExperts

GExperts is a free and open-source set of IDE tools that acts as an extension for your Delphi programming software. Once you download and install it, you’ll get a lot of handy tools, including a Code Snippet Librarian that helps you organize and access your code snippets without the hassle of digging through files. There’s also a Grep Search and Replace tool that lets you search through ANSI, UTF-8, and UTF-16 text files using flexible regular expressions. In addition, GExperts puts a heavy focus on improving your coding efficiency with features like multiline editor tabs to enhance navigation and hot tracking for easy tab management.

CnPack IDE Wizards

Also known as CnWizards, this free plug-in suite provides a robust set of code refactoring tools for Delphi, such as automatic code formatting, intelligent code completion, and the ability to rename variables and methods easily. What’s more, it brings a bunch of IDE enhancements that help minimize interruptions and keep your workflow fluid and productive. Features include customizable templates for code snippets, a powerful search tool for quickly locating definitions, and a project management interface.

Visual component libraries

TMS Component Pack

TMS Component Pack, now succeeded by the TMS VCL UI Pack, is a collection of UI controls for Windows-only apps. It goes beyond basic buttons and text boxes.
Think powerful grid controls for managing complex data, intuitive scheduling components for visualizing tasks and appointments, dynamic frame views for creating responsive layouts, and a whole lot more. The grid components, like the AdvStringGrid, are well known for their wealth of built-in functionality, which helps Delphi developers present and manage data easily using automatic column widths and enhanced Excel export capabilities.

Raize Components

Unlike the TMS Component Pack, Raize Components emphasizes ease of use and integration with the VCL framework. This allows you to build attractive, native interfaces for your Windows apps with little work. You’ll get everything you need: basic elements like checkboxes, buttons, and text boxes, as well as complex controls like grids, tree views, and custom editors. Given its deep integration with VCL, these components fully support VCL Styles, which means you can adapt your apps to various themes effortlessly. Moreover, they are responsive to changes in the VCL environment, ensuring you can take advantage of RAD Studio’s latest features.

Woll2Woll components for GUI

Woll2Woll specializes in advanced grid and data-aware controls for building engaging, feature-rich UIs. Their toolset is pretty comprehensive, with a focus on data visualization and interactivity. Their flagship product is InfoPower, which lets you build highly customizable grids able to embed control items (like buttons and images) directly into grid cells. Their 1stClass buttons integrate easily into the InfoPower grid, giving you clickable components for each record. The grid also supports features like hierarchical column management, dynamic row coloring for improved readability, and incremental search capabilities to make your users’ experience better.

Steema components for charts

Steema focuses on delivering charting tools that can handle complex datasets across multiple platforms.
The suite provides all kinds of functionality, like 3D chart building, support for real-time data updates, and extensive customization options for each chart type. You can access numerous chart types from their TeeChart tool: line, bar, pie, and even maps and gauges. Besides, it comes with specialized controls like the TChartEditor for customizing chart properties on the fly and the TDBChart for integrating with database components. You can also add interactive elements such as tooltips, legends, and zooming capabilities in a snap.

Debugging and testing

AQtime Pro

AQtime Pro is a tool for profiling and debugging Delphi, C, C++, .NET, and Java code. It’s great for finding memory leaks, performance bottlenecks, and gaps in code coverage, turning these into actionable insights that make testing much easier. On top of that, you can use its reporting system to optimize the performance of your app. It lets you drill down to the root causes of issues quickly, which saves you from unnecessary routines that could lead to hidden bugs.

DUnitX

DUnitX is specifically designed for Delphi 2010 and later, and it’s all about streamlining the testing process. Though it keeps only limited backward compatibility with DUnit test classes, it can be used with Windows (32-bit and 64-bit) and macOS compilers. Some of the more notable features are attribute-based testing, a rich assert class, and setup and teardown methods for both individual tests and entire test fixtures. It also supports XML logging and works great with ContinuaCI, GitLab CI, and other popular CI servers. Now, what does the integration process with your Delphi IDE look like? Pretty straightforward: you just use the wizard to create new tests. You also get a console-based runner for executing tests, quite handy if you run automated testing.
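The attribute-based style mentioned above looks roughly like this minimal fixture sketch (the class and the assertion are invented examples; the attributes, Assert class, and registration call are DUnitX’s documented API):

```pascal
unit ProductTests;

interface

uses
  DUnitX.TestFramework;

type
  [TestFixture]                  // marks the class for test discovery
  TProductTests = class
  public
    [Setup]
    procedure Setup;             // runs before every test in the fixture
    [Test]
    procedure PriceIsPositive;   // each [Test] method becomes one test case
  end;

implementation

procedure TProductTests.Setup;
begin
  // open connections, build fixtures, etc.
end;

procedure TProductTests.PriceIsPositive;
begin
  Assert.IsTrue(39.99 > 0, 'price must be positive');
end;

initialization
  TDUnitX.RegisterTestFixture(TProductTests);
end.
```

The console runner then discovers and executes every registered fixture, which is what makes DUnitX easy to wire into a CI server.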
Utilities for code management

ModelMaker Code Explorer

ModelMaker’s Code Explorer lets you view classes, methods, and properties with the help of two filtered views, much like Windows Explorer. There’s a Class Browser that displays inheritance relationships and member details, while the Member search bar and Navigation history help you quickly find what you’re looking for. In short, it makes improving your existing code much easier. You can add, modify, and remove classes and members in just a few clicks or through drag-and-drop. Besides, there are a few powerful refactoring tools, such as IntelliReplace, that help you rename things and extract methods directly in the IDE, which simplifies the process of updating your code.

DelphiAST

DelphiAST is an abstract syntax tree (AST) parser that makes working with Delphi code a lot easier. It takes your Delphi source file and transforms it into a tree-like structure that represents your code’s syntax. It does this on a per-unit basis, returning an AST with important nodes like units, interfaces, methods, and parameters along with their types, but without going down into the details of a symbol table.

Third-party integrations

FastReport VCL

FastReport VCL allows you to create a wide range of reports for your Delphi apps, from simple lists to complex master-detail layouts. Its band-oriented design makes it possible to design reports in literally a few minutes. You can even group data or create multi-column reports with just a few clicks. But the best thing about it? The design-time report creation feature. You can visually design and modify report templates using a very modern editor, rich in tools like drag-and-drop and a comprehensive set of formatting options.

IntraWeb

Want to create web-based apps in Delphi but deploy them as standard HTML and JavaScript without using a bunch of plugins? IntraWeb lets you do just that.
It has a visual component model that simplifies the design process: just drag and drop components like you would in desktop development. Plus, it contains additional functionality for things like session management and state preservation to make the user experience seamless. What’s more, it has built-in AJAX, so you can update parts of your web page without reloading the entire thing.

Open-source tools and libraries

SynEdit

SynEdit is a text editor control that gives your Delphi coding a personal touch. It highlights your code’s syntax to make it easier to read, and it suggests possible completions as you type, which really helps speed things up. You can also set the shortcuts that work best for you, as well as use drag-and-drop to move text around.

ICS (Internet Component Suite)

The Internet Component Suite is a handy toolkit that simplifies building internet applications in Delphi. It has a bunch of components that take care of the networking details for you. Whether you’re making a simple web client or a more complex server application, you’ll find what you need. It supports many different protocols, such as HTTP, FTP, SMTP, POP3, and IMAP, so you can work on a great variety of projects, from email clients and file transfer tools to web browsers.

New and noteworthy in 2025

DelphiMVCFramework

DelphiMVCFramework is one of the most used frameworks for developing Delphi web APIs. It provides a structured approach that helps you keep your code clean and maintainable, which is necessary when developing any kind of modern web app. Besides, it’s very user-friendly and has great documentation, so you can be up and running in no time. It’s quite easy to build RESTful APIs and JSON-RPC services with DMVCFramework. It comes with a good amount of built-in components that handle most of the work for you, such as routing requests and managing responses.
Besides, it works with HTTP and JSON-RPC, and has automatic object remoting, built-in session management, and customizable routing.

Conclusion

It’s easy to get stuck using only old-school tools and libraries when working with Delphi programming software. But switching to the latest Delphi tools can save you a lot of headaches: compatibility problems, time wasted debugging old code, and missing features that would make your apps great. Check out the tools listed above and experiment with those that fit your project. To get started, [try UniDAC for free](https://www.devart.com/dac.html) and experience firsthand how it can simplify your database interactions and enhance your Delphi development experience.

Tags [dac](https://blog.devart.com/tag/dac) [delphi components](https://blog.devart.com/tag/delphi-components) [delphi data access components](https://blog.devart.com/tag/delphi-data-access-components) [unidac](https://blog.devart.com/tag/unidac) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
"} {"url": "https://blog.devart.com/delphi-xe2-and-runtime-error-231-on-mac-os-x.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How
To](https://blog.devart.com/category/how-to) Delphi XE2 FireMonkey HD Applications Raise Runtime Error 231 on Mac OS X By [DAC Team](https://blog.devart.com/author/dac) October 5, 2011 [6](https://blog.devart.com/delphi-xe2-and-runtime-error-231-on-mac-os-x.html#comments) 6227

Many users have encountered a problem when running FireMonkey HD applications on Mac OS X without a 3D hardware HAL. When such an application is run, it either freezes or produces the following error:

Runtime error 231 at 000169AD

We have researched this problem and found a solution from Embarcadero. If a Mac OS X computer has no 3D hardware HAL, you need to set the global variable FMX.Types.GlobalUseHWEffects to False. Example:

```pascal
begin
  FMX.Types.GlobalUseHWEffects := False;

  Application.Initialize;
  Application.CreateForm(TForm1, Form1);
  Application.Run;
end;
```

However, this solution is not a panacea. Sometimes HD applications continue to freeze or raise the error even with FMX.Types.GlobalUseHWEffects set to False. In this case, you need to modify the FMX.Filter.pas unit. Do the following:

Copy the FMX.Filter.pas and FMX.Defines.inc files from the $(BDS)\source\fmx folder to your project folder.
In the files copied to your project folder, replace the code of the FilterByName and FilterClassByName functions with the following:

```pascal
function FilterByName(const AName: string): TFilter;
var
  i: Integer;
begin
  Result := nil;
  if not GlobalUseHWEffects or (Filters = nil) then // <-- change this line
    Exit;
  for i := 0 to Filters.Count - 1 do
    if CompareText(TFilterClass(Filters.Objects[i]).FilterAttr.Name, AName) = 0 then
    begin
      Result := TFilterClass(Filters.Objects[i]).Create;
      Exit;
    end;
end;

function FilterClassByName(const AName: string): TFilterClass;
var
  i: Integer;
begin
  Result := nil;
  if not GlobalUseHWEffects or (Filters = nil) then // <-- change this line
    Exit;
  for i := 0 to Filters.Count - 1 do
    if CompareText(TFilterClass(Filters.Objects[i]).FilterAttr.Name, AName) = 0 then
    begin
      Result := TFilterClass(Filters.Objects[i]);
      Exit;
    end;
end;
```

Add the FMX.Filter unit to the uses section of your project:

```pascal
program Project1;

uses
  FMX.Forms,
  FMX.Types,
  FMX.Filter, // <-- add unit
  Unit1 in 'Unit1.pas' {Form1};
```

After these modifications, your applications will run successfully on Mac OS X without a 3D hardware HAL; however, 3D effects will not be available. Updated: This trick is not required for RAD Studio XE2 with Update 3 and higher.
Tags [delphi](https://blog.devart.com/tag/delphi) [macos development](https://blog.devart.com/tag/macos-development) [rad studio](https://blog.devart.com/tag/rad-studio) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [DAC Team](https://blog.devart.com/author/dac) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdelphi-xe2-and-runtime-error-231-on-mac-os-x.html) [Twitter](https://twitter.com/intent/tweet?text=Delphi+XE2+FireMonkey+HD+Applications+Raise+Runtime+Error+231+on+Mac+OS+X&url=https%3A%2F%2Fblog.devart.com%2Fdelphi-xe2-and-runtime-error-231-on-mac-os-x.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/delphi-xe2-and-runtime-error-231-on-mac-os-x.html&title=Delphi+XE2+FireMonkey+HD+Applications+Raise+Runtime+Error+231+on+Mac+OS+X) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/delphi-xe2-and-runtime-error-231-on-mac-os-x.html&title=Delphi+XE2+FireMonkey+HD+Applications+Raise+Runtime+Error+231+on+Mac+OS+X) [Copy URL](https://blog.devart.com/delphi-xe2-and-runtime-error-231-on-mac-os-x.html) RELATED ARTICLES [How To](https://blog.devart.com/category/how-to) [Database Protection Guide: Best Practices for Ensuring Database Security](https://blog.devart.com/database-security.html) May 5, 2025 [How To](https://blog.devart.com/category/how-to) [Database Normalization in SQL: Key Steps, Benefits, and Examples](https://blog.devart.com/database-normalization.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Check Constraints for Data Integrity](https://blog.devart.com/how-to-use-sql-check-constraints.html) May 5, 2025 6 COMMENTS The Best Photo Software June 3, 2013\t\t\t\t\t\t At\t\t\t\t\t\t 6:21 pm You’re in point of fact a just right webmaster. The website loading pace is amazing. It kind of feels that you are doing any distinctive trick. Also, The contents are masterpiece. 
Softtouch July 24, 2017 At 6:52 am Delphi Tokyo does the same! And the filter.pas is completely different code now, no idea what to fix. DAC Team August 1, 2017 At 10:10 am Hello, Softtouch! We did not have similar problems on Delphi Tokyo, so it’s difficult for us to recommend something. If we find a solution to your issue, we will definitely write how to solve it. Comments are closed."} {"url": "https://blog.devart.com/designing-views-with-query-builder.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Designing Views with MySQL Query Builder By [dbForge Team](https://blog.devart.com/author/dbforge) May 22, 2009 [0](https://blog.devart.com/designing-views-with-query-builder.html#respond) 3463 If you often need to create and modify views during MySQL development, you will like the dbForge Studio for MySQL [in-place query editing](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html) feature. It integrates the powerful Query Builder tool with the view editor, without annoying copy/paste. Suppose you have previously created a view with some SELECT statement. To view and design this statement with Query Builder, perform these steps: Open the view editor. Right-click the SELECT statement. In the pop-up menu, select the ‘Design SQL…’ command.
In the opened Query Builder, redesign your SELECT statement (see the picture below). After you click OK, the new statement is automatically pasted into the view editor. If you need to design a new SELECT statement while creating a view, use the ‘Insert SQL…’ command. Pop-up Query Builder Tags [MySQL](https://blog.devart.com/tag/mysql) [query builder](https://blog.devart.com/tag/query-builder) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-announces-python-connectors.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Python Connectors](https://blog.devart.com/category/products/python-connectors) [What’s
New](https://blog.devart.com/category/whats-new) Devart Announces Python Connectors By [DAC Team](https://blog.devart.com/author/dac) December 8, 2023 [0](https://blog.devart.com/devart-announces-python-connectors.html#respond) 1712 We are happy to announce the launch of our new product line, Devart Python connectors. After months of hard work and dedication, we released reliable connectivity solutions for accessing database servers and cloud services from Python applications. The connectors fully implement the Python Database API Specification v2.0 and are available as wheel packages for Windows, macOS, and Linux. This release includes connectors for the following data sources: Database connectors: [ASE](https://www.devart.com/python/ase/) , [dBase](https://www.devart.com/python/xbase/) , [Firebird](https://www.devart.com/python/firebird/) , [Google BigQuery](https://www.devart.com/python/bigquery/) , [InterBase](https://www.devart.com/python/interbase/) , [MariaDB](https://www.devart.com/python/mysql/) , [MySQL](https://www.devart.com/python/mysql/) , [Oracle](https://www.devart.com/python/oracle/) , [PostgreSQL](https://www.devart.com/python/postgresql/) , [SQLite](https://www.devart.com/python/sqlite/) , [SQL Server](https://www.devart.com/python/sqlserver/) , [VisualFoxPro](https://www.devart.com/python/xbase/) , and various [xBase](https://www.devart.com/python/xbase/) databases. Cloud connectors: [BigCommerce](https://www.devart.com/python/bigcommerce/) , [Dynamics 365](https://www.devart.com/python/dynamics/) , [HubSpot](https://www.devart.com/python/hubspot/) , [NetSuite](https://www.devart.com/python/netsuite/) , [QuickBooks](https://www.devart.com/python/quickbooks/) , [Salesforce](https://www.devart.com/python/salesforce/) , [Zoho CRM](https://www.devart.com/python/zohocrm/) . The following sections cover the broad set of capabilities in our Python connectors. 
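Because every connector implements the Python DB API 2.0 specification, they all share the same connect/cursor/execute surface. The sketch below illustrates that common pattern; it uses the standard library's sqlite3 module (itself a DB API 2.0 driver) as a stand-in, since the Devart modules shown later in this post differ only in module name and connection parameters:

```python
import sqlite3  # stdlib stand-in for any DB API 2.0 connector module

# Every DB API 2.0 module exposes a connect() function; with a Devart
# connector, only this line and its parameters would change.
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()

# DDL and parameterized DML go through the same cursor interface.
cursor.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
cursor.execute("INSERT INTO accounts VALUES (?, ?)", (1, "Acme"))

cursor.execute("SELECT name FROM accounts WHERE id = ?", (1,))
row = cursor.fetchone()
print(row[0])  # -> Acme

connection.close()
```

The table and values here are invented for illustration; the point is that code written against one DB API 2.0 driver ports to another with minimal changes.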
Direct Connection The solution lets you connect to relational and non-relational databases directly through TCP/IP, eliminating the need for database client libraries. A direct connection also increases the speed of data transmission between a Python application and the database server. For example, to connect to Oracle Database, you don’t have to install Oracle Instant Client on the workstation—enable the Direct mode as in the following code snippet:

```python
import devart.oracle

connection = devart.oracle.connect(
    Direct=True,
    Host="dboracle",
    Port=1521,
    ServiceName="orcl1120",
    UserName="scott",
    Password="tiger"
)
```

Cloud data sources are accessed through HTTPS—you don’t need additional client software on the user workstation. Standard SQL Syntax in Cloud Connectors The connectors conform to the Python DB API 2.0 specification and fully support the ANSI SQL-92 standard. You can execute SQL statements against your cloud data just like you would normally work with relational databases, including complex queries containing different types of JOINs, ORDER BY and GROUP BY clauses, and aggregate functions. Simple queries are directly converted to API calls and executed on the cloud service side. Complex queries are transformed into simpler queries, which are then converted to API calls. The embedded SQL engine then processes the results in the local cache and applies advanced SQL features from the original complex query. The following example shows a complex query run against Microsoft Dynamics 365.
```python
import devart.dynamics365

connection = devart.dynamics365.connect(
    Server="https://mydinamics.crm4.dynamics.com/",
    UserId="user@gmail.com",
    Password="my_password"
)

sql = """
Select a.customersizecode,
       a.territorycode,
       a.name,
       a.accountnumber,
       at.isdocument,
       at.businessunitname
From account a
Left Join (Select ant.objectid,
                  ant.subject,
                  ant.isdocument,
                  b.name as businessunitname
           From annotation ant
           Left Join (Select * From businessunit) b
             On ant.owningbusinessunit = b.businessunitid
          ) at
  On at.objectid = a.accountid
Where
    a.exchangerate = 1 And
    a.websiteurl Is not Null
Order By
    a.revenue,
    a.versionnumber,
    a.statecode
"""

cursor = connection.cursor()
cursor.execute(sql)
```

OAuth 2.0 Authorization in Cloud Connectors You will enjoy the ease of OAuth 2.0 authorization to cloud services that support this authorization framework: Salesforce, BigCommerce, Zoho CRM, and others.

```python
import devart.salesforce

# Interactive sign-in returns a dictionary with the obtained tokens
tokens = devart.salesforce.signin(host="login.salesforce.com")

connection = devart.salesforce.connect(
    Authentication="OAuth",
    Host="login.salesforce.com",
    RefreshToken=tokens["Refresh Token"]
)
```

Connection Pooling Connection pooling allows you to reduce the cost of opening and closing connections for each operation by maintaining a pool of pre-established connections to a database or cloud service. Connection pooling configuration is quite flexible, but if you want to enable pooling with default settings, you can do that with one line of code:

```python
devart.postgresql.connection_pool.enabled = True
```

SQLite Database Encryption The Python Connector for SQLite supports native SQLite database encryption without requiring you to purchase an encryption extension for SQLite. This functionality is available in the Direct mode, which uses a statically linked SQLite library.
You can choose from the following encryption algorithms to protect your data from unauthorized access: Triple DES, Blowfish, AES-128, AES-192, AES-256, CAST-128, and RC4. To encrypt a database, execute the following PRAGMA statements:

```python
connection = devart.sqlite.connect("Direct=True;Database=your_database;")
cursor = connection.cursor()
cursor.execute("PRAGMA ENCRYPTION=AES256")
cursor.execute("PRAGMA REKEY='your_key'")
```

To connect to the encrypted database, specify the encryption key and algorithm in the connection string:

```python
connection = devart.sqlite.connect(
    "Direct=True;Database=your_database;EncryptionAlgorithm=AES256;EncryptionKey=your_key"
)
```

xBase Compatibility The Python Connector for xBase supports the xBase family of databases, covering the following database formats: Visual FoxPro, FoxPro 2, dBase III – dBase 7, Clipper, Codebase, and HiPer-Six. Although dBASE and derived databases get few new users, many companies established in the last century still maintain and actively use them daily. The connector can retrieve and update data in various xBase databases. If your company wants to migrate its data to a more modern database, the connector can help implement the migration routines. Our connector provides parameters for ignoring corrupted data and metadata errors in DBF tables. Corrupted data is skipped, while intact data is properly retrieved, which lets you access data you thought was long lost. The solution supports the standard ANSI SQL-92 syntax and offers an internal data indexing mechanism that is far more efficient than native DBF indexes for complex queries. Support for ETL Tools The connectors are compatible with popular Python ETL tools: petl, pandas, and SQLAlchemy. Among these, only petl is a general-purpose Python package for extracting, transforming, and loading data tables, but complex real-world ETL scenarios often involve all three tools.
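Whatever tool sits on top, the flow these connectors support is the classic extract-transform-load loop. A minimal stdlib-only sketch of that loop, again using sqlite3 as a hypothetical stand-in for a connector (table names, column names, and sample values are invented for illustration):

```python
import sqlite3

# Source and target share one in-memory database here; in a real ETL
# job they would typically be two different DB API connections.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT)")
cur.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                [(1, "10.50"), (2, "bad"), (3, "7.25")])
cur.execute("CREATE TABLE clean_orders (id INTEGER, amount REAL)")

# Extract rows, transform them (skip unparseable amounts), load the rest.
cur.execute("SELECT id, amount FROM raw_orders")
for order_id, amount in cur.fetchall():
    try:
        value = float(amount)
    except ValueError:
        continue  # drop corrupted rows, much as the xBase connector can
    cur.execute("INSERT INTO clean_orders VALUES (?, ?)", (order_id, value))

cur.execute("SELECT COUNT(*) FROM clean_orders")
print(cur.fetchone()[0])  # -> 2
```

petl, pandas, and SQLAlchemy each wrap some portion of this loop; the underlying connector only has to supply the DB API connection.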
Tags [Python](https://blog.devart.com/tag/python) [python connectors](https://blog.devart.com/tag/python-connectors) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/devart-at-sqlsaturday-290-kiev-2014.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart at SQLSaturday #290 – Kiev 2014 By [dbForge Team](https://blog.devart.com/author/dbforge) September 24,
2014 [0](https://blog.devart.com/devart-at-sqlsaturday-290-kiev-2014.html#respond) 3008 Devart sponsored [SQLSaturday #290](https://www.sqlsaturday.com/290/eventhome.aspx), which took place on September 20, 2014 in Kiev, Ukraine. About Kiev The Ukrainian capital is a city that is more than 15 centuries old, yet stays remarkably young. Being one of the most historically and culturally significant centers of Eastern Europe, Kiev surprises with its ancient beauty and unforgettable atmosphere. About SQLSaturday #290 With informative reports and an attendance of about 200 participants, this edition of SQLSaturday turned into a perfect platform for gaining new knowledge about SQL Server and related technologies, meeting database experts, and communicating with top SQL Server specialists from all over the world. In the course of the event, Devart presented the updated product line of [SQL Server Tools](https://www.devart.com/dbforge/sql/). Devart Raffle We also held a raffle for the participants of SQLSaturday. The winners were awarded exclusive prizes, including 1 FREE license of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), 1 FREE license of [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), and 2 T-shirts. We would like to thank the organizers and the speakers of the conference for the highly informative and productive event. We do hope to attend it next time!
Tags [devart](https://blog.devart.com/tag/devart) [sql saturday](https://blog.devart.com/tag/sql-saturday) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-attended-and-sponsored-sql-day-2016-annual-conference-of-polish-sql-server-user-group-association.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart attended and sponsored SQL Day 2016 — Annual Conference of Polish SQL Server User Group Association By [Andrey
Langovoy](https://blog.devart.com/author/andrey-langovoy) May 24, 2016 [0](https://blog.devart.com/devart-attended-and-sponsored-sql-day-2016-annual-conference-of-polish-sql-server-user-group-association.html#respond) 3769 Devart was excited to participate in and sponsor SQL Day 2016, which took place on May 16 – 18 in Wrocław, Poland. The event ended less than a week ago, and we are already looking forward to what’s next in 2017! [SQLDay](https://sqlday.pl/en/) is the largest Microsoft data platform summit in East Central Europe. This year, we had a great chance to listen to the biggest stars of the global SQL world: Grant Fritchey, Dejan Sarka, Mihail Mateev, Uwe Ricken, Aaron Bertrand, and many others. We would like to thank the speakers for the wonderful conference! Our special thanks to Tomasz Libera, Marcin Szeliga, Łukasz Grala, Damian Widera, Roman Czarko-Wasiutycz, Kamil Nowiński, and all other team members for the assistance and warm welcome. We had a great chance to meet new people, exchange views, and present our software products to a wide audience.
Sincerely, Devart Tags [devart](https://blog.devart.com/tag/devart) [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart "} {"url": "https://blog.devart.com/devart-attended-the-second-meeting-of-lviv-sql-server-user-group-as-a-sponsor.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart Attended the Second Meeting of Lviv SQL Server User Group as a Sponsor By [dbForge Team](https://blog.devart.com/author/dbforge) October 28, 2014 [0](https://blog.devart.com/devart-attended-the-second-meeting-of-lviv-sql-server-user-group-as-a-sponsor.html#respond) 2636 Devart gladly participated as a sponsor in the second meeting of the [Lviv SQL Server User Group](https://lvivsqlug.pass.org/default.aspx?EventID=1940), which took place on October 25, 2014. About Lviv Lviv is a cultural and economic center of western Ukraine. Its rich history and many old buildings make Lviv one of the most interesting and exciting places in Ukraine. About the Lviv SQL Server User Group Meeting The [Lviv SQL Server User Group](https://lvivsqlug.pass.org/default.aspx) has the status of a PASS Local Chapter (Professional Association for SQL Server) and is part of PASS’s global effort to support MS SQL Server professionals and connect the community around the world. This meeting was devoted to topics such as Database Version Control and Microsoft Business Intelligence. In the course of the event, Devart presented the product line of dbForge Tools for SQL Server. Devart Raffle We also held a raffle for the participants of the meeting. The winners were awarded exclusive prizes, including 1 FREE license of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) and 1 FREE license of [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/). We would like to thank the organizers and the speakers of the conference for the highly informative and productive event.
We do hope to attend it next time! Tags [devart](https://blog.devart.com/tag/devart) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-becomes-the-winner-of-2020-cv-magazine-technology-innovator-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL
Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Devart Becomes the Winner of 2020 CV Magazine Technology Innovator Awards By [dbForge Team](https://blog.devart.com/author/dbforge) August 20, 2020 [0](https://blog.devart.com/devart-becomes-the-winner-of-2020-cv-magazine-technology-innovator-awards.html#respond) 2531 For more than 15 years, Devart has been developing database tools and data-related solutions. One of its most popular product lines, dbForge, is designed to facilitate and simplify database development, management, and administration for SQL Server, MySQL, Oracle, and PostgreSQL. All dbForge tools combine powerful productivity with exceptional usability. Once again, these qualities have been recognized. This year, the Corporate Vision magazine named Devart and dbForge winners of its annual awards in the [Technology Innovator Awards](https://www.corporatevision-news.com/?s=Devart) nomination. Corporate Vision is a monthly digital magazine focused on the latest insights from more than 155,000 business leaders and experts. The fact that it has proclaimed Devart the Best Server Development Software Solutions Provider 2020 inspires our best minds to create and enhance powerful solutions for more than 500,000 users around the world. The dbForge product line has always been one of Devart’s household names.
We are genuinely proud of these tools and the possibilities they offer:
• A rich toolset that covers every stage of work with databases: development, management, administration, automation, documenting, and reporting
• High performance and stability that deliver efficiency and productivity at work
• A clean and intuitive GUI that helps database developers work with ease and pleasure
These are just a few reasons why so many customers worldwide choose [database tools](https://www.devart.com/dbforge/edge/) by dbForge. You can join them easily – [get a free 30-day trial](https://www.devart.com/dbforge/edge/download.html) to test the capabilities of any tool you like! Tags [Awards](https://blog.devart.com/tag/awards) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-black-friday-deals.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) Save Up to 50% with Devart Black Friday Deals! By [dbForge Team](https://blog.devart.com/author/dbforge) November 23, 2018 [0](https://blog.devart.com/devart-black-friday-deals.html#respond) 5439 We are delighted to announce that the time to get Devart products at fantastic discounts has come! On the occasion of Black Friday, Cyber Monday, and the upcoming holiday season, we are glad to treat our customers to up to 50% off new Devart licenses and edition upgrades. To get your discount, please visit our [Black Friday](https://www.devart.com/blackfriday.html) page, choose a product from the list, and apply the coupon code during checkout! The Devart team cordially wishes you a great shopping experience and unforgettable holidays!
Tags [devart](https://blog.devart.com/tag/devart) [discounts](https://blog.devart.com/tag/discounts) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-dbforge-product-line-and-securebridge-join-the-2024-dbta-readers-choice-awards-contest.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle
Tools](https://blog.devart.com/category/products/oracle-tools) Devart dbForge Product Line and SecureBridge Join the 2024 DBTA Readers’ Choice Awards Contest By [Victoria Shyrokova](https://blog.devart.com/author/victorias) May 14, 2024 [0](https://blog.devart.com/devart-dbforge-product-line-and-securebridge-join-the-2024-dbta-readers-choice-awards-contest.html#respond) 1216 We are thrilled to announce that Devart products are participating in the DBTA Readers’ Choice Awards 2024, a yearly worldwide contest held by DBTA magazine that features the best solutions for database connectivity, development, and administration. The Devart team works non-stop to provide you with best-in-class products, and we look forward to seeing them win this year. Check the list of our solutions and make sure to support them! Devart nominees dbForge Studio This year, we are entering the contest with our stellar [Devart dbForge Studio](https://www.devart.com/dbforge/edge/), nominated in the Best Database Performance Solution, Best Database Development Solution, Best DBA Solution, and Best Database Backup Solution categories. With four universal database IDEs under the hood, the dbForge Studio bundle provides accurate code completion features to speed up workflows, helps optimize queries for top performance, and assists with routine database administration tasks to save time. It is a full-spectrum toolset for everyone working with databases, contributing to database security, administration, maintenance, and monitoring efficiency. With over 200k users already relying on it to work with SQL Server, MySQL and MariaDB, Oracle, and PostgreSQL, we believe this versatile kit is fully capable of ranking among the top tools in several categories and attracting a wider community of database administrators and developers.
dbForge Edge brings together four powerful IDEs: [Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), [Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), and [Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), so you get everything at once at a reasonable price. If you are familiar with any of the IDEs in the bundle, we encourage you to vote for it. Haven’t tried dbForge Edge yet? Download its [free 30-day trial version](https://www.devart.com/dbforge/edge/download.html) to get started! SQL Complete [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) has become a go-to solution for database developers and administrators who work solely with SQL databases. The tool boasts lightning-fast, accurate SQL code completion that can double the speed of writing queries. With its help, you can easily look up information on database objects, debug code to keep issues to a minimum, beautify it with the SQL Formatter, and refactor it for better readability. SQL Complete has been nominated in the Best DBA Solution and Best Database Development Solution categories, and we are thrilled to see its success. SecureBridge [SecureBridge](https://www.devart.com/sbridge/) ensures data safety on unsecured networks by making it easy to establish a secure connection and safeguard TCP traffic with the SSH and SSL protocols and Cryptographic Message Syntax. Compatible with data access components, SecureBridge guards your connectivity, preventing modification and interception of traffic on untrusted networks. Readers listed it among the top three data security solutions last year, and this year it has entered the DBTA Readers’ Choice Awards contest in the Best Data Security Solution category.
Contest duration and results The DBTA Readers’ Choice Awards 2024 contest will be open through Monday, June 3, 2024, so hurry up and vote for your favorite solutions in the different categories. Support your favorite database development, administration, and security tools! [Vote for Devart tools](https://www.dbta.com/Readers-Choice-Awards) With so many users already recognizing Devart dbForge Studio and our connectivity tools as essential for database work, we are sure even more people will try them out and appreciate the benefits of secure database management, administration, development, and connectivity. The top vote-getters will be featured in a special section on the DBTA website and in the August 2024 issue of Database Trends and Applications magazine, so be sure to support your favorites in this contest! Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [securebridge](https://blog.devart.com/tag/securebridge) [SQL Server](https://blog.devart.com/tag/sql-server) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
"} {"url": "https://blog.devart.com/devart-delivers-performance-leading-sql-comparison-tool-dbforge-data-compare-2-00.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL 
Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Meet Performance-Leading SQL Comparison Tool By [dbForge Team](https://blog.devart.com/author/dbforge) April 29, 2010 [0](https://blog.devart.com/devart-delivers-performance-leading-sql-comparison-tool-dbforge-data-compare-2-00.html#respond) 3332 Devart has announced the release of a new version of its professional tool for [SQL Server data comparison and synchronization](https://www.devart.com/dbforge/sql/datacompare/) – dbForge Data Compare 2.00. Committed to supporting the needs of database professionals, Devart now delivers dbForge Data Compare for SQL Server 2.00, which combines three times faster data comparison and synchronization with the lowest price among comparable tools. Together, these improvements guarantee greater satisfaction and a smoother experience for any SQL comparison and synchronization task. The highlights of dbForge Data Compare for SQL Server 2.00 include: Higher performance in practice Having passed multiple performance tests on large databases with billions of records and extra-large BLOB data, dbForge Data Compare 2.00 showed marked performance improvement over the previous version and a clear lead over competitors. Users will now enjoy 3 times faster comparison and 2 times faster synchronization. Support for native SQL Server backups You can now select native SQL Server backups as a metadata source, which gives you more freedom when updating data in SQL Server databases. Besides traditional live databases, any combination of databases and backups can now be compared.
Data comparison and synchronization via the command line dbForge Data Compare 2.00 can be used with the Windows Task Scheduler to compare and synchronize data automatically via the command line and check the results at the desired time without constant supervision. Extended data comparison and synchronization options Users can fine-tune comparison and synchronization, as dbForge Data Compare 2.00 offers a wider choice of options: auto-mapping, comparison, and display options are now at their disposal. Generating comparison and synchronization reports All data differences between compared databases can now be saved and turned into clear, comprehensive reports automatically – a feature that can save precious hours usually spent on data analysis and report preparation. Product editions available dbForge Data Compare leads other data comparison tools not only in performance but also in price. Two product editions, Standard and Professional, offer a choice of functionality and the opportunity to pay only for what you need. With dbForge Data Compare 2.00, Devart continues its initiative to deliver an efficient database experience to everyone in the SQL Server world. Check the benefits yourself: [download dbForge Data Compare for SQL Server now for free](https://www.devart.com/dbforge/sql/datacompare/download.html). Tell us what you think about the new version on the [dbForge Data Compare feedback page](https://www.devart.com/dbforge/sql/datacompare/feedback.html). We are looking forward to your comments and suggestions.
Tags [data compare](https://blog.devart.com/tag/data-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-has-become-a-digital-sponsor-of-sqlbits-2024.html", "product_name": "Unknown", "content_type": "Blog", "content": 
"[Events](https://blog.devart.com/category/events) Devart Has Become a Digital Sponsor of SQLBits 2024 By [Victoria Shyrokova](https://blog.devart.com/author/victorias) July 24, 2024 [0](https://blog.devart.com/devart-has-become-a-digital-sponsor-of-sqlbits-2024.html#respond) 995 On March 19–23, Devart joined SQLBits 2024, a non-profit annual conference that brought together 2,279 attendees from 60 countries for training, networking, and experience sharing. The event is a hub for everyone interested in data engineering, architecture, database administration, analytics, and SQL development, and we were glad to join so many people who share our passion for SQL. During the event, our business development professionals demonstrated our leading products for SQL database development, design, administration, and management. It was also a great opportunity to show how dbForge solutions can help build a reliable infrastructure tailored to your applications and services, and how easy these toolsets are to use. Even if you missed the event, you can still explore them at your own pace. [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) A powerful alternative to the well-known SSMS that assists SQL experts in development, database management, data analysis, and collaboration. With dbForge Studio for SQL Server, you can develop databases up to twice as fast and noticeably boost your productivity. Accurate code completion, visual database design and comparison features, test data generation, and documentation tools are just some of the features users love dbForge Studio for SQL Server for. We were proud to present it to those who chose SQL as the foundation for their career.
[Try dbForge Studio >](https://www.devart.com/dbforge/sql/studio/download.html) [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/dbforge-sql-tools.html) Another product the Devart team offered for exploration is designed to assist developers who use SSMS with Microsoft SQL Server and Azure SQL. This versatile toolkit offers SQL formatting, schema comparison, data comparison and synchronization, database documentation, query building, and data import and export. It is an essential asset for streamlining workflows and improving the development experience, and part of its functionality is free to explore and use. [Try SQL Tools >](https://www.devart.com/dbforge/sql/sql-tools/download.html) [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) For those looking for a simple way to increase coding speed in SSMS, Devart presented a solution for smooth refactoring, debugging, code autocompletion, and SQL formatting. dbForge SQL Complete boosts your ability to develop and design databases, providing valuable features that you won’t find in Visual Studio and that come in handy for most routine tasks. [Try SQL Complete >](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) All the tools Devart presented at the SQLBits conference are free to try, so make sure to explore them further. Wrapping up As a company specializing in solutions for database development, management, and administration, we were delighted to meet so many SQL experts looking for professional growth, training, and software that can help them achieve higher efficiency and greater results in their work. We were excited to share insights and collaborate with the vast community of the world’s leading data conference, and we hope to join SQLBits next year!
Tags [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [Victoria Shyrokova](https://blog.devart.com/author/victorias) "} {"url": 
"https://blog.devart.com/devart-has-been-nominated-in-4-categories-of-2010-devproconnections-community-choice-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) Devart has been nominated in 4 categories of the 2010 DevProConnections Community Choice Awards! By [dotConnect Team](https://blog.devart.com/author/dotconnect) September 13, 2010 [0](https://blog.devart.com/devart-has-been-nominated-in-4-categories-of-2010-devproconnections-community-choice-awards.html#respond) 2717 Nominated products: Best Add-In Product – [Entity Developer](http://www.devart.com/entitydeveloper/) (page 1, category 1: Add-In) Best Component Set – [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) (page 1, category 7: Component Set) Best IDE Product – [dbForge Studio for MySQL](http://www.devart.com/dbforge/mysql/studio/) (page 2, category 14: IDE) Best Free Tool – [CodeCompare](http://www.devart.com/codecompare/) (page 2, category 26: Free Tool) If any of your favorite products are on this list, please visit [http://www.surveymonkey.com/s/DEVCommChoiceFinalVoting](http://www.surveymonkey.com/s/DEVCommChoiceFinalVoting) and cast your vote for your favorite Devart products. Voting ends on September 21, 2010. [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
"} {"url": "https://blog.devart.com/devart-hones-devops-automation-to-perfection-in-dbforge-studio-for-sql-server-6-1.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server 
Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Devart Hones DevOps Automation to Perfection in dbForge Studio for SQL Server 6.1 By [dbForge Team](https://blog.devart.com/author/dbforge) December 21, 2021 [0](https://blog.devart.com/devart-hones-devops-automation-to-perfection-in-dbforge-studio-for-sql-server-6-1.html#respond) 2491 We are thrilled to roll out the new version of dbForge Studio for SQL Server – our multi-featured IDE for database development, management, and administration. The release brings a number of new features and improvements. In the [previous update](https://blog.devart.com/dbforge-studio-for-sql-server-is-ready-for-devops-automation.html) of dbForge Studio for SQL Server, we introduced the long-awaited DevOps Automation feature. In this version, we keep perfecting the DevOps capabilities and improving the functionality so that you can embrace and implement the best DevOps practices in your routine. What’s New DevOps Automation Improvements dbForge Studio for SQL Server comes with an advanced DevOps Automation tool that brings the best continuous integration practices to database development, taking it to a whole new level. In this version of the Studio, we are adding support for the Execution step in the Jenkins, TeamCity, and Bamboo plugins. Support for New Functions, Statements, and Keywords UPDATETEXT In dbForge Studio for SQL Server 6.1, you can benefit from object hints for the UPDATETEXT statement. SEMANTICSIMILARITYTABLE, SEMANTICKEYPHRASETABLE, and SEMANTICSIMILARITYDETAILSTABLE The SEMANTICSIMILARITYTABLE, SEMANTICKEYPHRASETABLE, and SEMANTICSIMILARITYDETAILSTABLE functions are now suggested, and syntax checking is available for them. Check constraint hints for ALTER TABLE statements You can now enjoy automatic suggestions of check constraints when working with ALTER TABLE statements.
WAIT_AT_LOW_PRIORITY option for ALTER TABLE … SWITCH PARTITION statements The WAIT_AT_LOW_PRIORITY option is now supported and suggested in ALTER TABLE ... SWITCH PARTITION statements. DISTRIBUTED_AGG in SELECT – GROUP BY queries The DISTRIBUTED_AGG hint is now available for SELECT – GROUP BY queries. ALTER/DROP DATABASE SCOPED CREDENTIAL dbForge Studio for SQL Server 6.1 brings keyword suggestions for ALTER/DROP DATABASE SCOPED CREDENTIAL statements. ALTER DATABASE SCOPED CONFIGURATION We are also adding keyword suggestions for the ALTER DATABASE SCOPED CONFIGURATION statement. MEMORY_OPTIMIZED in ALTER SERVER CONFIGURATION statements To speed up your SQL coding, we introduce keyword suggestions for the MEMORY_OPTIMIZED syntax element in ALTER SERVER CONFIGURATION statements. CREATE/ALTER/DROP EVENT SESSION session_name ON DATABASE (for Azure SQL) For efficiency’s sake and to simplify your work with event sessions, we deliver support for the CREATE/ALTER/DROP EVENT SESSION session_name ON DATABASE statements. Query Store options in ALTER DATABASE statements We have extended the number of supported options for configuring Query Store parameters. Get the new shiny update Interested? [Download](https://www.devart.com/dbforge/sql/studio/download.html) the fresh version of dbForge Studio for SQL Server and explore its new features.
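To give a feel for the syntax the Studio's code completion now recognizes, here are minimal T-SQL sketches of three of the statements mentioned above. The table, partition, and option values are hypothetical and purely illustrative:

```sql
-- Switch a partition while waiting at low priority instead of blocking
-- (dbo.SalesStaging and dbo.SalesArchive are hypothetical tables)
ALTER TABLE dbo.SalesStaging
SWITCH PARTITION 2 TO dbo.SalesArchive PARTITION 2
WITH (WAIT_AT_LOW_PRIORITY (MAX_DURATION = 1 MINUTES, ABORT_AFTER_WAIT = SELF));

-- A database-scoped configuration change, e.g. capping parallelism
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;

-- Enabling the Query Store with a couple of its configuration options
ALTER DATABASE CURRENT
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE, MAX_STORAGE_SIZE_MB = 512);
```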
Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [dbForge Studio for SQL Server 6.1](https://blog.devart.com/tag/dbforge-studio-for-sql-server-6-1) [DevOps Automation](https://blog.devart.com/tag/devops-automation) [dbForge Team](https://blog.devart.com/author/dbforge) 
"} {"url": "https://blog.devart.com/devart-is-a-gold-sponsor-of-sql-saturday-lviv-2019.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) Devart sponsored SQL Saturday Lviv 2019 By [dbForge Team](https://blog.devart.com/author/dbforge) September 20, 2019 [0](https://blog.devart.com/devart-is-a-gold-sponsor-of-sql-saturday-lviv-2019.html#respond) 2539 We are delighted to inform you that Devart was a Gold sponsor of the [SQL Saturday Lviv 2019](https://www.sqlsaturday.com/860/EventHome.aspx) event that took place on September 21 in Lviv, Ukraine. Supporting and sponsoring both local user group meetups and global industry-level events is an endeavor that Devart is committed to and keeps developing. [SQLSaturday](https://www.sqlsaturday.com/) is a continuous series of free training events held all across the world for professionals who use the Microsoft data platform. These community events offer content across data management, cloud and hybrid architecture, analytics, business intelligence, AI, and more. SQL Saturday Lviv 2019 featured 3 panels with contributions from 15 speakers from Ukraine, Malta, Spain, Portugal, Bulgaria, Belgium, Poland, and Israel, who shared their knowledge of data collection, processing, analysis, visualization, security, backup, and big data. Along with sponsorship, Devart participated directly in the event: our experts gave a number of topical presentations on current software development issues, and Devart licenses were raffled at the end of the event. Congratulations to the winners! We are looking forward to the next SQL Saturday event.
Tags [events](https://blog.devart.com/tag/events) [SQL Server](https://blog.devart.com/tag/sql-server) [sqlsaturday](https://blog.devart.com/tag/sqlsaturday) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-is-a-gold-sponsor-of-sqlsaturday-kharkiv-2019.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) Devart sponsored SQLSaturday Kharkiv 2019 By [dbForge Team](https://blog.devart.com/author/dbforge) September 20, 2019 
[0](https://blog.devart.com/devart-is-a-gold-sponsor-of-sqlsaturday-kharkiv-2019.html#respond) 2914 For the past decade, Devart has been actively supporting SQL Saturday events, including this year’s conference in Kharkiv, Ukraine. As usual, Devart was a Gold Sponsor of the event, which was held on September 28 at the Fabrika space. 13 speakers from 5 countries shared their knowledge and experience with more than 200 attendees. On Saturday, the Devart team presented its new product, dbForge Transaction Log, and the latest version of dbForge Compare Bundle. The company’s experts answered attendees’ questions and demonstrated the advantages of dbForge tools for SQL Server. SQLSaturday is a continuous series of free global training events for professionals who use the Microsoft data platform. These community events offer content across data management, cloud and hybrid architecture, analytics, business intelligence, AI, and more. We would like to express our gratitude to everyone who took part in organizing the event – you did a great job and gathered people who spent quality time with colleagues and like-minded professionals.
Tags [events](https://blog.devart.com/tag/events) [SQL Server](https://blog.devart.com/tag/sql-server) [sqlsaturday](https://blog.devart.com/tag/sqlsaturday) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-is-a-media-partner-of-net-fullstack-2019.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) Devart is a media partner of .NET FullStack 2019 By [dbForge Team](https://blog.devart.com/author/dbforge) July 19, 2019 
Devart is a media partner of the .NET FullStack Kharkiv Conference, an annual IT conference that brings together experts in the field of .NET development. The main meeting of progressive developers will be held on September 14 at Fabrika.space. During the event, Devart licenses will be raffled.

About the conference:
- 3 streams of reports
- 8 hours of communication
- 5 main .NET directions
- 12+ cool speakers
- 400+ participants

Attendees will enjoy:
- reports from top Ukrainian professionals in web, DevOps, cloud, Xamarin, and database development
- discussions about advanced technologies and development approaches
- meetings with conference partners – representatives of the best IT companies in Ukraine
- face-to-face communication with the speakers and establishing contacts
- gifts from conference hosts and partners

Among the speakers are representatives of SSA Group, GlobalLogic, Daxx, DataArt, ELEKS, and SBTech. Tickets to the largest .NET developer meeting in Kharkiv are available online. [Join us now.](https://fullstacknet.ticketforevent.com/) For more information about the speakers and the program, visit the official conference [website](https://www.fullstack.net.ua/en/).
Tags [.NET FullStack 2019](https://blog.devart.com/tag/net-fullstack-2019) [media partner](https://blog.devart.com/tag/media-partner) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-is-the-gold-sponsor-of-sqlsaturday-508-kiev-2016.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart is the GOLD sponsor of SQLSaturday #508 – Kiev 2016 By [Andrey
Langovoy](https://blog.devart.com/author/andrey-langovoy) May 24, 2016 Devart sponsored [SQLSaturday #508](https://www.sqlsaturday.com/508/eventhome.aspx), which took place on May 21, 2016, in Kiev, Ukraine. Besides databases, there were many interesting sessions on popular topics: BI, Data Science, Big Data, Machine Learning, and others. This year, over 500 attendees had a great chance to learn something new from world-renowned speakers from more than 10 countries: Tobiasz Koprowski, Alexander Karl, Mihail Mateev, Dejan Sarka, Uwe Ricken, Michal Sadowski, Satya SK Jayanty, Regis Baccaro, Tomasz Libera, and others. We would like to thank the speakers for the wonderful conference! [SQLSaturdays](https://www.sqlsaturday.com/default.aspx) are free one-day training events for SQL Server professionals that focus on local speakers, provide a variety of high-quality technical sessions, and make it all happen through the efforts of volunteers. We would like to express our gratitude to the organizers — the team that made the event possible. We also held a raffle for the participants of SQLSaturday. The winners were awarded exclusive prizes, including FREE licenses for [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [dbForge Developer Bundle](https://www.devart.com/dbforge/sql/developer-bundle/), and a quadcopter. Until next time!
Tags [devart](https://blog.devart.com/tag/devart) [sql saturday](https://blog.devart.com/tag/sql-saturday) [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart "} {"url": "https://blog.devart.com/devart-joins-the-winners-circle-of-2020-dbta-readers-choice-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [SQL Server
Tools](https://blog.devart.com/category/products/sql-server-tools) Devart Joins the Winners’ Circle of 2020 DBTA Readers’ Choice Awards By [dbForge Team](https://blog.devart.com/author/dbforge) August 13, 2020 For the second year in a row, readers of DBTA — a magazine focusing on data, information management, and analytics — [chose Devart products as winners](https://www.dbta.com/Editorial/Actions/Winners-Ciricle-Devart-142162.aspx) among the world’s top solutions for database development, management, and administration. Today, our winner is [Devart dotConnect](https://www.devart.com/dotconnect/) with its enhanced ORM-enabled ADO.NET data providers — a world-class solution that ensures high-performance access to databases and clouds. Additionally, dotConnect goes beyond ordinary providers and offers a visual ORM designer (the best one on the market), support for Entity Framework and Entity Framework Core, and advanced SQL support for clouds. The readers chose it as the Best Data Integration Solution. That wasn’t the only Devart solution to win the recognition of DBTA readers. We’ve got four more finalists in the following nominations:
• Best Data Modeling Solution: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/)
• Best Database Development Solution: [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/)
• Best Database Backup Solution: [dbForge for SQL Server](https://www.devart.com/dbforge/sql/studio/)
• Best DBA Solution: [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/)
We are thankful to our users for this support. It inspires us to be the best for you. For those who would like to get acquainted with our solutions: you can easily download a free 30-day trial of any tool we offer.
Check the capabilities, and if you like them — and we are sure you will — you can purchase the product with a perpetual license. Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [MySQL](https://blog.devart.com/tag/mysql) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/devart-joins-the-winners-of-2020-componentsource-bestselling-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart Joins the Winners of 2020 ComponentSource Bestselling Awards By [dbForge Team](https://blog.devart.com/author/dbforge) September 9, 2020 The response and appreciation of our users make our company proud of the database solutions we create. This time, Devart gladly joins the ranks of the [Top 25 Bestselling Publishers](https://www.componentsource.com/help-support/bestselling-publisher-awards-2020) on ComponentSource, based on sales orders placed by customers over the course of a year. No less exciting, our database tools – [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) and [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) – earned a place on the list of the [Top 100 Bestselling Product Awards](https://www.componentsource.com/help-support/bestselling-product-awards-2020). Now a few words about ComponentSource: it is a reputable source of software components and tools, with offices in the US, Europe, and Asia, supporting over 125,000 customers in 180 countries. Every year, they research and analyze the popular software categories on their website; the purchases made by their customers become the votes that define the top bestselling vendors and software products. Devart is genuinely proud of being recognized as one of the 25 most popular global software vendor brands. Our [database management tools](https://www.devart.com/dbforge/) are designed to deliver value to users through a smooth user experience and rich functionality.
These awards inspire us to work harder and deliver exceptional products to all our users worldwide. Tags [dbforge](https://blog.devart.com/tag/dbforge) [dotconnect](https://blog.devart.com/tag/dotconnect) [sql complete](https://blog.devart.com/tag/sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-keeps-upgrading-its-ide-to-be-innovative.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Devart keeps upgrading its SQL Server IDE to be innovative By [dbForge Team](https://blog.devart.com/author/dbforge) November 21, 2018 Devart always tries to keep up with the times and provide its users with more and more opportunities for successful and effective database management. That is why we have released a new version of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) v5.6 with changes aimed at supporting the latest technologies in the world of database development.

Connectivity support for SQL Server 2019
The latest dbForge Studio for SQL Server update includes full connectivity support for SQL Server 2019 for the most efficient and fast work with databases.

Support for TFS 2018
TFS 2018 is now fully supported by dbForge Studio for SQL Server! All the necessary functionality is provided for the most effective and simple management of database changes.

Foreign Key Generator
Our tool now provides users with the ability to generate values for a group of unique table columns.

Tell Us What You Think
We invite you to [try the new version of dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and share your thoughts about the tool with us. This will help us make dbForge Studio for SQL Server better for you.
Tags [data generator](https://blog.devart.com/tag/data-generator) [sql server 2019](https://blog.devart.com/tag/sql-server-2019) [studio for sql server](https://blog.devart.com/tag/studio-for-sql-server) [tfs](https://blog.devart.com/tag/tfs) [what's new sql server studio](https://blog.devart.com/tag/whats-new-sql-server-studio) [dbForge Team](https://blog.devart.com/author/dbforge) 2 COMMENTS
Sonja Birner, November 30, 2018 at 1:18 pm: Hello everybody, since today I’m testing your express edition and I’m very impressed….
But one thing is really annoying: I wanted to edit my stored procedure, add a parameter and a variable to it, but this was not possible, because the editor doesn’t want me to write the character @ ! So I have to write a @ in another editor and copy it to your query editor… With kind regards, Sonja Birner
Sonja Birner, November 30, 2018 at 1:40 pm: Your Query Builder lies to me! I have a stored procedure with an optional parameter: ALTER PROCEDURE dbo.spSomeName @param1 int, @param2 int, @msg nvarchar(500) OUTPUT, @NachbehaelterID int = NULL OUTPUT AS BEGIN … Because of the last parameter line your editor denies to save the changes with the following message: Object has invalid source text and cannot be saved. But this IS NOT correct! The SQL Server Management Studio saves this without any problems! With kind regards Sonja Birner
Comments are closed."} {"url": "https://blog.devart.com/devart-odbc-driver-for-stripe-how-to-configure-a-data-source-name-and-access-stripe-data-from-php-python-and-power-bi.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) Devart ODBC Driver for Stripe: Data Source Name Configuration and Stripe Access from PHP, Python, and Power BI By [DAC Team](https://blog.devart.com/author/dac) October 25, 2022 In this article, we will discuss how to configure a DSN and connect to Stripe from Python, PHP, and Power BI using the ODBC driver for Stripe. Devart ODBC Driver for Stripe is a high-performance connectivity tool that allows accessing Stripe from both 32-bit and 64-bit Windows.
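Throughout this article, the driver is addressed with standard ODBC connection strings: semicolon-separated Key=Value pairs. As a quick reference, here is a minimal sketch of composing such a string in Python. The attribute names (Driver, Account ID, Secret Key) follow the examples used later in this article, and the values are placeholders, not real credentials.

```python
def build_conn_str(attrs: dict) -> str:
    # ODBC connection strings are semicolon-separated Key=Value pairs;
    # values that themselves contain a ';' can be wrapped in braces.
    return ";".join(
        f"{k}={{{v}}}" if ";" in str(v) else f"{k}={v}"
        for k, v in attrs.items()
    )

# Placeholder credentials; attribute names follow the pyodbc examples below.
conn_str = build_conn_str({
    "Driver": "{Devart ODBC Driver for Stripe}",
    "Account ID": "myaccountid",
    "Secret Key": "mykey",
})
print(conn_str)
# Driver={Devart ODBC Driver for Stripe};Account ID=myaccountid;Secret Key=mykey
```

Note that the separator between attributes must be a semicolon, not a comma; a comma would make the second attribute part of the first one's value.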
The driver provides an efficient, enterprise-grade method of connecting to the data source and quick, secure access to live Stripe data from any machine. The ODBC driver for Stripe supports all of the standard ODBC API operations and data types.

Key Features

The Stripe Link
Our data connector enables ODBC-compatible software to securely communicate with Stripe over the Internet. If direct access to Stripe is blocked on your network, you may still connect to the service through a proxy server.

Extended SQL Syntax
The ODBC driver makes it possible to use Stripe objects as standard SQL tables inside your software. The extended SQL syntax allows for the use of all the benefits of SQL in SELECT queries that are compatible with SQL-92:
- Complex JOINs
- WHERE conditions
- Subqueries
- GROUP BY statements
- Aggregation functions
- ORDER BY statements

DML Operations Support
You can perform DML operations like INSERT, UPDATE, or DELETE on your data using the Devart ODBC Driver for Stripe, even though Stripe itself is not an SQL database.

ODBC Conformance
The driver fully supports the ODBC interface: the standard ODBC data types, the ODBC API functions, and advanced connection string parameters. Stripe may be integrated with any ODBC-enabled system, whether it is a desktop application or a web service.

Stripe Compatibility
The ODBC driver is compatible with the Stripe API and works with the object and data types it exposes.

Integration
With the Devart ODBC drivers, you may access your data sources from various IDEs, reporting, and data analysis tools, for example, Microsoft Excel, on different operating systems. See the [Compatibility](https://docs.devart.com/odbc/stripe/compatibility.htm) page for a full rundown of compatible programs and devices.
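To make the extended SQL syntax and DML support concrete, here is a minimal sketch of the query flow an ODBC application would use. The table name charges and its columns are hypothetical stand-ins for a Stripe object exposed as a SQL table, and for the sake of a self-contained, runnable example the snippet uses Python's built-in sqlite3 module, which follows the same DB-API connect/cursor/execute/fetch pattern as pyodbc (with pyodbc you would swap the connect call for the DSN-based one shown later in this article).

```python
import sqlite3

# Stand-in for pyodbc.connect(...): sqlite3 follows the same DB-API 2.0
# connect/cursor/execute/fetch pattern that pyodbc uses with the Stripe DSN.
cnxn = sqlite3.connect(":memory:")
cursor = cnxn.cursor()

# Hypothetical table mirroring a Stripe "charges" object exposed as a SQL table.
cursor.execute(
    "CREATE TABLE charges (id TEXT, amount INTEGER, currency TEXT, paid INTEGER)"
)

# DML support: INSERT works through the driver even though the backend is a web API.
cursor.executemany(
    "INSERT INTO charges VALUES (?, ?, ?, ?)",
    [("ch_1", 1500, "usd", 1), ("ch_2", 2300, "usd", 1), ("ch_3", 900, "eur", 0)],
)

# SQL-92 SELECT combining WHERE, GROUP BY, an aggregate, and ORDER BY.
cursor.execute(
    "SELECT currency, SUM(amount) AS total "
    "FROM charges WHERE paid = 1 "
    "GROUP BY currency ORDER BY total DESC"
)
results = cursor.fetchall()
print(results)  # [('usd', 3800)]
cnxn.close()
```

The same SELECT shape (filter, group, aggregate, order) is what the driver's extended SQL syntax lets you run directly against Stripe objects.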
Variety of Platforms
The Devart ODBC Driver for Stripe works with 32-bit and 64-bit applications on x32 and x64 platforms, so there’s no need to configure the driver, software, or environment additionally.

Driver for Unicode-Compatible Devices
For multilingual Stripe databases, our fully Unicode driver guarantees error-free data retrieval and processing, no matter the charset used for the data (Latin, Cyrillic, Hebrew, Chinese, etc.).

The Highest Level of Efficiency
Our driver can do everything with Stripe substantially faster thanks to local data caching, connection pooling, query optimization, and other features.

Support
Visit our [Support](https://www.devart.com/odbc/stripe/support.html) page for instantaneous help from qualified professionals, rapid problem resolution, and the latest hotfixes in nightly releases. To see the full list of features, visit the [Features](https://docs.devart.com/odbc/stripe/features.htm) page.

ODBC Driver Configuration for Stripe

Microsoft Windows DSN Setup
To begin with, the Windows search box is where you will find ODBC Data Sources. Once you find it, choose the version corresponding to the program’s bitness (32-bit or 64-bit). ODBC Data Sources can also be found under Control Panel > Administrative Tools; earlier versions of Windows referred to the icon as Data Sources (ODBC). You may also open the 64-bit ODBC Administrator by running C:\Windows\System32\odbcad32.exe, or the 32-bit one by running C:\Windows\SysWOW64\odbcad32.exe. As soon as the driver has finished installing, follow the next steps (they assume the ODBC driver for Stripe is already installed):
- Open the ODBC Data Source Administrator. You may choose the User DSN or System DSN tab; although either DSN type may be used with most programs, some have stricter requirements.
- Add a new DSN for Stripe by clicking the Add button. The prompt to set up a fresh data source will show up.
- Pick the Devart ODBC Driver for Stripe and hit Finish. A window to configure the driver will appear.
- Fill in the required fields with your connection data. To make sure the connection is successful, choose Test Connection.
- The DSN will be saved when you click OK.

Creating an ODBC Trace Log on Windows
Activating or deactivating tracing in the 64-bit ODBC Administrator also activates or deactivates tracing in the 32-bit ODBC Administrator correspondingly. Select Machine-Wide tracing for all user identities if the ODBC client program you need to trace runs under the Local System account or a user login other than your own; for instance, this setting may be required for SSMS. If you’re using the Windows ODBC Data Source Administrator, you may create a trace file by following the instructions below:
- In Windows 10, type ODBC Data Sources into the search box (in older versions of Windows, open Control Panel > Administrative Tools) and choose the version with the appropriate bitness.
- Open the Tracing tab. If you need to, you may modify the default location of the log files; verify that the program can write to that path, and then press the Apply button.
- Select Start Tracing Now to begin.
- Restart the client application you are tracing. To ensure the driver can connect, go to the DSN settings and click the Test Connection button.
- On the Tracing tab, choose Stop Tracing Now.
- Forward the collected log file to us (for example, devart.log).

Creating an ODBC Trace Log on macOS
Use the ODBC Administrator’s Tracing tab to activate the tracing feature on macOS:
- Open the ODBC Administrator.
- Open the Tracing tab. You may modify the default location of the log file if you need to.
- Choose Always from the When to trace dropdown.
Creating an ODBC Trace Log on Linux
On Linux, you may trace ODBC calls by setting the Trace and TraceFile keyword/value pairs in the [ODBC] section of the /etc/odbcinst.ini file, as shown below:

```ini
[ODBC]
Trace=Yes
TraceFile=/home/test/devart.log
```

After you have obtained a log file, you should turn off logging, since it slows down read and write operations.

Using the ODBC Driver for Stripe in Third-Party Tools
ODBC Driver for Stripe works with the following ODBC-capable tools: DBeaver, Oracle Database Link, Microsoft Access, Microsoft Excel, OpenOffice and LibreOffice, PHP, Power BI, Python, QlikView, SQL Server Management Studio, SSIS, and Tableau. For more information about using the ODBC Driver for Stripe in third-party tools, check out the [documentation](https://docs.devart.com/odbc/stripe/using_in_third_party_tools.htm).

Importing Stripe Data into Power BI through an ODBC Connection
Power BI is a well-known business intelligence solution that includes services, applications, and connectors for aggregating and analyzing data from various sources. Using an ODBC driver, Power BI may be linked to multiple databases. With the help of the ODBC driver for Stripe, you can access Stripe directly from Power BI Desktop and import the needed data. It is assumed that the ODBC driver for Stripe has previously been installed and set up on your machine. Launch Power BI Desktop and choose Get Data, then select Other > ODBC and click Connect. Click the down arrow next to Data Source Name (DSN) in the From ODBC dialog box, then choose the DSN you set up for Stripe. If you want to enter a SQL statement to filter the results, expand the dialog box by clicking the Advanced options arrow. Click OK. If your data source requires authentication, Power BI will ask for your credentials: enter your Username and Password into the corresponding boxes and click the Connect button.
Now you can view the data structures in your data source by clicking a database item. Select the table you need to import into Power BI from Stripe, and click Load.

Connecting to Stripe from Python Using the ODBC Driver for Stripe
ODBC makes it easier to connect to various databases and cloud services and retrieve data from them in Python. To work with Python, first download it from the official website and run the installer. The pyodbc module must also be installed; this may be done from the command line with pip install pyodbc. Then all you need to use the driver as a translation layer between your application and the data source is to configure a DSN (Data Source Name). To do this, follow the steps below.

Step 1: Connect

```python
import pyodbc
cnxn = pyodbc.connect('DRIVER={Devart ODBC Driver for Stripe};Account ID=myaccountid;Secret Key=mykey')
```

Step 2: Insert a row
To verify your database connection, look at this elementary example of running an INSERT command. The script adds a new entry to the EMP table.

```python
cursor = cnxn.cursor()
cursor.execute("INSERT INTO EMP (EMPNO, ENAME, JOB, MGR) VALUES (535, 'Scott', 'Manager', 545)")
```

Step 3: Execute the query
The cursor.execute() function runs the query, cursor.fetchone() iterates across the result set it produces, and print() outputs each record to the terminal.

```python
cursor = cnxn.cursor()
cursor.execute("SELECT * FROM EMP")
row = cursor.fetchone()
while row:
    print(row)
    row = cursor.fetchone()
```

Connecting to Stripe from PHP Using the ODBC Driver for Stripe
PHP is among the most widely used languages for building websites. ODBC connectors allow PHP developers to be database-agnostic, meaning that your PHP-based product will run with any vendor’s DBMS.
SQL queries may be prepared and run against databases like MySQL, SQLite, PostgreSQL, and more with the help of functions like odbc_exec(). Data storage, either on a local server or in the cloud, is a common need for PHP-based projects, and you may connect to it through the ODBC driver. Our ODBC drivers allow you to connect to several data sources and get data from databases, including their tables and fields. The PHP script below establishes a connection to the Stripe database and retrieves every record from a specific table.

Step 1: Connect to an ODBC data source
To link to an ODBC data source, use the odbc_connect() function. Please be aware that the function takes three inputs: the data source or connection string, the user’s name, and their password. Leave the latter two blank if your database does not need a password or login. The following code connects to a database using the odbc_connect() function in PHP:

```php
<?php
$user = "myusername";
$password = "mypassword";
$ODBCConnection = odbc_connect("DRIVER={Devart ODBC Driver for Stripe};Account ID=myaccountid;Secret Key=mykey", $user, $password);
```

Step 2: Execute an SQL statement
If the connection is successful, a SELECT query is executed on the dept table in the autotest database through the odbc_exec() function:

```php
$SQLQuery = "SELECT * FROM autotest.dept";
$RecordSet = odbc_exec($ODBCConnection, $SQLQuery);
```

Step 3: Print the result set
The data in the result set is fetched with the odbc_fetch_row() function, while odbc_result_all() outputs the result set as an HTML table. The odbc_close() function terminates the connection once the result set’s rows have been printed.
```php
while (odbc_fetch_row($RecordSet)) {
    $result = odbc_result_all($RecordSet, "border=1");
}
odbc_close($ODBCConnection);
?>
```

To use other ODBC drivers with your PHP application, you can modify this script by specifying the settings for the corresponding driver. Tags [linux](https://blog.devart.com/tag/linux) [macOS](https://blog.devart.com/tag/macos) [odbc](https://blog.devart.com/tag/odbc) [Python](https://blog.devart.com/tag/python) [what's new odbc drivers](https://blog.devart.com/tag/whats-new-odbc-drivers) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/devart-odbc-drivers-are-awarded-the-g2-high-performer-and-getapp-category-leader-2021.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) The Industry Leader Again: Devart ODBC Drivers are Awarded the G2 High Performer and GetApp Category Leader 2021 By [DAC Team](https://blog.devart.com/author/dac) November 5, 2021 Fall is the time to reap the rewards! The Devart team has been working hard to bring you best-in-class ODBC connectors. We are delighted to announce that [our ODBC drivers](https://www.devart.com/odbc/) were highly recognized by the software review websites G2 and GetApp, based on ratings and reviews from real users.
Devart ODBC Driver has been awarded by G2 Crowd as:
- Easiest To Use – for the highest Ease of Use rating on the Usability Index
- Easiest Admin – for the highest Ease of Admin rating on the Usability Index
- High Performer – for the highest Customer Satisfaction scores

Also, our ODBC drivers are among [GetApp’s 2021 Category Leaders in the Integration Software nomination](https://www.getapp.com/it-management-software/integration/category-leaders/). The ratings are compiled from real users’ feedback. We are happy and proud that our ODBC drivers were awarded in these categories, and we want to thank all our customers for choosing our products and taking the time to rate them on the review websites. Devart ODBC drivers are ODBC-compliant connectors for accessing and managing data in on-premises and cloud databases and applications. With our drivers, you can get direct access to SQL and NoSQL data sources without any client libraries. You are welcome to download a [free 30-day trial](https://www.devart.com/products.html#odbc) of our ODBC drivers.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [Events](https://blog.devart.com/category/events) Devart presented EntityDAC at this year’s Coderage XII conference By [DAC Team](https://blog.devart.com/author/dac) November 14, 2017 Devart was pleased to participate in Coderage XII, the annual online conference for Delphi developers organized by Embarcadero and held on November 7–9, 2017. During our session, we showed how an ORM for Delphi – [EntityDAC](https://www.devart.com/entitydac/) – can simplify and speed up the application development process. In our presentation, we used real samples, as well as other Devart products ([UniDAC](https://www.devart.com/unidac/), [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/)), to demonstrate this clearly. We would like to thank the organizers of this event – Embarcadero, in particular Jim McKeeth and David Millington – for providing us with this opportunity. We also want to express our gratitude to our colleagues who took part in preparing for this event: the EntityDAC developers and the marketing team. We look forward to taking part in the coming Coderage sessions next year.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [Events](https://blog.devart.com/category/events) Devart presented SecureBridge at this year’s Coderage 2018 conference By [DAC Team](https://blog.devart.com/author/dac) December 10, 2018 Devart was glad to participate in CodeRage 2018, the annual online conference for Delphi developers organized by Embarcadero and held on December 4–6, 2018. During our session, we presented the [SecureBridge](https://www.devart.com/sbridge/) product and its HTTPS client component. This component is designed for accessing data from a web server over the HTTPS protocol and can be easily customized and used; no additional third-party libraries are required. In our presentation, we demonstrated its configuration and described the benefits of using this component when developing apps in Delphi, C++Builder, and Lazarus. We would like to thank the organizers of this event – Embarcadero, in particular Jim McKeeth and David Millington – for providing us with this opportunity. Furthermore, we want to express our gratitude to our colleagues who took part in preparing for this event: the SecureBridge developers and the marketing team. You can watch our presentation at [Embarcadero Academy](https://www.embarcaderoacademy.com/courses) after logging in to your account. On the occasion of this event, we have a [special offer](https://www.devart.com/events/2018/coderage2018.html) on SecureBridge and DAC products until the end of December. We look forward to taking part in the coming CodeRage sessions next year.
[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Devart Products Become the Winners of DBTA 2022 Readers’ Choice Awards By [dbForge Team](https://blog.devart.com/author/dbforge) September 13, 2022 For the fourth year in a row, [Devart solutions have been voted winners of the Readers’ Choice Awards](https://www.dbta.com/Editorial/Trends-and-Applications/DBTA-Readers-Choice-Awards-Winners-2022-154324.aspx) by Database Trends and Applications, one of the leading media outlets focused on data science, big data, and information management. Just like [the previous year](https://blog.devart.com/devart-products-become-the-winners-of-dbtas-2021-readers-choice-awards.html), most of the honors go to the [dbForge Studio product line](https://www.devart.com/dbforge/studio/) – a collection of IDEs designed to help users streamline all database-related activities with a single integrated solution. Three of our Studios – [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), and [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) – earned silver awards in the Best Database Administration Solution category.
And indeed, our Studios are undeniably feature-rich in this respect; their administration capabilities include but are not limited to the following:
- User account and permission management
- Database backup and recovery
- Database duplication and migration
- Server session management
- Real-time server performance monitoring
- Server diagnostics
- Index fragmentation management

In the Best Database Development Solution category, the three aforementioned Studios all qualified as well, joined by [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/). With the following features at your disposal, you really can’t go wrong:
- A code editor that helps write, edit, and run queries of any complexity
- [Smart code completion, formatting, refactoring, and instant syntax check](https://www.devart.com/dbforge/mysql/studio/sql-coding.html)
- Effective handling of large scripts
- Debugging of stored procedures and functions
- Generation of database scripts
- Convenient visual query building with no coding whatsoever
- Query optimization and performance tuning

All four Studios scored another bronze award under the title of Best Database Performance Solution. To quote DBTA, performance solutions effectively “aid DBAs and data teams in maintaining response time and efficiency of databases.” We are happy to see that our flagship products excel at this, and we are thankful to our users for all this recognition. The last of our winners for today deviates from the formula, being an integrated network security solution. The title of Best Data Security Solution went to [SecureBridge](https://www.devart.com/sbridge/), a suite of client and server components for SSH, SFTP, FTPS, HTTP/HTTPS, SSL, WebSocket, and SignalR protocols. SecureBridge protects TCP traffic using SSH/SSL protocols and Cryptographic Message Syntax, delivering client and server authentication, strong data encryption, and data integrity verification.
The components of SecureBridge can be effectively used alongside data access components to prevent data interception or modification in untrusted networks. How about checking these tools in action? No problem – just download any of them from our website for a free 30-day trial and enjoy the test drive! Be it data connectivity solutions, multifeatured Studios for major DBMSs, or [dbForge Edge](https://www.devart.com/dbforge/edge/) – the most robust IDE supporting SQL Server, MySQL, Oracle, and PostgreSQL at once – we bet you won’t be able to imagine your daily work without these tools afterward!
[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Devart Products Become the Winners of DBTA Readers’ Choice Awards 2023 By [dbForge Team](https://blog.devart.com/author/dbforge) August 10, 2023 For many years now, Devart has taken a leading position among vendors of database development tools.
In 2023, it won the trust of the readers of Database Trends and Applications, a prominent media outlet renowned for its expertise in information management, big data, and data science. This [award](https://www.dbta.com/Editorial/Actions/Winners-Circle-Devart-159971.aspx) would not have been possible without the constant support and active participation of our dedicated users from all over the world. We extend our sincere gratitude to each of you for your invaluable contributions! Already a tradition, the dbForge Studio product line continued to garner honors thanks to the exceptional quality and advanced feature-packed tools it consistently delivers. BEST DATABASE PERFORMANCE SOLUTION [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), and [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) won in the Best Database Performance Solution category. These products are part of the ultimate multidatabase solution, [dbForge Edge](https://www.devart.com/dbforge/edge/). It offers feature-rich tools with intuitive user interfaces that deliver the best experience and highest productivity when performing database-related tasks on different database systems such as Microsoft SQL Server, MySQL, MariaDB, Oracle, PostgreSQL, and Amazon Redshift. BEST DATA SECURITY SOLUTION We are proud to see the Devart products continue to win the recognition of business-oriented users and tech experts. In the Best Data Security Solution category, [SecureBridge](https://www.devart.com/sbridge/), a suite of client and server components for SSH, SFTP, FTPS, HTTP/HTTPS, SSL, WebSocket, and SignalR protocols, took the bronze prize as one of the best data security solutions on the market.
It is no wonder that these reliable tools ensure high database stability and security. They make a difference in controlling access to data and in visually managing user accounts based on defined privileges and rights. Without further ado, download any of the Studios from our [website](https://www.devart.com/dbforge/edge/) and try them in practice during a free 30-day trial period. Moreover, you can test-drive dbForge Edge, the ultimate solution providing simultaneous support for SQL Server, MySQL, MariaDB, Oracle, PostgreSQL, and Amazon Redshift, for 30 days free of charge. Undoubtedly, the wide range of advanced features aimed at increasing productivity and simplifying database management and development will encourage you to keep using our tools.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Events](https://blog.devart.com/category/events) Devart Products Become the Winners of DBTA’s 2021 Readers’ Choice Awards By [dbForge Team](https://blog.devart.com/author/dbforge) August 12, 2021 We are delighted to announce that [Devart products have been ranked #1](https://www.dbta.com/Editorial/Trends-and-Applications/DBTA-Readers-Choice-Awards-Winners-2021-148104.aspx?PageNum=4) by
the readers of Database Trends and Applications – one of the leading media outlets specializing in information management, big data, and data science. BEST DATABASE BACKUP SOLUTION 2021 became the year when [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) were voted the best among database backup solutions. These products are part of the dbForge Studio family of convenient GUI tools, which also includes the Studios for Oracle and PostgreSQL. Devart also offers [dbForge Edge](https://www.devart.com/dbforge/edge/), the ultimate multidatabase solution, which combines the best features of these renowned tools, providing a seamless experience for working across various database platforms. This is not the only news we would like to share: some of our products became finalists of this year’s DBTA Readers’ Choice Awards. BEST DATABASE DEVELOPMENT SOLUTION dbForge Studio for [MySQL](https://www.devart.com/dbforge/mysql/studio/), [SQL Server](https://www.devart.com/dbforge/sql/studio/), [Oracle](https://www.devart.com/dbforge/oracle/studio/), and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) earned a silver award; [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) took bronze. BEST DATABASE PERFORMANCE SOLUTION [dbForge Studio](https://www.devart.com/dbforge/studio/) also made the list of the best DB performance solutions, earning a bronze award. BEST DBA SOLUTION This category included the dbForge Studios for [MySQL](https://www.devart.com/dbforge/mysql/studio/), [SQL Server](https://www.devart.com/dbforge/sql/studio/), and [Oracle](https://www.devart.com/dbforge/oracle/studio/), which share a silver award. BEST DATA SECURITY SOLUTION [Devart SecureBridge](https://www.devart.com/sbridge/) took the silver prize as one of the best data security solutions.
Data security has always been a burning issue for all organizations, but the past year and a half has highlighted the need for modern, reliable solutions. According to the results of DBTA’s annual voting, [Devart SecureBridge](https://www.devart.com/sbridge/download.html) is among the top three most functional and popular data security solutions. The SecureBridge developers are genuinely proud of their product and aim to make it even more flexible and powerful. This convenient tool combines client and server components for the SSH, SFTP, FTPS, HTTP/HTTPS, TLS/SSL, SignalR, and WebSocket protocols. It is easy to install and use, and it efficiently protects all TCP traffic between the client and the server. SecureBridge components can be used with data access components to prevent data interception or modification. CONCLUSION What makes Devart solutions so exceptional is that they make the everyday routine of DB development much easier. Such all-in-one tools allow you to develop, manage, analyze, design, and compare databases. Moreover, the intuitive interface and automation capabilities help minimize the time and effort you put into database development without affecting the quality of the final product. Devart highly appreciates our users’ commitment to our products. We strive to measure up to exacting standards and meet our users’ expectations. Receiving such awards encourages us to go even further and shoot for the stars.
[Events](https://blog.devart.com/category/events) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) Devart Products Win at Visual Studio Magazine 2019 Reader’s Choice Awards By [dbForge Team](https://blog.devart.com/author/dbforge) January 8, 2019 We are super excited to inform our users that our products have become bronze winners at the [Visual Studio Magazine 2019 Reader’s Choice Awards](https://visualstudiomagazine.com/articles/2018/12/17/vsm-2019-readers-choice-awards-are-out.aspx)! Every year, readers of [Visual Studio Magazine](https://visualstudiomagazine.com/) vote across 40 categories for the software products they love best. This year, two Devart products received recognition: [Code Compare](https://www.devart.com/codecompare/) won bronze in the Productivity Tools category, and [Entity Developer](https://www.devart.com/entitydeveloper/) won bronze in the Software Design, Frameworks & Modelling Tools category. We would like to take this chance to thank all our users who supported us with their votes. We will strive to keep justifying the confidence you have bestowed upon us and to make our products even better!
Devart Products Win Visual Studio Magazine Reader’s Choice Awards 2021
By [dbForge Team](https://blog.devart.com/author/dbforge), February 24, 2021

Here come this year’s first awards received by Devart products! Visual Studio Magazine [announced the winners of its 27th annual Reader’s Choice Awards](http://www.globenewswire.com/news-release/2021/01/27/2165171/0/en/Visual-Studio-Magazine-an-1105-Media-Inc-Product-Announces-2021-Reader-s-Choice-Award-Winners.html), and we’ve got two products among them. The first one is dbForge Fusion for SQL Server, which was awarded the Bronze badge in the Databases and Data Development and Modeling category. The second one, Entity Developer, scored the Bronze badge in the Software Design, Frameworks, and Modeling Tools category. Visual Studio Magazine specializes in news, analytics, and practical guidelines for the global Microsoft community working with Visual Studio. The competition involved more than 400 products in 41 categories; the top three entrants in each category received Gold, Silver, and Bronze badges. Our first winner, dbForge Fusion for SQL Server, is a Visual Studio plugin designed to simplify SQL database development and enhance data management capabilities. It delivers an easy way to explore and maintain databases, design compound SQL statements and queries, and manipulate data in various ways.
If you work closely with SQL Server, we gladly invite you to [get a free 30-day trial of dbForge Fusion](https://www.devart.com/dbforge/sql/fusion/download.html) and see its capabilities for yourself. Our second winner, [Entity Developer](https://www.devart.com/entitydeveloper/), is a powerful modeling and code generation tool for LinqConnect and ADO.NET Entity Framework. Feel free to check it out as well. Devart appreciates this recognition of our [database development tools](https://www.devart.com/dbforge/). Our teams will continue working hard to help more and more database specialists cope with their daily challenges. Stay tuned for further updates!

Devart is glad to release the new improved version of dbForge Studio for Oracle, v 3.1
By [dbForge Team](https://blog.devart.com/author/dbforge), December 7, 2011

We’ve analyzed the user feedback received after the release of the completely redesigned dbForge Studio and defined priorities for further development of the product. As a result, we’ve devoted the new release of [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) to improving the experience of users who work with PL/SQL code and data. In addition, we’ve improved application performance and added support for the Win-x64 platform.

dbForge Studio for Oracle v3.1 New Features

150 improvements in the code autocompletion system
We’ve made nearly 150 changes and improvements in the dbForge Studio for Oracle component that helps users edit PL/SQL code.
Among them:
- Oracle SQL and PL/SQL syntax support is expanded
- Quick info hints for schema objects work better
- Keyword case is changed automatically as you type (if the corresponding options are set)
- Usability shortcomings and errors are fixed

SQL document works with additional PL/SQL file types:
- Support for editing .pls, .plb, .pks, .pkb, and .pck Oracle PL/SQL files is added
- Support for editing files with PL/SQL code created in some competitor products is added

Data export to SQL statements:
- Data can now be exported to INSERT, UPDATE, DELETE, and MERGE statements
- Data can be exported from a table or data grid using a wizard
- Quick export from the data grid is available without opening the wizard

Convenient work with result sets of several SELECT queries
If a document contains more than one SELECT statement, the query results are displayed on separate tabs in the Data window after execution. This approach lets you work with each data set independently.

Editing data of object fields
The new product version lets you edit data in object fields of tables using a pop-up editor.

Support for editing temporary tables
You can now create and edit temporary tables in Table Editor.

Improved Document Outline window for code navigation:
- Grouping nodes by IF, FOR, and other flow-control statement blocks is added
- Labels are displayed, and you can navigate among them

More convenient work with the query execution plan:
- The query plan can be obtained quickly, without turning on Profiling mode
- EXPLAIN PLAN results are displayed without creating an additional table in the user’s schema

Performance improvements:
- Application startup and connection opening take less time
- Building the Database Explorer tree, refreshing it, and retrieving the table list are quicker
- Getting metadata for code completion is faster
- Schema Export handles a large number of objects and data better
- The Data Editor works faster
- The Object Viewer and Property Browser windows work faster
- Navigation from code to schema object editors is improved

Win-x64 Native Support
The application no longer requires 32-bit Oracle client software when running on 64-bit Windows; it now works with the 64-bit Oracle client.

Availability
You can give the updated dbForge Studio for Oracle a test drive by downloading the free Express edition or the 30-day trial of the Professional edition from the product [download page](https://www.devart.com/dbforge/oracle/studio/download.html). To leave feedback, visit the [dbForge Studio for Oracle feedback page](https://www.devart.com/dbforge/oracle/studio/feedback.html). The Devart team looks forward to receiving your comments and suggestions.
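At its core, the new data export to SQL statements feature renders table rows as plain DML. As a rough, hypothetical illustration of the idea in Python (this is not the Studio's actual output format; the table and column names are invented), generating INSERT statements from rows might look like this:

```python
def rows_to_inserts(table, columns, rows):
    """Render rows as simple INSERT statements (strings get quoted, NULLs kept)."""
    cols = ", ".join(columns)
    stmts = []
    for row in rows:
        values = ", ".join(
            "NULL" if v is None
            else str(v) if isinstance(v, (int, float))
            else "'" + str(v).replace("'", "''") + "'"  # escape single quotes
            for v in row
        )
        stmts.append(f"INSERT INTO {table} ({cols}) VALUES ({values});")
    return stmts

stmts = rows_to_inserts("EMPLOYEES", ["ID", "NAME"], [(1, "O'Brien"), (2, None)])
```

A real exporter also handles dates, LOBs, and batch sizing; the sketch only shows the shape of the output.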
Devart Rolls Out Python Connectors for Microsoft Access, Snowflake, and MongoDB
By [Sofiia Fomitska](https://blog.devart.com/author/sophie-rawlings), April 25, 2024

We’re thrilled to introduce our new offerings: Python connectors for Microsoft Access, Snowflake, and MongoDB. These products mark a significant leap forward in enhancing data connectivity and analysis within Python applications. At Devart, we strive to cater to diverse data needs, and with this release, we’ve expanded our support to cover Microsoft Access, Snowflake, and MongoDB. Our Python Connector for Microsoft Access is designed to facilitate communication between Python applications and Microsoft Access databases. Devart’s Python Connector for Snowflake is a robust integration tool, bridging Python applications with Snowflake’s cloud data platform. Meanwhile, the Python Connector for MongoDB is a game-changer for developers and data enthusiasts alike, as MongoDB is the most popular NoSQL document database on the market. A key advantage of this release is that the new Python Connector for Microsoft Access now runs not only on Windows but also on Linux and macOS! Additionally, access to Snowflake is streamlined through HTTPS, which eliminates the need for additional client software on the user’s workstation. With our new Python connector, you can enjoy simple authorization to Snowflake cloud services, quick access to your data workloads, and easy data management. When working with MongoDB, Python connectors typically connect to the database using the standard MongoDB libraries libmongoc and libbson, which serve as the bridge between Python applications and the MongoDB database.
Therefore, we recommend installing the MongoDB libraries on the workstation that runs your Python application. Here are the key features of Devart’s Python connectors:
- Cross-platform support for MS Access and MongoDB: the connectors run on all major desktop operating systems, including Windows, macOS, and Linux.
- Effortless connectivity: easily connect Python applications to Microsoft Access databases. You get direct access to a database, so there’s no need to have MS Access or the MS Access Database Engine Redistributable installed on your machine.
- Flexibility of data formats: take advantage of MongoDB’s versatile data manipulation capabilities to manage data either as MongoDB documents or in a traditional relational format.
- Multiple file formats: extensive support for Microsoft Access .mdb and .accdb file formats, including databases created in the latest Microsoft Access versions.
- Flexible querying: perform complex data queries and manipulations directly from Python, using the complete functionality of Microsoft Access, Snowflake, and MongoDB.
- Data visualization: visualize data from Microsoft Access, Snowflake, and MongoDB databases using Python libraries for insightful analysis and reporting.
- Enhanced productivity: streamline workflows and automate data processes with Python scripts, reducing manual effort.
- Better performance: enjoy fast and reliable data access and retrieval for more efficient data processing and analysis.

Are you prepared to elevate your data experience? Download the brand-new Python Connectors for Microsoft Access, Snowflake, and MongoDB to start your data exploration journey right away!
To access the installation files for the new Python drivers, simply click the corresponding download links: [Python Connector for Microsoft Access](https://www.devart.com/python/access/download.html), [Python Connector for Snowflake](https://www.devart.com/python/snowflake/download.html), and [Python Connector for MongoDB](https://www.devart.com/python/mongodb/download.html).
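Like most Python database drivers, connectors of this kind follow the familiar DB-API pattern of connect, cursor, execute, and fetch. The sketch below shows that general pattern using the standard library's sqlite3 module as a stand-in, since this post does not spell out the Devart connectors' module names or connection parameters; consult each connector's documentation for the actual import and connection string:

```python
import sqlite3  # stand-in driver; a real connector module would be imported instead

# Connect (a real connector would take host/database/credential parameters here)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create and populate a sample table, then query it back
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)",
                [(1, 19.99), (2, 5.00)])
conn.commit()

cur.execute("SELECT id, amount FROM orders ORDER BY id")
rows = cur.fetchall()
conn.close()
```

Because the pattern is the same across DB-API drivers, code written this way tends to port between data sources with little more than a changed import and connection call.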
Devart Sponsored SQLSaturday #363 – Nashville 2014
By [dbForge Team](https://blog.devart.com/author/dbforge), January 21, 2015

Devart was excited to participate in the [363rd edition of SQLSaturday](https://www.sqlsaturday.com/363/eventhome.aspx) as a sponsor. The event took place on January 17, 2015 at Lipscomb University, Nashville, USA. SQLSaturdays are free one-day training events for SQL Server professionals that focus on local speakers, provide a variety of high-quality technical sessions, and make it all happen through the efforts of volunteers. During the event, leading SQL Server experts and professionals covered a variety of topics related to how SQL Server works and what it takes to keep it up and running smoothly, including database administration, performance tuning, data visualization, database maintenance, security, and much, much more. We would like to thank the organizers, speakers, and all participants for a highly informative and productive event. Special thanks go to Ms. Tamera Clark for perfect event management and coordination.
Devart sponsored SQLSaturday 753 in Lviv, Ukraine
By [dbForge Team](https://blog.devart.com/author/dbforge), September 18, 2018

[SQLSaturday](http://www.sqlsaturday.com/753/EventHome.aspx) is a free international conference for SQL Server professionals and those wanting to learn about SQL Server, Business Intelligence, and Analytics. This year, the event was held at Hotel Taurus, 5, Kn. Sviatoslava Sq., Lviv. Traditionally, the event gathered many SQL experts and practitioners, among whom were [Michał Sadowski](https://www.facebook.com/profile.php?id=100014825707065), [Grzegorz Stolecki](https://www.facebook.com/grzegorz.stolecki), [Denis Reznik](https://www.facebook.com/denis.reznik.5), [Satya SK Jayanty](https://www.facebook.com/satyaskj), [Erland Sommarskog](https://www.facebook.com/photo.php?fbid=1929175050438430&set=pob.1424477494&type=3&theater#), [Marcos Freccia](https://www.facebook.com/marcos.freccia), [Eugene Polonichko](https://www.facebook.com/mydjeki), and [Torsten Strauss](https://www.facebook.com/torsten.strauss). The participants got a great opportunity to attend a variety of high-quality technical sessions, deepen their knowledge of SQL Server and related technologies, and chat with database professionals from around the world. The Devart team presented its software products, shared ideas and discussed future challenges with the professional community, and, of course, raffled off a special prize for the participants of SQLSaturday. We would like to thank all the speakers for their wonderful talks and for sharing their experience and knowledge.
Devart sponsored SQLSaturday 780 in Kharkiv, Ukraine
By [dbForge Team](https://blog.devart.com/author/dbforge), September 18, 2018

[SQLSaturday](http://www.sqlsaturday.com/780/eventhome.aspx) is a free training event with local and international speakers presenting on SQL Server related topics. This year, the event was held at Fabrica Space, Blagovischenska str. 1, Kharkiv. As always, the event gathered many SQL experts and practitioners to gain new knowledge about SQL Server and related technologies, communicate with DBAs and SQL Server gurus from around the world, and support the Ukrainian SQL Server community as a whole. Our team also presented new and updated Devart products and features, shared ideas and discussed future challenges with the professional community, and, of course, raffled off a special prize for the participants of the event. We would like to express our gratitude to everyone who took part in organizing the event: you did a great job and gathered people who spent their Saturday with colleagues and like-minded professionals.
Devart to Start New Product Line for PostgreSQL
By [dbForge Team](https://blog.devart.com/author/dbforge), December 25, 2012

Devart is glad to announce the release of [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/), a powerful and easy-to-use tool for table data comparison and synchronization. The tool lets you review all the differences in compared tables and execute an automatically generated script to remove those differences. A customizable comparison and synchronization process allows you to select tables and fields for comparison and adjust many comparison options. A user-friendly wizard guides you through the data comparison and synchronization process step by step, and Windows Task Scheduler support lets you automate database synchronization.

The key features of Data Compare for PostgreSQL include the following:
- Flexible customization of database object mapping. You can map schemas, tables, views, and columns either automatically or manually, and use custom mapping for schemas or database objects that cannot be mapped automatically.
- Comfortable view of data differences. Comparison results are displayed conveniently in the Data Comparison document, so processing them is easy and quick. Database objects can be filtered by their differences.
- Full control over data to synchronize. After comparison, you can easily exclude tables or individual records from synchronization.
- Easy-to-use Data Synchronization wizard. The wizard lets you customize synchronization settings and back up the target database before synchronization. The generated synchronization script can be applied to the target database immediately or saved to a file.
- Warnings of possible data loss and various notifications. Before synchronization, you will be warned if errors or data loss may occur because of type incompatibility; you will also be notified about data overflow, rounding, and so on.
- Large script execution. dbForge Data Compare for PostgreSQL can execute large scripts without opening them in the SQL editor and loading the whole script into memory. When you try to open a large script, you will be prompted to execute it with the Execute Script Wizard instead.
- Compare and sync data via the command line. You can compare and synchronize data from the command line, with no need to open the application and go through the wizard pages.
- Friendly GUI. Data Compare for PostgreSQL provides an intuitive user interface, so you will quickly learn how to use the product to your advantage.
- Enhanced work with data. You get new ways to analyze and process the received data: group, filter, and sort data in the grid; view data rows as neat cards; display data in paginal mode; and find the required data with auto-search.

Download [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/download.html) and see the benefits.
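Conceptually, table data comparison boils down to keying each row and classifying the key as present only in the source (insert), only in the target (delete), or in both with different values (update); the generated synchronization script then applies those changes. Here is a toy Python sketch of that classification with invented sample data; the product itself is, of course, far more capable:

```python
def diff_tables(source, target):
    """Classify keyed rows for synchronization.

    source/target map a primary-key value to a row tuple. Returns rows to
    insert (source only), update (both, but different), delete (target only).
    """
    inserts = {k: v for k, v in source.items() if k not in target}
    deletes = {k: v for k, v in target.items() if k not in source}
    updates = {k: source[k] for k in source.keys() & target.keys()
               if source[k] != target[k]}
    return inserts, updates, deletes

# Hypothetical rows keyed by id: (name, age)
src = {1: ("Alice", 30), 2: ("Bob", 25)}
tgt = {2: ("Bob", 24), 3: ("Carol", 41)}
ins, upd, dele = diff_tables(src, tgt)
```

After the classification, a synchronizer emits INSERT, UPDATE, and DELETE statements for the three buckets, which is exactly what the "automatically generated script" above contains.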
Tags [data compare](https://blog.devart.com/tag/data-compare) [PostgreSQL](https://blog.devart.com/tag/postgresql) [what's new postgresql tools](https://blog.devart.com/tag/whats-new-postgresql-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-stays-on-winning-paths-with-componentsource-awards-2021.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [Events](https://blog.devart.com/category/events) Devart Stays on Winning Paths with ComponentSource Awards 2021 By [dotConnect
Team](https://blog.devart.com/author/dotconnect) April 6, 2021 [0](https://blog.devart.com/devart-stays-on-winning-paths-with-componentsource-awards-2021.html#respond) 2333 Once again, ComponentSource has announced its Bestsellers of the year. This well-known professional resource researches the popularity of software products each year, with the analysis grounded in actual sales. Based on its customers’ choices in more than 180 countries, ComponentSource identifies the most popular products and software development companies. The Bestselling Publisher Award is presented to a company based on the total sales of all its products during the year; the Bestselling Product Award is presented to a software product based on its sales during the year. Devart is proud to be present in both Bestselling categories. As a software vendor, Devart received the [Top 25 Publisher Award](https://www.componentsource.com/help-support/bestselling-publisher-awards-2021). Among the many products that earned our company a place in the Top 25 Publishers, [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) got special recognition: this ADO.NET data provider entered the [Top 100 Bestselling products](https://www.componentsource.com/help-support/bestselling-product-awards-2020). ComponentSource is a platform that has been supplying developers with the best software products since 1995. With more than 10,000 products from 200+ publishers and 1,000,000+ registered users, it is one of the most visited software development resources in the world. For Devart, our customers’ recognition is the best praise. The popularity of our [database management tools](https://www.devart.com/dbforge/) comes from their exceptional functionality and user-friendliness. We are happy to offer an excellent user experience to all our customers, and we keep working to advance it even more.
Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. "} {"url": "https://blog.devart.com/devart-the-silver-sponsor-at-the-delphi-summit-2024.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [Events](https://blog.devart.com/category/events) Devart: The Silver Sponsor at the Delphi Summit 2024 By [DAC Team](https://blog.devart.com/author/dac) September 11, 2024 [0](https://blog.devart.com/devart-the-silver-sponsor-at-the-delphi-summit-2024.html#respond) 925 The [Delphi Summit 2024](https://delphisummit.com/) was a two-day event filled to the brim with the latest and greatest news and innovations—all related to Delphi, of course. It was conducted by GDK Software in partnership with Embarcadero and Barnsten and held on June 13–14 at the H20 Esports Campus, a large venue near Amsterdam in the Netherlands. Since we have an entire product line dedicated to Delphi—namely, a robust selection of [Data Access Components](https://www.devart.com/dac.html) that enable direct, high-performance connections to databases—we were honored to join in as a Silver Sponsor and take part in what turned out to be one of the most diverse and inspiring gatherings of Delphi enthusiasts. Our team spiced up the proceedings with an online prize draw; the participants competed for a free year-long license to [UniDAC](https://www.devart.com/unidac/), our high-end universal library of components supporting the broadest range of databases and cloud services, which you can switch between with a single click. We were glad to see so many people willing to get one—and we awarded every participant a free 3-month license. Although it was the first event of this kind for GDK Software, it was a definite success that left a lasting positive impression and the promise of the next Delphi Summit being even bigger and busier.
Get Delphi Data Access Components for a free 60-day trial today! Last but not least—if you are a developer of Delphi-based applications that require easy, direct access to data from different sources, we’d love to invite you to join our big, friendly community. You can start your journey with a free 60-day trial of any of our components, including the aforementioned all-in-one [UniDAC](https://www.devart.com/unidac/). You’ll be able to explore them, apply them in a real project, and evaluate their actual effectiveness. To make your test drive more comfortable and immersive, we suggest consulting our [Documentation Center](https://docs.devart.com/) (the Delphi Data Access Components section). And should you require any additional information, you can freely contact our support service. We’ll be glad to answer any questions you might have. Tags [dac](https://blog.devart.com/tag/dac) [delphi](https://blog.devart.com/tag/delphi) [delphi data access](https://blog.devart.com/tag/delphi-data-access) [delphi data access components](https://blog.devart.com/tag/delphi-data-access-components) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/devart-took-part-in-the-kharkiv-intentional-marathon.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) Devart Took Part in the Kharkiv International Marathon By [dbForge Team](https://blog.devart.com/author/dbforge) April 14, 2016 [0](https://blog.devart.com/devart-took-part-in-the-kharkiv-intentional-marathon.html#respond) 3037 The Devart team took part in the 3rd Annual Kharkiv International Marathon, which took place on April 9 in the city of Kharkiv, Ukraine. Gorgeous sunny weather, wonderful organization, and a cheerful atmosphere helped our runners get through both the 4.2K and 10K runs and finish with pretty good results. [Kharkiv International Marathon](http://kharkivmarathon.com/en/) is one of the biggest community sports events in Ukraine, gathering more than 10 thousand participants every year. The main objective of the event is to promote running in Ukraine. The marathon unites professional runners, amateurs, and beginners from across the country and beyond. We are proud of our sporty employees and will certainly participate in the marathon next year!
Tags [devart](https://blog.devart.com/tag/devart) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-unveils-free-data-compare-tool-for-oracle-to-bring-high-speed-and-adjustable-comparison.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Devart Unveils Free Data Compare Tool for Oracle to Bring High Speed and Adjustable
Comparison By [dbForge Team](https://blog.devart.com/author/dbforge) August 12, 2010 [0](https://blog.devart.com/devart-unveils-free-data-compare-tool-for-oracle-to-bring-high-speed-and-adjustable-comparison.html#respond) 2741 Devart today unveiled a new data comparison tool, [dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/). It delivers automatic data comparison for Oracle databases, provides a convenient GUI to manage the differences, and generates a SQL*Plus-compatible synchronization script to synchronize the data. Read more: dbForge Data Compare for Oracle news… Tags [data compare](https://blog.devart.com/tag/data-compare) [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-was-a-gold-sponsor-of-sqlsaturday-739-in-kyiv.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart was a Gold sponsor of SQLSaturday 739 in Kyiv By [dbForge Team](https://blog.devart.com/author/dbforge) May 22, 2018 [0](https://blog.devart.com/devart-was-a-gold-sponsor-of-sqlsaturday-739-in-kyiv.html#respond) 15975 [SQLSaturday](http://www.sqlsaturday.com/739/eventhome.aspx) is an annual conference devoted to SQL Server, Business Intelligence, and Analytics. Among the topics discussed this year were data storage, collection, optimization, visualization, scaling, data science, machine learning, big data, security, etc. This year, the annual SQLSaturday conference in Kyiv, Ukraine, was held on May 19. This is the sixth time the conference has taken place in Kyiv, and its community is growing every year, along with the scale of the conference: more and more participants come to the event, and new speakers arrive from different countries. We would like to express our gratitude to all of the speakers for their interesting talks, their support of the Ukrainian SQL Server community, and their desire to share experience and knowledge.
Special thanks to the team of organizers, who did a great job and managed to gather everyone who wanted to share valuable insights, make new acquaintances, and spend a Saturday with colleagues and like-minded people. This year SQLSaturday Kyiv gathered more than 600 participants; 25 professional speakers held sessions in 5 parallel tracks, and about 10 sponsoring companies supported the event and brought their products to the exhibition. We are looking forward to new meetings and talks at future SQLSaturdays in Kyiv. Tags [devart](https://blog.devart.com/tag/devart) [sql saturday](https://blog.devart.com/tag/sql-saturday) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-was-the-bronze-sponsor-of-sqlsaturday-416-odessa-2015.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Devart was the Bronze Sponsor of SQLSaturday #416 – Odessa 2015 By [dbForge Team](https://blog.devart.com/author/dbforge) July 27, 2015 [0](https://blog.devart.com/devart-was-the-bronze-sponsor-of-sqlsaturday-416-odessa-2015.html#respond) 2708 Devart was a Bronze sponsor of the [416th edition of SQLSaturday](https://www.sqlsaturday.com/416/eventhome.aspx), which was held on July 25, 2015 in Odessa, Ukraine. We would like to thank the organizers of the event – Anton Vidishchev, Oleg Chorny, and Alex Tumanoff – and all the speakers for a remarkable event! We highly appreciated the opportunity to share thoughts and ideas, discuss challenges, and present our software products to the professional community. Devart supports User Groups and provides sponsorship for various events; we are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. We look forward to participating in the upcoming SQLSaturday events!
Tags [devart](https://blog.devart.com/tag/devart) [sql saturday](https://blog.devart.com/tag/sql-saturday) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-was-the-gold-sponsor-of-annual-conference-of-polish-sql-server-user-group.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server
Tools](https://blog.devart.com/category/products/sql-server-tools) Devart was the GOLD sponsor of the Annual Conference of the Polish SQL Server User Group By [dbForge Team](https://blog.devart.com/author/dbforge) May 19, 2015 [0](https://blog.devart.com/devart-was-the-gold-sponsor-of-annual-conference-of-polish-sql-server-user-group.html#respond) 3282 Devart was the GOLD sponsor of [SQL Day | the Annual Conference of the Polish SQL Server User Group](https://sqlday.pl/), which was held on 11–13 May 2015 in Wrocław, Poland. We would like to thank the organizers and speakers for the wonderful conference! Our special thanks to [Maciej Pilecki](https://pl.linkedin.com/in/maciejpilecki) and Katarzyna Nieradka for their assistance and warm welcome. We had a great chance to meet new people, exchange views, and present our software products to a wide audience. Devart supports User Groups and provides sponsorship for various events; we are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. We hope to continue our conversation and collaboration, and we look forward to taking part in upcoming SQL Server events in Poland!
Tags [devart](https://blog.devart.com/tag/devart) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/devart-was-the-gold-sponsor-of-sql-saturday-616.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [SQL Server
Tools](https://blog.devart.com/category/products/sql-server-tools) Devart was the Gold Sponsor of SQL Saturday #616 By [dbForge Team](https://blog.devart.com/author/dbforge) May 31, 2017 [0](https://blog.devart.com/devart-was-the-gold-sponsor-of-sql-saturday-616.html#respond) 3493 [SQL Saturday #616](https://www.sqlsaturday.com/616/eventhome.aspx) was the biggest training event in Ukraine for SQL Server professionals and those wanting to learn about SQL Server. The event was held on May 20, 2017, in Kyiv, Ukraine. We would like to thank the organizers of the event – [Denis Reznik](https://www.facebook.com/denis.reznik.5), [Eugene Polonichko](https://www.facebook.com/mydjeki?hc_ref=NEWSFEED), [Eugen Niedaszkowski](https://www.facebook.com/airalchemist), [Alesya Zhuk](https://www.facebook.com/profile.php?id=100004335701881), [Nick Pobyivovk](https://www.facebook.com/nick.pobiyvovk), [Oksana Tkach](https://www.facebook.com/ok.girle), and [Oksana Borysenko](https://www.facebook.com/oksana.borysenko.7) – as well as all the speakers for a remarkable event! We highly appreciated the opportunity to share thoughts and ideas, discuss challenges, and present our software products to the professional community. During the event, we had a great chance to talk to Denis Reznik, who is: Data Architect at Intapp, Inc. Microsoft Data Platform MVP PASS Regional Mentor for Central and Eastern Europe Ukrainian Data Community Kyiv Co-Founder SQL community enthusiast How many SQL Server oriented events did you organize before SQLSaturday 616? Well, I need to count… The first event was held in Kharkiv in 2012. It was probably the first and last one organized by me alone; all the others were organized by a team. In 2013, my friend Kostya Khomyakov organized the SQLSaturday in Kyiv, where I was a co-organizer. Later in 2013, there was another SQLSaturday in Kharkiv. In 2014 and 2015 the SQLSaturday events were held in Kyiv.
There were three other events held by leaders of local user groups that year: in Odessa – by Anton Vedishchev and Alex Tumanov, in Lviv – by Sergey Lunyakin, and in Kharkiv – by Alexey Kovalev and Vladimir Leshchinskiy. Later, in 2016, the event was organized in Kyiv – the sixth one. Also in 2016, a SQLSaturday in Dnipro was organized by Alesya Zhuk. And this year, SQLSaturday 2017 is running right now :) and, as you see, this time 7 organizers and more than 10 volunteers are involved :) So I can say that I was involved as an organizer in 7 SQLSaturdays in Ukraine, but I can’t say that I organized them – we organized them :) What motivated you to organize the first SQLSaturday event held in Kharkiv? I often hear this question. It is so cool to organize such events, and I am passionate about it! Probably, “This is cool” was the major argument for me at the time, and it is still valid now. Which SQLSaturday event was the most difficult for you to organize: the first or the last one? Why? To tell the truth, each event takes more and more effort. Each time I say: next time I will get enough sleep, I will spend more time preparing slides, this time I will be very punctual in checking the checklists. But, you know, that never happens :) there is no time for each of these tasks. Each time you promise yourself that next year you will do everything right and create a detailed checklist, but it does not go as planned. I think it depends on the number of participants. Our conference grows every year, and that is the case here. I think this year was the most difficult one, and probably next year will be even more difficult, but that is OK. What part of the event do you prefer most of all? Why? I like the moment when you understand that everything is fine and goes as planned. You can talk to speakers and participants without any rush. You can simply relax and listen to interesting sessions. You want to do this, indeed.
However, this is a rare case, and if I have a chance to listen to at least one session at this SQLSaturday, that will be great. So, your favorite part is the beginning of the event, isn’t it? Usually, it comes after the lunch break. All the questions are solved, the evening is planned, and all the speakers are already here :) Everything goes as planned. What is the most important thing for you in events? Let’s clarify: are we talking about events where I take part or events I organized? The events you organized… I would like to see all the participants, speakers, and sponsors satisfied. Or, more exactly, I want everyone to get more than they expected. Only in that case is it a success. Do you think the community has changed compared to the first SQLSaturday event in Ukraine? Actually, there was no SQL community then. Now, we can see fundamental changes – this community was born. It has grown over the last five years, and as a result, we have one of the biggest and most active communities in the world. Would you like to share some experience (technical or life) gained while organizing SQLSaturday events over these 5 years? Our conference gets better and depends on the community that supports it. The conference, in turn, gathers many people who learn about the community. It is a kind of interdependence. What I wanted to share is that we need to work on developing both the conference and the community. People from the community join the conference organization team. They are the best, as they do not come for personal or selfish interests. They also believe that this is cool. We do not need to explain why we organize the conference and why they need it. They simply join the team and make a difference. Are you going to organize a SQLSaturday event in 2018? I can even say the exact date – May 19. What would you like to add or change in 2018? There are some ideas that we are discussing.
Also, I have an idea… though I do not know how the team will take it. For example, in Poland at SQLDay, they have a CEO of the conference, a chief organizer. Last year it was Marcin, this year it is Pawel, and next year, I have no idea. But I would like to try this option so that anyone on our team can be the CEO of the conference. Let's see. How do Ukrainian IT companies take part in events? What role do they play in developing the SQL Server community? Product companies such as Devart and DbBest, which are interested in supporting the SQL community, take part in our events year after year. We appreciate this support very much. Companies like Microsoft and Intapp have been sponsoring us for the last three years. This year, software development companies such as EPAM, Eleks, DataArt, and InfoPulse are taking part in the event as well. The conference grows, and companies are interested in participating. This year we even had the hosting provider Colocall and the craft beer shop KRAN among our sponsors, and this is awesome! We appreciate the sponsors who support us, because they are the reason the conference is free for attendees. When the conference was small, it was not so attractive. Now it is more beneficial for companies to collaborate with us. We are going to keep this partnership and help our partners achieve their goals. What database products do you use in your work? I use Profiler and Management Studio, as well as Visual Studio. Usually, I do not use third-party tools because I often work with a client in their environment, where it is not possible to set up additional tools. Thus, it is very important to find the issue quickly and solve it with the default tools. I use third-party products only to simplify the work I do locally or for some specific tasks. By the way, I am using your products, Data and Schema Compare, and this is not advertising: I think they are pretty good and very useful when I compare databases.
Name the top three must-have tools that every developer/administrator requires. You know, I'm not the right person for this question :) I do not use many third-party tools in my work. I think this question should rather be addressed to a DBA. For me, these three tools are SSMS, Profiler, and Visual Studio. This is enough for me, but it is obviously not enough for DBA work; a DBA may require a monitoring tool, a backup tool, server management tools, and others. There are a lot of good vendors on the market and a lot of good products to choose from. Another benefit for our SQLSaturday attendees is that they can come to the conference to talk to a vendor and decide whether they need a particular product. Before leaving, what would you like to say to the entire Ukrainian SQL Server community? See you next year at SQLSaturday Kyiv! 
"} {"url": "https://blog.devart.com/devart-was-the-gold-sponsor-of-sqlsaturday-406-kharkiv-2015.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart was the Gold Sponsor of SQLSaturday #406 – Kharkiv 2015 By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) September 21, 2015 Devart was the Gold sponsor of [SQLSaturday #406](https://www.sqlsaturday.com/406/EventHome.aspx), which was held on September 19, 2015 in Kharkiv, Ukraine.
We would like to thank the organizer of the event, Oleksii Kovalov, and all the speakers: [Denis Reznik](https://www.facebook.com/denis.reznik.5), [Vitalii Bondarenko](https://www.facebook.com/vitalii.bondarenko.37), [Olena Smoliak](https://www.facebook.com/olena.smoliak), [Konstantin Proskurdin](https://www.facebook.com/k.proskurdin), [Valentyn Yeliseyev](https://www.facebook.com/valentyn.yeliseyev), [Sergey Lunyakin](https://www.facebook.com/sergey.lunyakin), [Eugene Polonichko](https://www.facebook.com/mydjeki), [Oleg Alexeev](https://www.facebook.com/oleg.alexeev.1426), [Taras Bobrovytskyi](https://www.facebook.com/taras.bobrovytskyi), [Andrey Langovoy](https://www.facebook.com/langovoy.andrey), [Mihail Mateev](https://www.facebook.com/mihail.mateev), [Eugen Niedaszkowski](https://www.facebook.com/airalchemist), [Yan Roginevich](https://www.facebook.com/yan.roginevich), [Mariya Shirokopetleva](https://www.facebook.com/profile.php?id=100009005752940), Andrii Zrobok, and Kevin G. Boles for the remarkable event! We highly appreciated the opportunity to share thoughts and ideas, discuss challenges, and present our software products to the professional community. Devart supports User Groups and provides sponsorship for various events. We are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. We look forward to participating in the upcoming SQLSaturday events! 
"} {"url": "https://blog.devart.com/devart-was-the-gold-sponsor-of-sqlsaturday-538-sofia.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart was the Gold Sponsor of SQLSaturday #538 Sofia By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) October 18, 2016 Devart was the Gold sponsor of [SQLSaturday #538](https://www.sqlsaturday.com/538/eventhome.aspx), which was held on October 15, 2016 in Sofia, Bulgaria. To [Mihail Mateev](https://www.facebook.com/mihail.mateev), [Genoveva Andreeva](https://www.facebook.com/genoveva.andreeva), and all the other team members and volunteers, we would like to say THANK YOU very much for organizing this amazing conference. We had a great time communicating with the audience and demonstrating our [SQL Server tools](https://www.devart.com/dbforge/sql/). We also had a great opportunity to share thoughts and ideas and discuss challenges with community professionals and our friends from Poland, Slovenia, Bulgaria, and Ukraine! Devart supports User Groups and provides sponsorship for various events. We are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. We look forward to participating in the upcoming SQLSaturday events! 
"} {"url": "https://blog.devart.com/devart-was-the-silver-sponsor-of-sql-saturday-426-at-lviv.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart was the Silver Sponsor of SQL Saturday #426 at Lviv By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) December 9, 2015 Devart was a Silver sponsor of [SQLSaturday #426](https://sqlsaturday.com/426/EventHome.aspx), which was held on December 5, 2015 in Lviv, Ukraine. We would like to thank the organizers, Sergey Lunyakin and Andrey Zrobok, and all the speakers for the outstanding event! Our special thanks to [Eugene Polonichko](https://www.facebook.com/mydjeki) for the assistance :) We highly appreciate the opportunity to share thoughts and ideas, discuss challenges, and present our software products to the professional community. Devart supports User Groups and provides sponsorship for various events. We are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. 
"} {"url": "https://blog.devart.com/devart-was-the-sponsor-of-sqlsaturday-660-at-lviv.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart was the sponsor of SQLSaturday #660 at Lviv By [dbForge Team](https://blog.devart.com/author/dbforge) September 8, 2017 Devart was a sponsor of [SQLSaturday #660](https://www.sqlsaturday.com/660/eventhome.aspx), held on August 19, 2017, in Lviv, Ukraine. We were very glad to take part in the event, talk to visitors, make new acquaintances, and support our Ukrainian SQL Server community. We would like to thank the organizers of the event, Sergey Lunyakin, Andrey Zrobok, Oleg Deyneko, and Marina Lunyakina, and the volunteers for organizing SQLSaturday Lviv and for the opportunity to be part of the event. We express our tremendous gratitude to all the speakers who attended the conference and shared their knowledge and ideas with the community. 
At the conference, we had a great chance to talk to the chief organizer, Sergey Lunyakin, who is a Business Intelligence Professional at ITMAGINATION, a Microsoft MVP (Data Platform), the leader of the Lviv SQL Server User Group, and a regular speaker at SQL Server meetups and events. How do you take part in developing the SQL Server community in Ukraine? I supervise the Lviv SQL Server UG and help organize its events. Andrey Zrobok helps me hold meetups. I am also a speaker at SQLSaturdays in other Ukrainian cities and at local company communities, and I am the organizer of SQLSaturday Lviv. What SQLSaturday events did you organize and take part in? I organized SQLSaturday in Lviv in 2015 and this year. Of course, I would not have coped without the help of Andrey Zrobok, Oleg Deyneka, my wife, and the guys from our Lviv community. In 2015, we held the first SQLSaturday event in Lviv. Every year we have SQLSaturdays in different cities of Ukraine; however, each city has its own team of organizers, and there I am only a speaker at the conference. What motivated you to organize your first SQLSaturday in Lviv in 2015? The community and love for SQLSaturday. I like making new professional acquaintances and talking with interesting people. SQLSat helps people develop, get acquainted, and gain new knowledge. Which SQLSaturday was the most difficult for you to organize, this year's or the previous one? It was difficult to organize the first SQLSaturday event in 2015, of course. What really helped me was that, before organizing the event, I had gained a lot of experience as a participant and speaker at SQLSaturdays in other cities. I borrowed organizational ideas from different events I attended. Of course, there were mistakes at the first event, but we fixed them during the event, and the conference went well. In 2017, we took all the previous mistakes into account, and the event went more smoothly. There were small surprises, but in general, everything was fine. 
What is the most important part of SQLSaturday for you as a participant? The most important thing for me is networking with attendees and speakers. I gain a lot of new experience and knowledge at the events. What part of the event do you prefer the most (the introductory part, the raffle, etc.)? Why? My favorite part starts during the second half, when all the questions and problems are solved and you can relax and talk to participants, sponsors, and speakers of the event. Which SQLSaturday events that you attended stand out to you in particular? Why? Most of all I liked two events: Kiev in 2017 and Iceland. In Kiev, there were many people and a great mood; it was a huge event. In Iceland, I liked how the event was combined with the local culture; it was very interesting. Iceland is an amazing country. How do you think the SQL Server community in Ukraine has changed with the appearance of user groups and events? It all started with the first SQLSaturday in Kharkov in 2012. The community itself began to develop actively 4 years ago, in 2013. The number of events in Ukraine increased, and user groups appeared. In Lviv, for example, 86 people came to the first meeting, which was a surprise for us. In Kiev, over 100 people regularly attend the events. It would be great if we had an association, like in Poland; it is much more effective for organizing conferences and allocating resources. How do Ukrainian IT companies take part in events? What role do they play in developing the SQL Server community? It all depends on the company. In Ukraine, there are product companies, for example Devart, that help out and can provide their licenses for the raffle or for education at the UG meetings. In addition, there are service companies such as DBBest, Eleks, and Softserve that help financially or can provide premises for holding meetings. Each company contributes, and it makes us happy. What products do you use in your work? 
Name the top three must-have tools that every developer/administrator requires. I use Schema/Data Compare, Search, SQL Complete, and Data Generator. All these products save my working time and simplify the working process. Search helps to quickly find the necessary string of code in a huge database that contains many procedures and functions. Data Generator allows me to make demos and tests without using real data, since some information can be sensitive. If we talk about must-have tools, then they are probably Search, SQL Complete, and Schema/Data Compare. Are you going to organize SQLSaturday in 2018? What would you like to change or add? I do not know yet. I need some time to recover after this event :) Also, there may be changes towards the association, so we will see. As for changes and new ideas, almost everything has already been thought of somewhere. At different SQLSaturdays, you can find many interesting ideas: session schedules on badges, like in Poland, or collecting stamps from sponsors in order to take part in the raffle, like at PASS Summit. Before leaving, what would you like to say to the entire Ukrainian SQL Server community? Do not hesitate to communicate :) The main thing at SQLSaturday is communication. I am an example of how community and communication can help in professional activities. At SQLSaturday, you can get acquainted with famous people and gain invaluable experience. Do not split into groups; simply get acquainted and communicate :) We would like to thank all the organizers and speakers of the event! Looking forward to participating in the upcoming SQLSaturday events! 
"} {"url": "https://blog.devart.com/devart-was-the-sponsor-of-ukrainian-data-community-kyiv.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart was the Sponsor of Ukrainian Data Community Kyiv By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) December 16, 2015 We were more than happy to take part in the New Year meeting of the [Ukrainian Data Community in Kyiv](https://www.facebook.com/events/757814140990142/) on December 15, 2015. It was a great event, where we had a chance to talk to the community and present our products to a wide audience of professionals. Two outstanding speakers shared their knowledge with the visitors. Alexander Kalenik: Using Columnstore Indexes in MS SQL Server (2012, 2014, 2016 CTP3). Alexander Kalenik is a Microsoft CIS technology lead and Senior Premier Field Engineer for SQL Server and Windows Cluster. He has been working for Microsoft for almost 9 years and has extensive experience with SQL Server going back to 1991. He is the author of books about SQL Server and of two TechNet blogs covering SQL Server and Windows Cluster. Alexander holds a PhD in Computer Science. [Vitalii Bondarenko](https://www.facebook.com/vitalii.bondarenko.37): Dive into Hadoop (HDInsight): common Big Data analysis scenarios on Microsoft Azure. Vitalii is a DW/BI/ETL Architect and Technical Lead experienced in OLAP and OLTP systems design, performance tuning, and administration. His professional record includes about 15 years of experience in software application and database design and development, including about 8 years with MS SQL Server. We would like to thank the organizer of the event, [Denis Reznik](https://www.facebook.com/denis.reznik.5). We are looking forward to the next meetings! Devart supports User Groups and provides sponsorship for various events. We are looking forward to your sponsorship requests! E-mail us a brief summary of your event or User Group meetup. 
"} {"url": "https://blog.devart.com/devart-will-sponsor-sqlsaturday-857-in-kyiv-ukraine.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart sponsored SQLSaturday 857 in Kyiv, Ukraine By [dbForge Team](https://blog.devart.com/author/dbforge) May 17, 2019 
SQLSaturday is one of the largest free-to-attend IT events in Ukraine, and Devart was its Gold Sponsor this time. We had a lot of new products to show, gave interesting talks, and had many productive conversations. SQLSaturday is a perfect event for Microsoft Data Platform professionals and those wanting to learn about SQL Server, Business Intelligence, and Analytics. This year, the event was held on May 18, 2019 at Hotel President, 12 Hospitalna Street, Kyiv. Each year, the number of participants and speakers from all over the world increases, and this year was no exception. We saw many new faces and spent quality time with like-minded people. Thanks to everyone who attended the event and helped make it a good time for everyone involved! You can check out what was featured at SQLSaturday 857 [here](https://www.sqlsaturday.com/857/Sessions/Schedule.aspx). 
"} {"url": "https://blog.devart.com/devart-wins-silver-in-globee-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "Devart Wins Silver at the 19th Annual 2024 Globee Awards for Technology By [Victoria Shyrokova](https://blog.devart.com/author/victorias) July 16, 2024 [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) from Devart, a high-performance .NET connector that provides direct access to Oracle Database without the need for an Oracle Client, has won Silver at the 19th Annual 2024 Globee® Awards for Technology in the Oracle Cloud Application Services category. This prestigious business award recognizes companies that excel through hard work and transform technology domains, empowering businesses worldwide. Becoming a winner means that we have succeeded in creating an innovative connectivity solution and delivering significant business value to our clients. 
The Globee® Awards for Technology celebrate the achievements and positive contributions of organizations and individuals from all over the world, and it is an honor for the Devart team to join the circle of winners. We appreciate that our commitment to innovation is being noticed, and we are strongly motivated by the attention our connectivity solution receives, not only from its loyal users but also from the broader business technology community. Indeed, [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) offers a range of benefits to businesses. It provides database-specific features, advanced durability, fast data loading, and versatile integration options, letting development teams combine it with Microsoft's data-oriented technologies and development IDEs like Visual Studio. By using dotConnect for Oracle, development teams save time and expenses, as they can integrate with the Oracle database quickly while keeping the architecture scalable and secure and supporting both modern and tried-and-true technologies. The Devart team is always excited to see our products appreciated by the community. Winning such a high award in the Technology domain tells us that our work is important. It inspires us to extend the features and combine our unique vision with the essential needs of our product users to get the best results. Interested in trying out dotConnect for Oracle? Learn more about this unique solution and get a 30-day trial to explore its robust functionality. 
"} {"url": "https://blog.devart.com/developer-bundle-supports-new-secure-authentication.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What's New](https://blog.devart.com/category/whats-new) SQL Tools Support New Secure Authentication By [dbForge Team](https://blog.devart.com/author/dbforge) February 20, 2019 We are excited to inform our SQL Server users that we have just released an updated pack of SQL Server tools, [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/)! In this release, we have enhanced the tools with support for a new authentication method: Active Directory – Universal with MFA. This is an interactive method that supports Azure Multi-Factor Authentication (MFA). Azure MFA helps safeguard access to data and applications while meeting user demand for a simple sign-in process. It delivers strong authentication with a range of easy verification options (phone call, text message, smart card with a PIN, or mobile app notification), allowing users to choose the method they prefer. Interactive MFA with Azure AD may display a pop-up dialog box for validation.
The feature is available in the following tools: [Source Control](https://www.devart.com/dbforge/sql/source-control/) [Unit Test](https://www.devart.com/dbforge/sql/unit-test/) [Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) [Data Compare](https://www.devart.com/dbforge/sql/datacompare/) [Data Generator](https://www.devart.com/dbforge/sql/data-generator/) [Documenter](https://www.devart.com/dbforge/sql/documenter/) [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) [Index Manager](https://www.devart.com/dbforge/sql/index-manager/) [Query Builder](https://www.devart.com/dbforge/sql/querybuilder/) [Search](https://www.devart.com/dbforge/sql/search/) [Monitor](https://www.devart.com/dbforge/sql/monitor/) [Event Profiler](https://www.devart.com/dbforge/sql/event-profiler/) Note that to use this authentication method, you need .NET Framework 4.6 or higher installed. Dramatically Improved Comparison Speed Schema and data comparison has never been this fast! To make your work with dbForge Data Compare for SQL Server and dbForge Schema Compare for SQL Server even more productive and time-saving, we have dramatically improved the speed of comparing SQL Server data and schemas. We invite you to [try the updated version](https://www.devart.com/dbforge/sql/sql-tools/download.html) of dbForge SQL Tools free for 30 days.
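For orientation, the same authentication mode exists outside the dbForge connection dialog as a connection-string keyword in Microsoft's SqlClient/ODBC drivers. A hedged sketch of what such a connection string can look like (the server and database names are placeholders, not taken from the original post; in the dbForge tools themselves the method is typically picked from the authentication drop-down, so no string editing is required):

```
Server=myserver.database.windows.net;Database=SalesDb;Authentication=Active Directory Interactive;Encrypt=True;
```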
Tags [active directory](https://blog.devart.com/tag/active-directory) [developer bundle](https://blog.devart.com/tag/developer-bundle) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/developer-bundle-will-take-you-to-2019-right-now.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What's New](https://blog.devart.com/category/whats-new) Developer Bundle will take you to 2019 right now! By [dbForge Team](https://blog.devart.com/author/dbforge) December 22, 2018 Devart always strives to give its users the best database management capabilities. Today we are thrilled to inform SQL Server users that our [Developer Bundle](https://www.devart.com/dbforge/sql/developer-bundle/) now fully supports connections to SQL Server 2019! Moreover, the bundle includes all the tools you need for creating, managing, and processing large amounts of SQL data, among them: [Source Control](https://www.devart.com/dbforge/sql/source-control/) [Unit Test](https://www.devart.com/dbforge/sql/unit-test/) [Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) [Data Compare](https://www.devart.com/dbforge/sql/datacompare/) [Data Generator](https://www.devart.com/dbforge/sql/data-generator/) [Documenter](https://www.devart.com/dbforge/sql/documenter/) [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) [Index Manager](https://www.devart.com/dbforge/sql/index-manager/) [Query Builder](https://www.devart.com/dbforge/sql/querybuilder/) [Search](https://www.devart.com/dbforge/sql/search/) [Monitor](https://www.devart.com/dbforge/sql/monitor/) [Event Profiler](https://www.devart.com/dbforge/sql/event-profiler/) [SQL Decryptor](https://www.devart.com/dbforge/sql/sqldecryptor/) Note also that [buying the Developer Bundle](https://www.devart.com/dbforge/sql/developer-bundle/ordering.html) costs considerably less than purchasing all of the tools separately.
Tags [developer bundle](https://blog.devart.com/tag/developer-bundle) [sql server 2019](https://blog.devart.com/tag/sql-server-2019) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/developing-applications-with-salesforce-crm-integration-in-delphi.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) [How To](https://blog.devart.com/category/how-to) Developing applications with Salesforce CRM integration in Delphi By [DAC Team](https://blog.devart.com/author/dac) February 1, 2017 Cloud-based CRM systems are becoming incredibly popular these days, and the trend is easy to explain: using cloud services requires neither deployment nor costly in-house support of specialized software, which ultimately reduces a company's total expenses. Yet despite their flexibility and ease of use, it is often necessary to work directly with the data stored in the cloud, bypassing the original web interfaces or custom APIs, for example for analytical studies, statistical data processing, or producing certain reporting documents. In such cases a developer will find it more convenient to work with the data using familiar tools, and the popular development environment Embarcadero RAD Studio is no exception. The variety of APIs and access methods across cloud CRMs can significantly complicate this, as in most cases they are not universal. This is where ODBC technology helps, providing a standard interface for accessing any database, including cloud services. [UniDAC components](https://www.devart.com/unidac/) let you work with any ODBC driver in Delphi; for this, UniDAC includes a special component, TODBCUniProvider. Let's see how to work with data stored in one of the most popular CRM systems, Salesforce. First, you need to install and configure any ODBC driver that works with Salesforce, for example [Devart ODBC Driver for Salesforce](https://www.devart.com/odbc/salesforce/download.html). As an example, we will create a small application that implements a master-detail relationship between two standard Salesforce objects: Account and Opportunity.
Account will be the master dataset and Opportunity the detail dataset; the AccountId field links the two entities. The order of actions is as follows: 1. Download and install the ODBC Driver for Salesforce. 2. Configure the installed driver to work with your existing Salesforce account. You can find detailed instructions [here](https://www.devart.com/odbc/salesforce/docs/index.html?driver_configuration_and_conne.htm). Before using the data source name "Devart Salesforce Driver" created at this stage, you can verify its settings with the "Test connection" button. 3. Open the IDE and create a small project containing the following components: 4. To work with the ODBC driver, configure TUniConnection as follows:

```pascal
uses ODBCUniProvider;
...
UniConnection.ProviderName := 'ODBC';
UniConnection.Server := 'Devart Salesforce Driver';
```

Here the UniConnection.Server property holds the DSN created in Step 2. Note that you can also work with the required ODBC driver through UniConnection without explicitly specifying a DSN; to do this, fill in the appropriate parameters of the ConnectString property. For Devart ODBC Driver for Salesforce, the property looks like this:

```pascal
UniConnection.ConnectString := 'Provider Name=ODBC;Server="DRIVER={Devart ODBC Driver for Salesforce};Data Source=login.salesforce.com;User ID=;Password=;Security Token="';
```

5. Compose the necessary SQL queries:

```pascal
AccountQuery.SQL.Text := 'Select Id, Name, BillingStreet, BillingState, WebSite From Account';
OpportunityQuery.SQL.Text := 'Select Name, StageName, Amount, Type, Description From Opportunity Where AccountId = :Id';
```

Note that in both queries we selected only those fields of the Account and Opportunity objects that are needed for the demonstration, and the Where clause implements the master-detail link. 6.
Configure the master-detail relationship between the queries:

```pascal
AccountDataSource.DataSet := AccountQuery;
OpportunityDataSource.DataSet := OpportunityQuery;
OpportunityQuery.MasterSource := AccountDataSource;
```

7. Open the queries to obtain the result:

```pascal
AccountQuery.Open;
OpportunityQuery.Open;
```

8. The screenshot below shows the obtained result. As you can see, working with cloud CRM data in [Delphi using an ODBC](https://www.devart.com/odbc/salesforce/integrations/salesforce-delphi-connection.html) provider is as simple as working with traditional databases. In the same way you can work with data from any system for which the necessary driver exists. Devart offers a whole range of such [ODBC drivers](https://www.devart.com/odbc/), and the list is constantly growing, expanding end users' opportunities to work with a variety of data. Here is the source code: [UniDACODBCDemo.zip](https://blog.devart.com/wp-content/uploads/2017/02/UniDACODBCDemo.zip) Tags [delphi](https://blog.devart.com/tag/delphi) [odbc](https://blog.devart.com/tag/odbc) [rad studio](https://blog.devart.com/tag/rad-studio) [salesforce](https://blog.devart.com/tag/salesforce) [unidac](https://blog.devart.com/tag/unidac) [DAC Team](https://blog.devart.com/author/dac)
4 COMMENTS
Sergio (February 11, 2017, 7:45 pm): Great topic. It's useful. Thanks.
Rex Chan (June 3, 2017, 10:23 am): If possible, salesforce.CRM provider included in Unidac
DAC Team (June 6, 2017, 10:45 am): Hello, Rex Chan! We will consider a possibility of adding this functionality in UniDAC in the future.
Rex Chan (July 16, 2017, 5:36 pm): I know delphi Enterprise version plus include in future delphi version. I hope you can including Unidca or new product line due to i use delphi XE5 prof. I dream my next delphi project will Excel addon with salesforce.com integration.
"} {"url": "https://blog.devart.com/development-with-embarcadero-rad-studio-xe2.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) 64-bit Development with Delphi XE2 By [DAC Team](https://blog.devart.com/author/dac) September 28, 2011 Delphi XE2 Overview Delphi XE2 is a major breakthrough in the Delphi product line. It lets you deploy applications on both the Windows and Mac OS platforms, and you can now also create 64-bit Windows applications to take full advantage of modern hardware. Moreover, you can build visually spectacular applications with the FireMonkey GPU application platform. Its main features are the following: Windows 64-bit platform support; Mac OS support; the FireMonkey application development platform; live data bindings with visual components; VCL styles for Windows applications. For more information about Delphi XE2, please refer to the [Delphi XE2 Overview](http://edn.embarcadero.com/article/41593) article on the Embarcadero website. Changes in 64-bit Application Development 64-bit platform support implies several important changes that every developer must keep in mind before developing a new application or modernizing an old one. General The Delphi XE2 IDE is a 32-bit application, which means it cannot load 64-bit packages at design time; all design-time packages in Delphi XE2 are therefore 32-bit. So if you develop your own components with 64-bit platform support, you have to compile run-time packages for both the 32- and 64-bit platforms, while design-time packages need to be compiled only for the 32-bit platform.
This can be a source of difficulty if your package is both a run-time and a design-time package, as such a package is unlikely to compile for the 64-bit platform; in that case you will have to split it into two packages, one run-time only and the other design-time only. For the same reason, if your design-time packages require certain DLLs to be loaded, remember that design-time packages can only be 32-bit and therefore can load only 32-bit versions of those DLLs, while at run time the 64-bit versions will be loaded. Correspondingly, if only 64-bit versions of a DLL are present on your computer, you won't be able to use all functions at design time; and vice versa, if you have only the 32-bit versions, your applications compiled for the 64-bit platform won't work at run time. Extended type For this type, in 64-bit applications the compiler generates SSE2 instructions instead of FPU instructions, which greatly improves performance in applications that use this type a lot (where data accuracy is needed). To enable this, the size and precision of the Extended type were reduced:

| Type | 32-bit | 64-bit |
| --- | --- | --- |
| Extended | 10 bytes | 8 bytes |

Two additional types were introduced to ensure compatibility when developing both 32- and 64-bit applications: Extended80, whose size in a 32-bit application is 10 bytes, while in 64-bit applications it provides the same precision as the 8-byte equivalent; and Extended80Rec, which can be used to perform low-level operations on an extended-precision floating-point value (for example, the sign, the exponent, and the mantissa can be changed separately). Extended80Rec enables memory-related operations on 10-byte floating-point variables, but not extended-precision arithmetic operations.
Pointers and Integers The major difference between the 32- and 64-bit platforms is the amount of addressable memory and, correspondingly, the size of the pointer used to address it:

| Type | 32-bit | 64-bit |
| --- | --- | --- |
| Pointer | 4 bytes | 8 bytes |

At the same time, the size of the Integer type remains the same on both platforms:

| Type | 32-bit | 64-bit |
| --- | --- | --- |
| Integer | 4 bytes | 4 bytes |

That is why the following code works incorrectly on the 64-bit platform:

```pascal
Ptr := Pointer(Integer(Ptr) + Offset);
```

while this code works correctly on the 64-bit platform and incorrectly on the 32-bit platform:

```pascal
Ptr := Pointer(Int64(Ptr) + Offset);
```

For this purpose, the following platform-dependent integer types were introduced:

| Type | 32-bit | 64-bit |
| --- | --- | --- |
| NativeInt | 4 bytes | 8 bytes |
| NativeUInt | 4 bytes | 8 bytes |

These types ensure that pointer arithmetic works correctly on both the 32- and 64-bit platforms:

```pascal
Ptr := Pointer(NativeInt(Ptr) + Offset);
```

However, be extra careful when developing applications for several versions of Delphi: in previous versions the NativeInt type had different sizes:

| Delphi version | Size of NativeInt |
| --- | --- |
| D5 | N/A |
| D6 | N/A |
| D7 | 8 bytes |
| D2005 | 8 bytes |
| D2006 | 8 bytes |
| D2007 | 8 bytes |
| D2009 | 4 bytes |
| D2010 | 4 bytes |
| Delphi XE | 4 bytes |
| Delphi XE2 | 4 or 8 bytes |

Out parameters Some WinAPI functions have OUT parameters of the SIZE_T type, which is equivalent to NativeInt in Delphi XE2. The problem is that in a 32-bit application you cannot pass an Integer to such an OUT parameter, and in a 64-bit application you cannot pass an Int64; in both cases you have to pass a NativeInt.
For example:

```pascal
procedure MyProc(out Value: NativeInt);
begin
  Value := 12345;
end;

var
  Value1: NativeInt;
{$IFDEF WIN32}
  Value2: Integer;
{$ENDIF}
{$IFDEF WIN64}
  Value2: Int64;
{$ENDIF}
begin
  MyProc(Value1); // will be compiled
  MyProc(Value2); // will not be compiled !!!
end;
```

If you pass pointers to SendMessage/PostMessage/TControl.Perform, the wParam and lParam parameters should be cast to the WPARAM/LPARAM types and not to Integer/Longint. Correct:

```pascal
SendMessage(hWnd, WM_SETTEXT, 0, LPARAM(@MyCharArray));
```

Wrong:

```pascal
SendMessage(hWnd, WM_SETTEXT, 0, Integer(@MyCharArray));
```

Replace SetWindowLong/GetWindowLong with SetWindowLongPtr/GetWindowLongPtr for GWLP_HINSTANCE, GWLP_ID, GWLP_USERDATA, GWLP_HWNDPARENT and GWLP_WNDPROC, as they return pointers and handles. Pointers passed to SetWindowLongPtr should be cast to LONG_PTR and not to Integer/Longint. Correct:

```pascal
SetWindowLongPtr(hWnd, GWLP_WNDPROC, LONG_PTR(@MyWindowProc));
```

Wrong:

```pascal
SetWindowLong(hWnd, GWL_WNDPROC, Longint(@MyWindowProc));
```

Pointers assigned to the TMessage.Result field should be cast to LRESULT instead of Integer/Longint. Correct:

```pascal
Message.Result := LRESULT(Self);
```

Wrong:

```pascal
Message.Result := Integer(Self);
```

Assembler To make an application that uses assembly code work, you will have to make several changes: rewrite code that mixes Pascal and assembly, since mixing them is not supported in 64-bit applications; and rewrite assembly code that does not account for architecture and processor specifics. You can use conditional defines to make your application work with different architectures. You can learn more about assembly code [here](http://docwiki.embarcadero.com/RADStudio/en/Using_Inline_Assembly_Code), and the [following article](http://docwiki.embarcadero.com/RADStudio/en/Converting_32-bit_Delphi_Applications_to_64-bit_Windows) will help you make your application support the 64-bit platform.
Exception handling The biggest difference in exception handling between 32- and 64-bit Delphi is that Delphi XE2 64-bit gains performance from a different internal exception mechanism. For 32-bit applications, the Delphi compiler (dcc32.exe) generates additional code that is executed in any case and causes a performance loss. The 64-bit compiler (dcc64.exe) does not generate such code; instead it generates metadata and stores it in the PDATA section of the executable file. However, in Delphi XE2 64-bit it is impossible to have more than 16 levels of nested exceptions; exceeding that limit causes a run-time error. Debugging Debugging of 64-bit applications in Delphi XE2 is remote, for the same reason: the Delphi XE2 IDE is a 32-bit application, but your application is 64-bit. If you are trying to debug your application and cannot, check that the Include remote debug symbols project option is enabled. To enable it, perform the following steps: Open Project Options (in the main menu, Project -> Options). In the Target combobox, select Debug configuration – 64-bit Windows platform. If there is no such option in the combobox, right-click Target Platforms in Project Manager and select Add platform; after adding the 64-bit Windows platform, the Debug configuration – 64-bit Windows platform option will be available in the Target combobox. Select Linking in the left part of the Project Options form. Enable the Include remote debug symbols option. After that, you can run and debug your 64-bit application. To enable remote debugging, perform the following steps: Install Platform Assistant Server (PAServer) on the remote computer. You can find PAServer in the %RAD_Studio_XE2_Install_Directory%\PAServer directory; the setup_paserver.exe file is the installation file for Windows, and setup_paserver.zip is the installation file for Mac OS.
Run the PAServer.exe file on the remote computer and set the password that will be used to connect to it. On the local computer with Delphi XE2 installed, right-click the target platform that you want to debug in Project Manager and select Assign Remote Profile. Click the Add button in the displayed window and enter your profile name. Click Next, then enter the name of the remote computer and its password (the one you assigned when starting PAServer on the remote computer). You can then test the connection by clicking the Test Connection button. If the connection fails, check that the firewalls on both the remote and local computers are not blocking it, and try to establish the connection again. If it succeeds, click Next and then Finish. Select your newly created profile and click OK. After performing these steps you will be able to debug your application on the remote computer: the application will be executed remotely, but you will debug it on your local computer with Delphi XE2. For more information about working with Platform Assistant Server, please refer to [this article](http://docwiki.embarcadero.com/RADStudio/en/Installing_and_Running_the_Platform_Assistant_on_the_Target_Platform). Database-Specific Aspects of 64-bit Development For each database, the specifics of 64-bit development are mainly determined by the particular database client libraries used. When our connectivity solutions are used in Direct mode (without involving database client software), development depends exclusively on the peculiarities of the 64-bit platform, regardless of the database. For example, our PostgreSQL connectivity solutions ([PgDAC](https://www.devart.com/pgdac/), [UniDAC](https://www.devart.com/unidac/) and [dbExpress Driver for PostgreSQL](https://www.devart.com/dbx/postgresql/)) work with the PostgreSQL database directly.
Our connectivity solutions for Oracle and MySQL can be used in Direct mode as well as with the corresponding database client. For our connectivity solutions that use DBMS client libraries, the following requirements must be met. Since Delphi XE2 is a 32-bit application, it can load only 32-bit libraries; so to connect to the database at design time you need the 32-bit client library, while at run time you need the 64-bit client library. For SQL Server, InterBase and Firebird, MySQL (in client mode), and SQLite, place the 32-bit client library in the C:\Windows\SysWOW64 directory (if you need to connect to the database at design time) and the 64-bit client library, used for run-time connections, in the C:\Windows\System32 directory. Note that the SQLite developers do not provide a ready-made driver for x64 platforms, so for x64 applications you need to compile the SQLite library yourself (for example, in MS Visual Studio). Our connectivity solutions for Oracle ([ODAC](https://www.devart.com/odac/), UniDAC, [dbExpress driver for Oracle](https://www.devart.com/dbx/oracle/)) use the DEFAULT Oracle Client as standard, so depending on the bitness of the DEFAULT Oracle Client you need to explicitly specify either the 64-bit client for run time (if the DEFAULT Oracle Client is 32-bit), or the 32-bit client for design time (if the DEFAULT Oracle Client is 64-bit), or you may explicitly specify both clients, one for run time and one for design time. When developing a cross-platform application that uses UniDAC to work with an MS Access database, remember that it is impossible to install both the 32- and 64-bit drivers on the same system (a Microsoft limitation). Therefore, if you need to connect to the database at design time, the 32-bit driver must be installed on the development computer, since Delphi XE2 uses 32-bit libraries at design time; if no such connection is needed, you can install the x64 MS Access driver.
All other aspects of x64 and x32 development are identical. As for using UniDAC to connect to other database servers through ODBC, contact the respective driver developers for information on drivers for different platforms and their specifics. Tags [c++builder](https://blog.devart.com/tag/cbuilder) [delphi](https://blog.devart.com/tag/delphi) [rad studio](https://blog.devart.com/tag/rad-studio) [DAC Team](https://blog.devart.com/author/dac)
4 COMMENTS
[Carroll Yoney](http://www.acupuncture.freeclon.com) (October 2, 2011, 11:14 am): I really appreciate this your good article from there i get something that i want to know thanks for this usefull informations
runsheng (May 14, 2013, 2:39 pm): This article is very good and usefull,thank you!
Alfredo (June 20, 2013, 6:41 am): Hey I am so excited I found your weblog, I really found you by error, while I was browsing on Digg for something else, Regardless I am here now and would just like to say many thanks for a fantastic post and a all round thrilling blog (I also love the theme/design), I don't have time to look over it all at the minute but I have bookmarked it and also added your RSS feeds, so when I have time I will be back to read more, Please do keep up the excellent job.
AndreyP (October 24, 2013, 10:35 am): Thank you for your kind words, Alfredo. We highly appreciate your interest.
"} {"url": "https://blog.devart.com/devops-automation-release.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What's New](https://blog.devart.com/category/whats-new) Welcome to the Future of SQL Database Development! By [dbForge Team](https://blog.devart.com/author/dbforge) May 17, 2019 We are thrilled to announce the most notable addition to our dbForge for SQL Server product line in years: [dbForge DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/). The solution brings a cutting-edge approach to conventional database development. Powered by the tools from dbForge Developer Bundle and embracing DevOps practices such as Continuous Integration and Continuous Delivery, dbForge DevOps Automation for SQL Server minimizes deployment risks, improves quality and update frequency, and makes the overall workflow consistent and safe.
PowerShell Module dbForge DevOps Automation PowerShell for SQL Server allows tuning and automating the following SQL Server database development tasks via the PowerShell interface:

- Build – deploys a database on LocalDB or on a specified SQL Server and generates a NuGet package from a source control repository.
- Test – runs tSQLt tests and generates test data.
- Sync – deploys a NuGet package and synchronizes it with a working DB.
- Publish – places a NuGet package to a NuGet feed for further deployment.

Jenkins Integration dbForge DevOps Automation for SQL Server also allows setting up DevOps processes in Jenkins with the help of the open-source dbForge DevOps Automation for SQL Server Plugin. And it's just the beginning! We plan to extend the range of supported automation servers. Stay tuned! Availability dbForge DevOps Automation for SQL Server is a free product that is supplied exclusively as part of [dbForge Developer Bundle for SQL Server](https://www.devart.com/dbforge/sql/developer-bundle/download.html) .
"} {"url": "https://blog.devart.com/devops-mindset.html", "product_name": "Unknown", "content_type": "Blog", "content": "The DevOps Mindset: 3 Ways to Motivate DBAs to Use DevOps By [dbForge Team](https://blog.devart.com/author/dbforge) March 21, 2022 Nowadays, DevOps is a key technology in the world of software development.
With the advent of microservices and cloud technologies, the software delivery process has changed significantly. IT companies actively involve DevOps Engineers in creating and maintaining a competitive product on the market. While DevOps Engineers are busy automating processes, the Database Administrators' (DBAs') role in the development process is becoming more meaningful. The natural endpoint of this trend is an alliance of DBAs' skills and experience with the DevOps practices. In this article, we are going to review three ways to encourage DBAs to use the DevOps methodology. Three ways to motivate DBAs to use the DevOps approach in their work There are three ways to help DBAs see the advantages of the DevOps principles for their routine work: Show DBAs their relevance and indispensability in the DevOps process. Combine DBAs' expertise with DevOps Engineers' mindset. Provide DBAs with the right tools. Let's discuss each way in turn. Show DBAs their relevance and indispensability in the DevOps process It is no secret that the DevOps approach is mostly about application architecture. That is why many DBAs think that all these DevOps things are not for them: they cannot see the benefits of using microservices or CI/CD pipelines in database management and administration. But in fact, DBAs are among the key people in a DevOps team. A DBA must be a major member of the development and operations teams, involved in each stage of an application release process. This way of working is successful because a well-integrated database can have a positive impact on organizational efficiency. A solution cannot claim to cover all development aspects in the DevOps pipeline, or achieve business continuity, while leaving the database out. Combine DBAs' expertise with DevOps Engineers' mindset Data is key in the development process, and it is the connecting link in the collaboration between DBAs and DevOps Engineers.
This collaboration is extremely favorable because DBAs can share their expertise in, for example, data modeling to provide efficient data structures. Their experience in improving database performance also matters. In turn, DevOps Engineers can better explain why access to production data must be protected to avoid failures and other issues. If there is a need to use data for competitive benefit, these two teams must work closely. Provide DBAs with the right tools Unfortunately, DBAs do not have the same variety of tools as DevOps Engineers. To encourage DBAs to embrace DevOps, they should have the right, convenient tool: one that greatly simplifies DBAs' work and offers the functionality inherent in the DevOps practices. Devart offers a product that bridges this gap between DBAs and DevOps: DevOps Automation for SQL Server, which is included both in the SQL Tools pack and in dbForge Studio for SQL Server. [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) is a great solution for a smooth switch to the DevOps approach. In practice, the tool is used along with a stack of other tools available in the SQL Tools pack. It allows implementing database DevOps automation at four common stages: Development, Continuous Integration, Continuous Delivery, and Operation. DevOps Automation for SQL Server can be integrated with PowerShell and Jenkins. The tool provides an automated approach to database development and deployment, including testing and detecting and fixing failures at the early stages. [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/database-devops.html) is a powerful product that includes all the database tools required to automate the database delivery process. Just like DevOps Automation for SQL Server, it can be used at the same four stages.
dbForge Studio for SQL Server is rich in advantages; in fact, it is an all-in-one solution that is bound to speed up the database development and deployment processes. Using this tool, you can be confident that the database release process is stable, reliable, and fast. Conclusion DevOps is not just a whim or a passing trend. It is a philosophy that has already proved its efficiency. It is difficult to imagine how far this industry will develop in a few years, but what we can see today is already impressive. Nevertheless, some teams still hesitate to put the DevOps methodology into practice. We have shared our thoughts on why this happens and how to resolve it. In short, to prompt DBAs to use the DevOps principles, you just need to unite them with DevOps Engineers and provide them with the right tools. As for the latter, you can download 30-day trial versions of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/download.html) and start discovering the DevOps benefits yourself.
"} {"url": "https://blog.devart.com/difference-between-ide-and-code-editor-explained.html", "product_name": "Unknown", "content_type": "Blog", "content": "The Difference Between an IDE and a Code Editor By [dbForge Team](https://blog.devart.com/author/dbforge) April 18, 2022 How would you like to get your daily coding done? Fast? Obviously. In a manner that is neither repetitive nor monotonous? Naturally. With all the tools you might need always at hand? Yes, definitely. Whether you get all of this depends on where you choose to write your code. Your two primary options are a code editor and an IDE. Both are designed to make your coding easier, though in different ways. Although this applies to nearly all programming languages, today we will focus on SQL as the essential database-related language. And if you are still in search of an ideal solution for your SQL coding, here is a guide that we hope will be helpful. You will learn the main difference between an IDE and a code editor, get acquainted with their specifics, advantages, and drawbacks, and finally, you will be able to see what suits you more and make the optimal choice. CONTENTS What is an IDE? What is a code editor? What is the primary difference between an IDE and a code editor? The advantages of IDEs The drawbacks of IDEs IDE vs. code editor comparison What is the best database IDE on the market? What is an IDE? An IDE (Integrated Development Environment) is typically the most advanced programming solution, one that comprises multiple tools in a single application and thus streamlines all operations of a software or database developer. In other words, when you get an IDE, you get an entire set of tools for coding, testing, debugging, compiling, and much more, all of them properly organized and easily accessible.
The most obvious examples here are Microsoft's Visual Studio and SQL Server Management Studio, the latter being the default free solution for SQL Server databases. If we delve a little deeper into database-specific IDEs, we can mention a few other essentials treasured by SQL developers and DBAs. For instance, our own product, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , delivers such advanced features as comparison and synchronization of table data, generation of dummy data for testing, and visual query building (which effectively eliminates the need to write code when creating queries). You can see the latter in the screenshot below. Visual query building in dbForge Studio, an IDE for SQL Server databases What is a code editor? A code editor is seemingly just one of the features usually included in IDEs: an application that helps you write code more efficiently. There are specialized editors for different languages with varying auxiliary features. Some people think that a code editor is the same as a text editor, but a text editor is used only for writing and editing text, without built-in tools for coding. Let's take another Microsoft product as an example: Visual Studio Code. It delivers IntelliSense code completion, a slew of code editing features (including multi-cursor editing, linting, and parameter hints), code navigation, debugging and refactoring tools, and even built-in source control. Query document in Visual Studio Code By default, a code editor is a great choice for beginners who are learning to code and want a simple and effective tool at hand. What is the primary difference between an IDE and a code editor? Now, if we recap all that has been said above, we can easily pinpoint the primary difference. Code editors are basically enhanced text editors that streamline and accelerate routine coding.
An IDE delivers much more than that, being an all-encompassing software solution for multiple tasks related to database development and administration. The advantages of IDEs IDEs deliver quite a few substantial advantages:

- You get multiple tools in a single application; this saves your time and effort
- Your daily routine is streamlined; as a result, you become more productive and get a sharper focus on your tasks
- You have full control over databases; in case of need, you can easily handle multiple tasks on your own
- IDEs facilitate collaboration and alignment with corporate standards
- The functionality of an IDE can be further extended with external plugins
- Finally, vendors of commercial IDEs deliver added value to their customers; for instance, timely support and up-to-date documentation are an absolute must

The drawbacks of IDEs There is not really much to tell when it comes to drawbacks. Since IDEs are more complex, they generally take more time to master. On the other hand, you do not have to learn everything at once; the best modern IDEs deliver clean GUIs that allow fast access to the main features. Now the question is whether your tasks are limited to writing code or you need to deal with a bigger number of versatile database-related operations. In the former case, it is reasonable to opt for an IntelliSense-enhanced editor. In the latter case, write down a list of your operations and requirements and search for a solution that addresses them most precisely. IDE vs. code editor comparison To sum up everything said above about the two tools, we suggest reviewing this comparison table:

| IDE | Code editor |
| --- | --- |
| A development environment with advanced functionality for fast coding, testing, debugging, etc. It has the built-in tools required to work with code quickly and easily. | Usually just one of the built-in tools in an IDE, aimed at speeding up the code editing process. |
| If you have limited time for your tasks, an IDE can become your savior because it already includes all the tools needed to develop your code, so you do not need to search for extra tools. You have everything in a single application. | If you have just started to code and want a convenient and intuitive tool, give preference to a code editor. |
| If you go beyond the capabilities of an IDE, you can add the required external plugins at any time and improve your working tool until it meets your needs. | If you want something more than just writing and editing code, you have to find a somewhat more complex application. |
| If you are looking for an optimal IDE, make a choice based on your own requirements, but ensure that it has a debugger, code and error highlighting, autocompletion, and project support. | When you hesitate about which code editor to choose, pay attention to its features. A good code editor must at least highlight and complete syntax elements, display line numbers, etc. |

What is the best database IDE on the market? If you are leaning towards a professional IDE as the main solution for your endeavors in SQL Server database development and administration, we can suggest the one mentioned above: dbForge Studio for SQL Server. Its features include, but are not limited to, the following:

- IntelliSense-like SQL code completion
- Easy formatting with custom profiles
- Smart code refactoring with automatic correction of references to renamed objects
- Debugging of stored procedures, triggers, and functions
- Comparison and synchronization of table data and entire database schemas
- Generation of meaningful test data
- Visual database design
- Visual query building
- Data analysis and reporting
- Database administration

And don't forget the intuitive interface that will take virtually no time to get used to.
Smart code completion delivered by dbForge Studio; the interface is designed to make your work effective from day one Don't hesitate to check it in action! [Get your free 30-day trial of dbForge Studio today](https://www.devart.com/dbforge/sql/studio/download.html) and compare the capabilities of dbForge Studio with your actual needs. Who knows, maybe it is going to be the perfect match."} {"url": "https://blog.devart.com/difference-between-schema-database.html", "product_name": "Unknown", "content_type": "Blog", "content": "What is the Difference Between Schema and Database? By [dbForge Team](https://blog.devart.com/author/dbforge) May 6, 2022 Today, we are going to talk about whether a schema and a database are the same thing. To be more precise, we will describe the main differences between these two terms in general. After that, we will take a closer look at the differences between databases and schemas in the following relational database management systems: [SQL Server](https://www.devart.com/dbforge/sql/) , [MySQL](https://www.devart.com/dbforge/mysql/studio/) , [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) , and [Oracle](https://www.devart.com/dbforge/oracle/studio/) . Before getting into the differences themselves, let us define both terms. What is a database? First things first: a database (often referred to as a DB) is an organized structure designed to store, modify, and process related information, mostly in large volumes. Databases are actively used for dynamic sites with significant amounts of data. Often, these are online stores, portals, and corporate websites.
Such websites are usually developed using a server-side programming language (for example, PHP) or based on a CMS (for example, WordPress), and do not have ready-made data pages by analogy with HTML sites. Pages of dynamic sites are formed on the fly as a result of the interaction of scripts and databases after a corresponding request from the client to the web server. What is a schema? In contrast, the term database schema can mean either a visual data representation, a set of rules the data is subject to, or a complete set of objects owned by a particular user. An easy way to imagine a schema is to think of it as a field that contains tables, stored procedures, views, and related data resources; the schema defines the infrastructure of this field. What is the main difference between database and schema? Taking the above definitions into account, let us derive the key differences between the two. A database is any collection of data, usually organized so that the information is easily accessible. A schema is basically a formal description of how a database is formed and where everything is located; it works as a blueprint that shows where everything is in the database and how it is structured. Database vs. Schema Comparison Table For better understanding, here is a comparison table with the main differences between a database and a schema:

| Database | Schema |
| --- | --- |
| A collection of data and content | A description of the structure of that data |
| A physical container of tables, records, logs, and constraints | A blueprint of how the objects are organized |
| Manipulated with DML commands | Defined and modified with DDL commands |

It is important to differentiate between the logical concept of a schema, which exists regardless of a specific DBMS, and a schema as a physical object. The logical concept is synonymous with the structure or model of a database. Now, let us look a little deeper into the peculiarities of schemas as physical objects in different RDBMSs. To better understand a database schema and the relationships between its objects, use a [database diagram design tool for SQL Server](https://www.devart.com/dbforge/sql/studio/database-diagram.html) . Database vs. Schema in PostgreSQL In PostgreSQL, a database can be defined as a container with all the schemas, records, logs, and table constraints. Databases are strictly separated, which means that a user cannot access two databases at once. Use DML (Data Manipulation Language) commands to manipulate data in a PostgreSQL database. A schema in PostgreSQL determines how the data is logically structured and stored in a database. It contains all the indexes, tables, functions, data types, stored procedures, and anything else that might be related. Database administrators can set different levels of access for different users in order to avoid conflicts and unnecessary interference. Database vs. Schema in SQL Server In the context of SQL, use Data Definition Language (DDL) for creating and modifying database objects. In case you are looking for a better [understanding of a SQL schema](https://blog.devart.com/understanding-a-sql-schema.html) , please refer to another blog article of ours. Both in SQL Server and PostgreSQL, a schema is a physical database object that stores other database objects. Database vs. Schema in Oracle In [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/) , a database consists of physical files that contain data and metadata: the data files, control files, and redo log files. Unlike SQL Server and PostgreSQL, there is no separate schema object. However, the set of objects a user owns, such as tables and views, is treated as that user's schema. At the same time, working with a database in Oracle is almost identical to working with SQL Server: users connect to an Oracle server and work with schemas just like they do with databases in SQL Server. A database instance in Oracle comprises memory that is shared and accessed by all the threads and background processes. This includes the SGA, the PGA, and background processes such as RECO, SMON, DBW0, PMON, CKPT, ARC0, etc. Database vs. Schema in MySQL In MySQL, there is no such object as a schema; the word is sometimes used as a synonym for a database. In MySQL syntax, you can substitute the SCHEMA keyword with the DATABASE keyword, so CREATE SCHEMA gives you the same result as CREATE DATABASE . Conclusion To sum up, database and schema are different things in all RDBMSs except MySQL. A database is more about data and content, while a schema is more about the structure of that data. If you are working with databases on a daily basis, Devart offers [a range of products](https://www.devart.com/dbforge/edge/) that can help you improve database development, management, and administration. Moreover, we can boast powerful, fast, and easy-to-use schema comparison tools for [SQL Server](https://www.devart.com/dbforge/sql/studio/sql-server-schema-compare.html) , [MySQL](https://www.devart.com/dbforge/mysql/schemacompare/) , [PostgreSQL](https://www.devart.com/dbforge/postgresql/schemacompare/) , and [Oracle](https://www.devart.com/dbforge/oracle/schemacompare/) .
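As a parting illustration of the MySQL behavior discussed above, here is a minimal sketch of the keyword substitution (the name `shop` is just a placeholder, not taken from the article):

```sql
-- In MySQL, SCHEMA is a synonym for DATABASE:
CREATE SCHEMA shop;   -- equivalent to: CREATE DATABASE shop;
DROP SCHEMA shop;     -- equivalent to: DROP DATABASE shop;
```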
Tags [data visualization](https://blog.devart.com/tag/data-visualization) [database instance](https://blog.devart.com/tag/database-instance) [database schema](https://blog.devart.com/tag/database-schema) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [schema vs database](https://blog.devart.com/tag/schema-vs-database) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdifference-between-schema-database.html) [Twitter](https://twitter.com/intent/tweet?text=What+is+the+Difference+Between+Schema+and+Database%3F&url=https%3A%2F%2Fblog.devart.com%2Fdifference-between-schema-database.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/difference-between-schema-database.html&title=What+is+the+Difference+Between+Schema+and+Database%3F) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/difference-between-schema-database.html&title=What+is+the+Difference+Between+Schema+and+Database%3F) [Copy URL](https://blog.devart.com/difference-between-schema-database.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025"} {"url": "https://blog.devart.com/difference-between-views-and-tables-in-sql.html", "product_name": "Unknown", 
"content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Difference Between Views and Tables in SQL By [Nataly Smith](https://blog.devart.com/author/nataly-smith) June 11, 2024 [0](https://blog.devart.com/difference-between-views-and-tables-in-sql.html#respond) 1493 Businesses widely use relational databases because their ability to structure data makes it much easier to manage. Tables and views, among other database objects, can often be found within such a database. In this article, we will discover the fundamental differences between these two concepts in SQL databases, their characteristics, use cases, and how they impact data management and security in your database projects. [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/studio-sql.html) — a powerful integrated development environment — will help us along the way. Contents What is a table in SQL Key characteristics of tables What is a view in SQL Key characteristics of views View vs. table Creating tables and views Hands-on examples Try it yourself with dbForge Studio Further learning Conclusion What is a table in SQL For starters, let us focus on the definitions of two concepts: tables and views, what they have in common, and the differences between them. So, what is a table? In SQL Server, a table is a database object that stores data in rows and columns, similar to a spreadsheet. Each column in a table represents a specific attribute of the data, such as a name or date, and has a defined data type. Each row represents a single record containing values for each column. They can also contain constraints like primary keys, foreign keys, and indexes to enforce data integrity and improve performance. 
Key characteristics of tables In essence, tables have the following distinguishable traits: Tables store data physically in rows and columns within the database They represent a single entity and contain data directly They can be directly modified to manage data They contain all the data, including sensitive information, accessible to users with the appropriate permissions Changes to table schemas (e.g., adding/removing columns) require careful management of data integrity Tables establish relationships between different sets of data using referential constraints What is a view in SQL Up next is understanding a view in SQL Server. A view is a virtual table based on a SELECT query from one or more underlying tables. It does not store data itself but provides a way to present and manipulate data from these tables in a specific format. Views can simplify complex queries, enhance security by limiting access to specific data, and provide a consistent interface to the underlying tables. They can be queried and updated (with some restrictions) just like regular tables. Key characteristics of views Here are some of the prominent features of views: Views are virtual and do not take up space in the system They do not store data themselves, but rather present data stored in other tables based on a query They are often used to simplify complex queries and can be updatable, but there are restrictions They can limit access to specific data by exposing only certain columns or rows, enhancing security They can aggregate and join data from multiple tables, simplifying complex queries and presenting a unified interface Finally, views can provide a consistent interface to users even when the underlying table schemas change, as long as the view definition is maintained Learn how to [create views in MySQL](https://blog.devart.com/how-to-create-a-view-in-mysql.html) with best practices and tips in this article. View vs. 
table

The main difference between a table and a view is that a table is an object that consists of rows and columns to store and retrieve data whenever the user needs it. In contrast, a view is a virtual table defined by the result set of an SQL statement: it stores no rows of its own, and its data is produced from the underlying tables each time it is queried. However, there are a few more points to consider when comparing these two concepts:

| Table | View |
| --- | --- |
| Represents a single entity with a fixed structure | Provides a virtual table with a flexible structure that can hide specific data |
| Physically stores data in rows and columns within the database | Does not store data and depends on underlying tables for data |
| Directly modifiable with INSERT, UPDATE, and DELETE operations | Can simplify complex queries; updatable with restrictions |
| Uses primary and foreign keys | Uses joins across multiple tables |
| Stores all necessary data, including sensitive information | Enhances security by limiting data access, simplifies complex queries, and presents a unified view |

Creating tables and views

The next step in our learning journey today is to discover how to create tables and views. To create a table in SQL Server, use the following syntax (note that, unlike MySQL, SQL Server does not support an IF NOT EXISTS clause in CREATE TABLE):

```sql
CREATE TABLE TableName (
    column1, column2, ..., constraints
);
```

For example, this query will create the Employees table, specify its columns and their data types, with EmployeeID as the unique identifier for each row:

```sql
CREATE TABLE Employees (
    EmployeeID INT PRIMARY KEY,
    FirstName VARCHAR(50),
    LastName VARCHAR(50),
    Email VARCHAR(100),
    HireDate DATE
);
```

The below syntax allows us to create a view:

```sql
CREATE VIEW ViewName AS
SELECT columns
FROM tables
[WHERE conditions];
```

As we said, this view is a virtual table formed as a result of a query and used to view or manipulate parts of the table. We can create the columns of the view from one or more tables.
Its content is derived from the base table:

```sql
CREATE VIEW EmployeeContactInfo AS
SELECT EmployeeID, FirstName, LastName, Email
FROM Employees
WHERE HireDate > '2020-01-01';
```

Hands-on examples

We have already mentioned that views can simplify complex queries. Now, we are going to dive deeper into this topic and explore scenarios that demonstrate how views can enhance security and provide a more straightforward interface for users and applications interacting with the database.

Scenario 1: Simplifying data retrieval from multiple tables

For this example, let us assume we have already created the following tables and populated them with test data:

- The Employees table with the EmployeeID, FirstName, LastName, and DepartmentID columns.
- The Departments table with the DepartmentID and DepartmentName columns.
- The Salaries table with the EmployeeID and Salary columns.

This script creates a view to simplify the retrieval of employee details along with their department and salary:

```sql
CREATE VIEW EmployeeDetails AS
SELECT e.EmployeeID, e.FirstName, e.LastName, d.DepartmentName, s.Salary
FROM Employees e
JOIN Departments d ON e.DepartmentID = d.DepartmentID
JOIN Salaries s ON e.EmployeeID = s.EmployeeID;
```

Once the view is created, you can use a simple SELECT statement to retrieve employee details:

```sql
SELECT * FROM EmployeeDetails WHERE DepartmentName = 'Information Technology';
```

Without the view, you would need to write a complex JOIN query every time.

Scenario 2: Providing a filtered view of sensitive information

The second scenario requires two tables for demonstration:

- The Customers table with the CustomerID, FirstName, LastName, Email, and CreditCardNumber columns.
- The Orders table with the OrderID, CustomerID, OrderDate, and TotalAmount columns.

```sql
CREATE VIEW CustomerOrders AS
SELECT c.CustomerID, c.FirstName, c.LastName, o.OrderID, o.OrderDate, o.TotalAmount
FROM Customers c
JOIN Orders o ON c.CustomerID = o.CustomerID;
```

This view hides the sensitive CreditCardNumber column.
In order to access the rest of the order details, you can use the following SELECT statement:

```sql
SELECT * FROM CustomerOrders WHERE OrderDate > '2023-01-01';
```

Scenario 3: Aggregating data for reporting

The next example features three tables:

- The Sales table with the SaleID, ProductID, SaleDate, Quantity, and TotalAmount columns.
- The Products table with the ProductID, ProductName, and CategoryID columns.
- The Categories table with the CategoryID and CategoryName columns.

This view aggregates sales data by product category for reporting purposes:

```sql
CREATE VIEW CategorySalesReport AS
SELECT c.CategoryName, SUM(s.TotalAmount) AS TotalSales
FROM Sales s
JOIN Products p ON s.ProductID = p.ProductID
JOIN Categories c ON p.CategoryID = c.CategoryID
GROUP BY c.CategoryName;
```

It simplifies the complex query involving JOINs and aggregations, making it easier to generate sales reports by category:

```sql
SELECT * FROM CategorySalesReport ORDER BY TotalSales DESC;
```

Try it yourself with dbForge Studio

Even though SQL Server Management Studio (SSMS) is the most popular and familiar tool that allows you to work with SQL Server databases, it is not the only one. Moreover, in the continuously evolving world of database development, administration, and management, new GUIs keep appearing like mushrooms after the rain. How do you choose the tool that is perfect for you among all this variety? Let us compare dbForge Studio for SQL Server with SSMS so that you can make an informed decision on which solution best aligns with your daily requirements:

- User-friendly interface: dbForge Studio boasts an intuitive and user-friendly interface, providing a smooth user experience to both beginners and seasoned developers. SSMS, while powerful, can have a steeper learning curve, particularly for those new to SQL Server tasks.
- Advanced functionality: dbForge Studio offers a wide range of advanced features, including a visual query builder, data and schema comparison tools, and advanced SQL editing capabilities. SSMS provides the essentials but may lack some of the advanced features available in dbForge Studio.
- Integrated tools: dbForge Studio comes with integrated tools for schema and data comparison, enabling seamless data synchronization and database management out of the box. SSMS, while offering basic tools, may require auxiliary add-ins to expand its feature set.
- Data generation: dbForge Studio provides a customizable data generation tool that delivers realistic test data, offering flexibility in data generation for specific tables and columns. SSMS incorporates fundamental data generation features but may require extra scripts or tools for advanced and specific data generation requirements.
- Cross-platform support: dbForge Studio supports Windows, macOS, and Linux, providing flexibility for users on different operating systems. SSMS is primarily designed for Windows, which limits accessibility for macOS users.

Take advantage of dbForge Studio for SQL Server by [downloading a free fully-functional 30-day trial version](https://www.devart.com/dbforge/sql/studio/download.html) and [installing it on your computer](https://docs.devart.com/studio-for-sql-server/getting-started/installing.html). With a huge pack of advanced features and an intuitive GUI, this all-in-one tool can maximize productivity and make SQL Server database development, administration, and management processes efficient. The Studio can also be of use when it comes to today’s topic, from generating test data to performing advanced UPDATE operations. For a more visual comparison of the two solutions, watch the [SSMS vs. dbForge Studio for SQL Server – Features Comparison](https://www.youtube.com/watch?v=UiVxy83826Y) video on the Devart YouTube channel.
Further learning

- [SQL Server Tutorials](https://blog.devart.com/sql-server-tutorial)
- [dbForge Studio Documentation](https://docs.devart.com/studio-for-sql-server/)
- [dbForge Studio Video Tutorials](https://www.youtube.com/playlist?list=PLpO6-HKL9JxXSZgO3L0MxOTt3QxpFbJNt)
- [Devart Academy](https://www.devart.com/academy/)

Conclusion

To sum up, we have discussed SQL Server tables and views, along with their key features and differences. Tables serve as the primary storage structure, holding data in an organized, accessible format. Views, on the other hand, provide a virtual representation of this data, simplifying complex queries and enhancing data security. Effectively using both tables and views can optimize your everyday database operations.

Tags [SQL Server](https://blog.devart.com/tag/sql-server) [sql tables](https://blog.devart.com/tag/sql-tables) [sql views](https://blog.devart.com/tag/sql-views) [tables and views sql](https://blog.devart.com/tag/tables-and-views-sql) [Nataly Smith](https://blog.devart.com/author/nataly-smith) dbForge Team"} {"url": "https://blog.devart.com/different-methods-to-copy-data-with-dbforge-sql-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Different Methods to Copy Data with dbForge SQL Tools By [dbForge Team](https://blog.devart.com/author/dbforge) August 16, 2021

In the article, we are going to discuss different methods to copy and transfer databases between SQL Server instances. In addition, we explore the benefits and drawbacks of each method so that you can choose the one that suits you most. Very often, developers need to make a copy of the database to protect the files and avoid data loss, or when there is a need to free up space on the disk. In their daily operations, they may also duplicate databases for testing purposes, which can be a time-consuming and challenging task. However, we want to show how easy and efficient it is to migrate databases with dbForge Studio for SQL Server.
In the article, we’ll discuss the following methods to transfer databases and show practical examples of how they work in dbForge Studio for SQL Server:

- Backup and Restore
- Detach and Attach
- Copy Database Wizard
- Schema & Data Compare
- Generate Schema Script Wizard
- Generate Scripts Folder
- Generate Schema Snapshot

Let’s take a closer look at each method and its peculiarities.

Backup and Restore

When you need to migrate the entire database to another server or to protect databases from data loss, whether caused by user errors or hardware failures, the Backup and Restore functionality is recommended.

Restriction: The SQL Server version of the target instance must be the same as or higher than that of the source instance.

Benefits:

- Online mode for a source database.
- Connections can be saved.
- Minimum risk of data loss.
- Quick recovery of a backup file.
- Backup and restore files can be located on multiple devices.
- Maintains database integrity.
- Generates a single backup file.

Drawbacks:

- Requires constant testing to maintain an error-free and reliable database copy.
- A differential mode requires a full database backup made in advance. Prior to restoring a differential database backup, you should first restore its full backup.
- Requires a shared folder if you plan to share it with coworkers.
- Cannot be used on Microsoft Azure SQL Database.
- Requires more space.
- Does not allow you to back up separate objects and particular tables with data.

Note: Saving a database diagram file (.dbd) and then opening it on another server is not a viable method for transferring a database. The thing is that the .dbd file contains only references to objects in the server’s database, not the DDL of the objects, making it impossible to transfer a database using this method.

Copying a database using the Backup and Restore functionality

Step 1: Back up a database

1. In Database Explorer, right-click the database you want to back up and select Tasks > Back up.
2.
In the Backup wizard, set the backup options and then click Back up.
3. Once the backup process is complete, click Finish.

Step 2: Restore a database

1. In Database Explorer, right-click the database you want to restore and select Tasks > Restore.
2. In the Restore wizard, set up options to restore the database and click Restore.

To learn more about how to back up and restore a database, see the [How to restore a SQL Server database to a new server](https://docs.devart.com/studio-for-sql-server/database-tasks/restore-a-mssql-database.html) and [How to backup a SQL Server database to move it to another server](https://docs.devart.com/studio-for-sql-server/database-tasks/create-a-database-backup.html) topics.

Detach and Attach

If you want to migrate a database between different servers or SQL Server instances, you can use the Detach and Attach functionality. In this case, the database is attached to a new server or instance in the same state it was in when detached from the previous server.

Restrictions:

- The SQL Server version of the target instance must be the same as or higher than that of the source instance.
- Active connections to the database should be closed.
- Only a detached or copied database can be attached.
- The attached database cannot be selected for attaching.
- Database snapshots should be removed from the database.
- The following databases cannot be detached: mirrored, system, suspect, replicated, and published ones.
- It is not supported in the SQL Server Express version.

Benefits:

- The database to be migrated should be in offline mode.
- The fastest way to transfer databases.

Drawbacks:

- The database is offline while the migration is in progress.
- Runs slower for large databases.
- With full-text indexes, full-text catalog files should be migrated manually.
- All connections will be lost.
- Requires a shared folder.
- Cannot be used on Microsoft Azure SQL Database.
- Should not be used if downtime is a critical point.
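Outside the Studio’s wizards, both methods described above also map to plain T-SQL. Here is a hedged sketch, assuming a database named SalesDb with default logical file names and illustrative file paths (none of these names come from the article):

```sql
-- Backup and Restore (run the RESTORE on the target instance)
BACKUP DATABASE SalesDb TO DISK = N'C:\Backups\SalesDb.bak';

RESTORE DATABASE SalesDb FROM DISK = N'C:\Backups\SalesDb.bak'
WITH MOVE 'SalesDb' TO N'D:\Data\SalesDb.mdf',
     MOVE 'SalesDb_log' TO N'D:\Data\SalesDb_log.ldf';

-- Detach and Attach (copy the .mdf/.ldf files to the target server in between)
EXEC sp_detach_db @dbname = N'SalesDb';

CREATE DATABASE SalesDb
ON (FILENAME = N'D:\Data\SalesDb.mdf'),
   (FILENAME = N'D:\Data\SalesDb_log.ldf')
FOR ATTACH;
```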
To learn how to attach and detach a database in SQL Server, you can watch [this video](https://youtu.be/8jJgM6al5Ng).

Copying a database using the Attach and Detach functionality

Step 1: Detach a database

1. In Database Explorer, right-click the database you want to detach and select Tasks > Detach Database.
2. In the Detach Database wizard, drop connections, update statistics, click OK, and then click Close.

Step 2: Attach the database

1. In Database Explorer, right-click the instance to which you want to attach a database and select Tasks > Attach Database.
2. In the Browse files dialog, select the file with a .mdf extension. In the Attach Database dialog, click OK, and then click Close.

For more information about how to attach and detach a database, see the [How to detach a database in SQL Server](https://docs.devart.com/studio-for-sql-server/database-tasks/detach-a-database.html) and [How to attach a database using dbForge Studio for SQL Server](https://docs.devart.com/studio-for-sql-server/database-tasks/attach-a-database.html) topics.

Copy Database Wizard

With the Copy Database wizard, you can migrate, copy, or override SQL Server databases with all their related objects between different servers. In addition, it can be used when upgrading a server.

Restrictions:

- It is not supported in the SQL Server Express version.
- The following databases cannot be copied: system, marked for replication, inaccessible, loading, offline, recovering, suspect.
- The SQL Server version of the target instance should be the same or higher.
- You must have the sysadmin fixed server role on the source and target servers.
- SQL Server Agent should be running on the target SQL Server.

Benefits:

- Select source and target servers and databases.
- Copy logins to a target server.
- Update the target database name and the directory for the data and log files.
- Select a database to move, copy, or override.
- Handle errors and logging options.
- Move all database objects.
- Create a backup copy of the database.
Drawbacks:

- Requires a shared folder for the data and log files if you change the target file directory.
- Selecting the Move option automatically removes the source database.
- Cannot be used on Microsoft Azure SQL Database.

Copying a database using the Copy Database Wizard

1. In Database Explorer, right-click the database and select Tools > Copy Database.
2. In the Copy Database wizard, select the source and target servers and databases and configure the migration process. Then, click Execute.

For more information about moving databases with the help of the Copy Database wizard, see the [How to copy a database to another SQL Server database engine](https://docs.devart.com/studio-for-sql-server/database-tasks/copy-a-database.html) topic.

Schema and Data Compare functionality

When you need to move some specific database objects or update a target database with partial data from the source database, it is better to use the Schema and Data Compare/Sync functionality. First, you copy the database structure with the help of the Schema Compare tool, and then you copy data into the previously compared database structure by using the Data Compare tool.

Restrictions:

- Schemas of objects from the source and target databases should be identical if you copy data without a database structure.
- Only tables and views with a primary key, a unique key, a unique index, or a unique constraint can be compared when migrating data with Data Compare.

Benefits:

- Move specific database objects from the source to the target database.
- Automate and schedule data comparison and synchronization by using the command-line interface and a task scheduler.
- Analyze data differences visually.
- Maintain data consistency between two databases.
- It is possible to transfer data both from an old version of SQL Server to a new one and vice versa. In addition, the tools will generate objects with the correct syntax of the target version.
Drawbacks:

- Data synchronization may modify or delete data in the target database, so it is recommended to make a backup copy of the target database to prevent data loss.
- When migrating data from one database to another, the amount of free local disk space should cover the total size of both the source and target databases plus the size of the generated script.

Also, you can watch [this video](https://youtu.be/Ns6i3FN1FXw) to learn how to migrate a database with the Schema and Data Compare functionality provided in dbForge Studio for SQL Server.

Copying a database structure using Schema Compare

1. On the Comparison menu, select New Schema Comparison.
2. In the New Schema Comparison wizard, select the source and target databases to be compared, set schema comparison options, schema and table mapping, and click Compare. For more information about the workflow of the schema comparison process, see [How to synchronize two schemas quickly](https://docs.devart.com/studio-for-sql-server/comparing-synchronizing-schemas/synchronizing-two-schemas.html).
3. In the Schema Comparison document, select the schema objects and click Synchronize objects to the target database.
4. In the Schema Synchronization wizard, you can set output options to manage a synchronization script, set synchronization options, and click Synchronize. For more information, see [How to synchronize two schemas quickly](https://docs.devart.com/studio-for-sql-server/comparing-synchronizing-schemas/synchronizing-two-schemas.html).

Additionally, you can copy a database structure using the command line. For this, refer to the [Automate schema comparison and synchronization from the command line](https://docs.devart.com/schema-compare-for-sql-server/using-the-command-line/comparing-and-sync-schemas-through-command-line.html) topic.

Copying data using Data Compare

After that, you can start copying data to this database schema.

1. On the Comparison menu, select New Data Comparison.
2.
In the New Data Comparison wizard, select the source and target databases to be compared, set up comparison options, and click Compare. For more information about the workflow of the data comparison process, see [How to set up data comparison](https://docs.devart.com/studio-for-sql-server/comparing-synchronizing-data/setting-data-comparison.html).
3. In the Data Comparison document, select the objects to be included in the synchronization and click Synchronize data to the target database.
4. In the Data Synchronization wizard, you can save a synchronization script on your computer for later use or execute it against the target database after the synchronization is complete. Here you can also configure additional synchronization options. For more information about the data synchronization workflow, see [How to synchronize databases](https://docs.devart.com/studio-for-sql-server/comparing-synchronizing-data/synchronizing-data.html).

Additionally, you can copy data using the command line. For this, refer to the [Automate data comparison and synchronization from the command line](https://docs.devart.com/data-compare-for-sql-server/using-the-command-line/comparing-and-synchronizing-data.html) topic.

Generate Schema Script Wizard

When you need to migrate a database to a lower version of a SQL Server instance, you can use the Generate Schema Script Wizard, which allows you to create the database schema and copy all its data.

Benefits:

- Database migration can be performed to a lower version of SQL Server, and database recovery can be done on the higher version.
- Choose whether to generate a script for a database schema and/or data.
- Save a SQL file as an encrypted zip archive.

Drawbacks:

- Takes considerable time and storage to migrate.
- May lead to memory overload when migrating large databases.
- Data cannot be encrypted.

Copying a database using the Generate Script Wizard

1. In Database Explorer, right-click the required database and select Tasks > Generate Scripts.
2. In the Generate Script Wizard, set up script generation options and select objects for the script. You can also configure how the script creation should be performed and handle error processing and logging options. Once done, click Generate.

For more information about how to back up a database with the Generate Scripts Wizard, see the [How to generate SQL Server database scripts](https://docs.devart.com/studio-for-sql-server/database-tasks/generate-database-scripts-in-sql-server.html) and [How to generate DDL statements for database objects](https://docs.devart.com/studio-for-sql-server/working-with-database-objects/generating-ddl-statements.html) topics. Then, on the SQL Server instance to which you want to migrate the database, execute the generated script.

Generate a Scripts Folder

SQL Schema Compare allows users to generate a scripts folder representing a database schema and data that can be used to update another database or script files.

Restriction: Available only in the Enterprise edition of dbForge Studio for SQL Server.

Benefits:

- Generate the schema and data with one SQL file per object.
- Create a scripts folder from the command line.
- Specify a destination path for the scripts folder.
- A script file is generated per object, and all objects are grouped into subfolders.
- Modify the default scripts folder structure and file name templates.
- Generate a scripts folder from a database, a snapshot, or another scripts folder.

Drawbacks:

- Enabling the Decrypt encrypted objects option may lead to slow performance.
- The decryption option is not available when creating a scripts folder from a database snapshot or another scripts folder.

Creating a scripts folder

1. On the Database menu, select Tasks > Create Scripts Folder or Snapshot.
2. In the dialog that opens, select a source type. Then, depending on the type, specify its details.
3. If you want to customize folder and file name templates, click Scripts Folder Structure and then click OK.
4. In the Destination section, specify a path to store the scripts folder and then click Create.

Additionally, you can create a scripts folder using the command line. For this, refer to the [How to create a scripts folder or a snapshot](https://docs.devart.com/studio-for-sql-server/database-tasks/create-a-scripts-folder.html) topic.

Generate a Database Snapshot

Creating a snapshot may prevent losses in the case of database object errors made while editing. In this case, you can restore a database to the state it was in when the snapshot was created.

Restriction: Available only in the Enterprise and Professional editions of dbForge Studio for SQL Server.

Benefits:

- Create a snapshot from the command line.
- Decrypt encrypted objects.
- Specify a destination path for the snapshot.
- Generate a snapshot from a database, a scripts folder, or another snapshot.
- Reverting a database to the snapshot is faster than a backup recovery.
- A snapshot can be archived.

Drawbacks:

- A snapshot is a transactionally consistent, read-only, and static copy of the database located on the same SQL Server instance.
- It represents information only about the database structure and cannot contain table data.
- Enabling the Decrypt encrypted objects option may cause poor performance.
- When creating a snapshot from a scripts folder, the decryption option is not available.
- Database snapshots cannot be modified.
- When reverting the database to a snapshot, the operation cannot be rolled back.

Creating a database snapshot

1. On the Database menu, select Tasks > Create Scripts Folder or Snapshot.
2. In the dialog that opens, select a source type. Then, depending on the type you’ve chosen, specify its details.
3. In the Destination section, specify a path to the snapshot and click Create.

Additionally, you can create a database snapshot using the command line. For this, refer to the [How to create a scripts folder or a snapshot](https://docs.devart.com/studio-for-sql-server/database-tasks/create-a-scripts-folder.html) topic.
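Note that the snapshot discussed here is dbForge’s schema snapshot (a file describing the database structure). SQL Server itself also offers native database snapshots with similar read-only, no-rollback-on-revert semantics; a minimal T-SQL sketch with illustrative names and paths (these are assumptions, not from the article):

```sql
-- Create a read-only, point-in-time snapshot of SalesDb
-- (the logical file name 'SalesDb' and the path are hypothetical)
CREATE DATABASE SalesDb_Snapshot
ON (NAME = SalesDb, FILENAME = N'D:\Snapshots\SalesDb.ss')
AS SNAPSHOT OF SalesDb;

-- Revert the source database to the snapshot state;
-- as noted above, such a revert cannot be rolled back
RESTORE DATABASE SalesDb FROM DATABASE_SNAPSHOT = 'SalesDb_Snapshot';
```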
Conclusion

In the article, we have reviewed approaches to SQL Server database migration that can be implemented with the help of dbForge Studio for SQL Server. In addition, the article highlights the restrictions, advantages, and disadvantages of each method and goes through a few examples.

Tags [copy database](https://blog.devart.com/tag/copy-database) [database backup](https://blog.devart.com/tag/database-backup) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url":
"https://blog.devart.com/disabling-direct-mode-in-litedac-and-unidac.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) Disabling Direct Mode in LiteDAC and UniDAC By [DAC Team](https://blog.devart.com/author/dac) August 7, 2015 [4](https://blog.devart.com/disabling-direct-mode-in-litedac-and-unidac.html#comments) 4200 [LiteDAC](https://www.devart.com/litedac/) and [UniDAC](https://www.devart.com/unidac/) interact with SQLite in 2 approaches. The first approach is that an application uses functions of the linked SQLite library. For Windows OS it is sqlite3.dll, for Mac OS and iOS – libsqlite3.dylib, for Android – libsqlite.so. The second approach allows to connect to the database from the application directly, using so called Direct Mode. Direct Mode provides interaction with SQLite avoiding any external libraries linking. It is implemented by embedding the code of the SQLite library directly to the application. This approach allows to work with SQLite in cases, when using third-party libraries is impossible due to a number of reasons. For example, when developing applications for iOS or Mac OS, the Apple corporation prohibits deployment of any libraries along with the application. However, there are situations, when it is preferably to use the first mode. Thus, when developing applications for Android, there is no need to worry about SQLite client library presence, since it is included into this OS distribution. On the other hand, during project implementation, strict requirements may be imposed on the compiled application size. Such requirements are especially relevant for mobile development. Therefore, it is highly desirable to have an opportunity to disable modules, that won’t be used in the application, before compilation. 
LiteDAC and UniDAC allow you to exclude the code of the SQLite client library that implements Direct Mode from the final application. As a result, the size of the compiled application may be decreased. To exclude support for Direct Mode from the project, do the following:

1. Add the path to the [DAC Installation Folder]\\Source folder to the Search Path property. When using the UniDAC edition with source code, add the following path: [UniDAC folder with source code]\\Source\\UniProviders\\SQLite.
2. Compile the application with the NOSTATIC option.

Note that the Search Path option and compiler options are set separately for each supported platform!

A simple sample application using LiteDAC shows the reduction in the resulting application size:

| Platform | Normal compilation | Compilation with NOSTATIC | Size difference |
| --- | --- | --- | --- |
| Win32 | 4 268 Kb | 3 752 Kb | 516 Kb |
| Win64 | 7 292 Kb | 6 122 Kb | 1 170 Kb |
| iOS32 | 20 540 Kb | 19 883 Kb | 657 Kb |
| iOS64 | 22 532 Kb | 21 428 Kb | 1 104 Kb |
| Android | 20 181 Kb | 19 490 Kb | 691 Kb |

Thus, the NOSTATIC conditional compilation directive allows you to exclude the Direct Mode module from compilation when it is not used. Note that if you set the Direct property to True while using the NOSTATIC directive, a ‘Direct Mode disabled’ error will occur.
Tags [direct mode](https://blog.devart.com/tag/direct-mode) [litedac](https://blog.devart.com/tag/litedac) [rad studio](https://blog.devart.com/tag/rad-studio) [unidac](https://blog.devart.com/tag/unidac) [DAC Team](https://blog.devart.com/author/dac)

4 COMMENTS

jorge gaspar, August 13, 2015 at 2:55 pm: If I deactivate Direct Mode in SQLite, can I still use the Encryption Key option?

DAC Team, August 13, 2015 at 4:48 pm: Hello, Jorge!
If the SQLite library you are using supports encryption, then yes – you will be able to use the Encryption Key option after disabling Direct Mode. However, the Encryption Algorithm option will be unavailable in this case. Ian Tetriss, October 24, 2015 at 6:51 pm: Hello, I have a feeling that Lazarus doesn’t support Direct Mode; I cannot find any static library of SQLite. Is that true? Thanks. DAC Team, October 26, 2015 at 11:32 am: Hello, Ian! You are right. Currently, LiteDAC and UniDAC have no support for SQLite Direct Mode in Lazarus. You can post a suggestion at our [Uservoice forum](https://devart.uservoice.com/forums/104635-delphi-data-access-components). If your suggestion gets enough votes, we will consider implementing it in one of the next versions. Comments are closed."} {"url": "https://blog.devart.com/dive-deeper-into-data-analysis-with-a-new-version-of-query-builder-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Dive Deeper into Data Analysis with a New Version of Query Builder for SQL Server! By [dbForge Team](https://blog.devart.com/author/dbforge) August 26, 2021 [0](https://blog.devart.com/dive-deeper-into-data-analysis-with-a-new-version-of-query-builder-for-sql-server.html#respond) 2412 We are glad to announce that a new version of [dbForge Query Builder for SQL Server](https://www.devart.com/dbforge/sql/querybuilder/download.html) is now available for download. In this release, we extended the tool’s functionality with Data Reporting and Analysis features, namely Data Reports, Master-Detail Browser, and Pivot Tables.
Data Reports: dbForge Query Builder for SQL Server allows you to generate and visualize various reports based on database objects or SQL queries, and to modify reports with a rich set of options in the Report Builder. Master-Detail Browser: This feature allows you to view data in a master-detail structure, analyze data in tables or views, visualize or modify relations between tables, and update table data in the Design and Data views. Pivot Tables: Users can easily generate summarized pivot tables, visualize data as charts, and manipulate data in the Pivot Table Designer. Availability: Download the latest version of dbForge Query Builder for SQL Server, which is part of dbForge SQL Tools, and evaluate the new functionality during a free 30-day trial period. You can [share your feedback](https://www.devart.com/dbforge/mysql/querybuilder/feedback.html) to help us improve our products in the future. Tags [data reports](https://blog.devart.com/tag/data-reports) [master-detail browser](https://blog.devart.com/tag/master-detail-browser) [pivot table](https://blog.devart.com/tag/pivot-table) [query builder](https://blog.devart.com/tag/query-builder) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dont-miss-the-latest-maintenance-update-of-sql-complete-6-9.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Don’t Miss the Latest Maintenance Update of SQL Complete 6.9 By [dbForge Team](https://blog.devart.com/author/dbforge) November 15, 2021 [0](https://blog.devart.com/dont-miss-the-latest-maintenance-update-of-sql-complete-6-9.html#respond) 2635 Here comes the maintenance update of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), your favorite SSMS and Visual Studio add-in for context-sensitive code autocompletion, formatting, and refactoring.
Our team aspires to make SQL Complete the best SSMS add-in on the market – not only in terms of expansive functionality, but also as a bug-free product that causes no trouble in daily use. Here is the list of fixed issues:

| Issue description | Ticket # |
| --- | --- |
| Issue with opening a SQL document from a Microsoft Visual Studio project | D67752, D67753 |
| Issue that occurred when working with SQL documents related to Visual Studio projects | – |
| Issue with loading the SQL Complete assembly after launching SSMS | D67302 |
| Issue that occurred when working with variables in the SQLCMD mode | D69913 |
| Issue that occurred when working with the CREATE SCHEMA statement in the SQL editor | D70235 |
| Application launch issue | D67651 |
| Document session restoration error | D68594, D68627, D68638 |

The update is already available and can be installed in SSMS from the SQL Complete menu > Help > Check for Updates. And if you are not using SQL Complete yet, we gladly invite you to check it out, all shiny and polished, during a free 14-day trial. Just [download](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) it from our website and see all of its capabilities in action!
Tags [dbforge](https://blog.devart.com/tag/dbforge) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [ssms](https://blog.devart.com/tag/ssms) [what's new sql complete](https://blog.devart.com/tag/whats-new-sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/dont-miss-the-newly-updated-dbforge-studio-for-oracle-v4-6.html", "product_name": "Unknown",
"content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Don’t Miss the Newly Updated dbForge Studio for Oracle v4.6! By [dbForge Team](https://blog.devart.com/author/dbforge) February 19, 2024 [0](https://blog.devart.com/dont-miss-the-newly-updated-dbforge-studio-for-oracle-v4-6.html#respond) 1638 It’s been a while since the last update of [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), your go-to IDE when it comes to developing and managing Oracle databases. But today is the day we finally roll out a few useful goodies we’ve prepared, including newly supported statements and a bunch of new formatting profiles to make your daily work with Oracle a breeze. Support for CREATE, ALTER, and DROP VIEW statements Let’s start our tour with code completion, namely the newly introduced support for the CREATE VIEW, ALTER VIEW, and DROP VIEW statements. We believe they’ll definitely come in handy. New formatting profiles We’ve also enhanced the integrated PL/SQL Formatter with a collection of new predefined profiles (in addition to the already available Default and Quest Software Toad) that you can further customize to your liking. Take a look at what you’ve got now:
1. Collapsed
2. Commas before
3. Compact
4. Extended
5. Indented
6. MSDN SQL
7. Quest Software Toad
8. Right aligned
9. Stack compact

You can view and configure formatting profiles in Tools > Options > Text Editor > Formatting > Profiles. Include security permissions Finally, we’ve augmented script generation with a new option called Include security permissions, which can be found among the General settings in Tools > Options > Generate Schema Script > General > Generate Script AS. Note that it is turned off by default. Download dbForge Studio for Oracle v4.6 today!
That’s it for today, but there will surely be more pleasant surprises and new features up ahead! And now, we’d love to invite you to either [download dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html) for a free 30-day trial or, if you already have it, get this update right now from the Help menu > Check for Updates. And, of course, feel free to share your impressions with us; your feedback helps us keep our products convenient and relevant for you. Get the updated dbForge Studio for Oracle in a multidatabase suite! We’d also like to remind you that if you happen to work with other database systems, such as SQL Server, MySQL, MariaDB, and PostgreSQL, you might as well try [dbForge Edge](https://www.devart.com/dbforge/edge/), a suite that comprises four Studios and offers wide coverage of databases and cloud services. The updated dbForge Studio for Oracle is part of it. Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle](https://blog.devart.com/tag/oracle) [release](https://blog.devart.com/tag/release) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/dotconnect-for-oracle-documentation-improved.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) dotConnect for Oracle Documentation Improved By [dotConnect Team](https://blog.devart.com/author/dotconnect) April 9, 2010 [0](https://blog.devart.com/dotconnect-for-oracle-documentation-improved.html#respond) 2703 We read all your feedback reports to learn how we can make our products better. The opinion of our users is important to us. Unfortunately, a substantial part of user feedback rates our documentation as poor. Over the last couple of months, we have been working on improving the dotConnect for Oracle documentation. We have made improvements in two directions: a better table of contents, with improved current articles and new ones;
and an improved class reference (an increased number of samples and extended descriptions). You can download the latest version of our documentation [here](http://www.devart.com/dotconnect/oracle/dcoracle.chm). We would like to know your opinion on the documentation changes and which problems need more detailed coverage. [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog."} {"url": "https://blog.devart.com/dotconnect-universal-clinches-bronze-in-visual-studio-magazine-readers-choice-awards.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [What’s New](https://blog.devart.com/category/whats-new) Devart’s dotConnect Universal Clinches Bronze in Visual Studio Magazine’s Reader’s Choice Awards By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) October 27, 2023 [0](https://blog.devart.com/dotconnect-universal-clinches-bronze-in-visual-studio-magazine-readers-choice-awards.html#respond) 1807 We are thrilled to announce that Devart’s dotConnect Universal has been honored with the bronze award in the General Development Tools category at the 2023 Visual Studio Magazine Reader’s Choice Awards. This recognition is a testament to our commitment to delivering exceptional solutions for developers and our continuous efforts to enhance the development experience. Visual Studio Magazine’s annual awards highlight the most innovative and valuable tools in the industry, as chosen by the magazine’s readers. This year, dotConnect Universal stood out among the competition, earning the admiration of developers and industry professionals alike. The dotConnect Universal product is designed to provide a universal solution for data connectivity in various development environments. Its versatility, efficiency, and user-friendly features have contributed to its success and recognition by the developer community. We are honored to receive the bronze award in the General Development Tools category.
This recognition is a result of our team’s dedication to creating tools that empower developers and streamline their workflows. The Visual Studio Magazine Reader’s Choice Awards are highly regarded in the industry, and winning in the General Development Tools category is a significant achievement for Devart. We would like to express our gratitude to the readers and the entire community for their support and trust in dotConnect Universal. As we celebrate this milestone, we remain committed to innovation and excellence in providing tools that meet the evolving needs of developers. This award serves as motivation to continue pushing boundaries and delivering solutions that make a positive impact on the development community. Tags [data connectivity](https://blog.devart.com/tag/data-connectivity) [dotconnect](https://blog.devart.com/tag/dotconnect) [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations. 
"} {"url":
"https://blog.devart.com/download-a-new-build-350305-of-dbforge-studio-for-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new) Download a new build 3.50.305 of dbForge Studio for MySQL! By [dbForge Team](https://blog.devart.com/author/dbforge) June 23, 2009 [0](https://blog.devart.com/download-a-new-build-350305-of-dbforge-studio-for-mysql.html#respond) 2788 Dear users, the Devart Development Team has today released a new build 3.50.305 of [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), which includes the following bug fixes:
- The problem with obtaining the SQL code for procedures was fixed (42100)
- The bug with the application hanging while parsing some view texts was fixed (41333)
- The “Stop” button that stops query execution is working correctly now (41292)
- The bug with removing a procedure during its renaming was fixed (41284)
- IndexOutOfRangeException when working with identifier aliases in SELECT statements was fixed (41712)
- IndexOutOfRangeException during table editing was fixed (41716)
- The problem with incorrect detection of relations between tables in Database Diagram was fixed (41799)
- NullReferenceException on closing an SQL document was fixed (41876)
- NullReferenceException on project opening was fixed (41458)
- The problem with the unsaved “Use Unicode” option in connection was fixed
- NullReferenceException and InvalidOperationException on closing the application were fixed (41864)
- The problem with truncating data of a UTF8 database after editing in the LOB editor was fixed (T5219)
- ObjectDisposedException on displaying an automatically hidden tool window was fixed (41299)

These fixes have significantly contributed to improving the product’s performance.
Download [dbForge Studio for MySQL 3.50.305](https://www.devart.com/dbforge/mysql/studio/download.html) now and enjoy smoother, more satisfying work with the product. Tags [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [what's new mysql studio](https://blog.devart.com/tag/whats-new-in-mysql-studio) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/download-install-postgresql-on-windows.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How
To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) How to Download and Install PostgreSQL Database on Windows By [dbForge Team](https://blog.devart.com/author/dbforge) September 15, 2021 [0](https://blog.devart.com/download-install-postgresql-on-windows.html#respond) 12100 In this article, we will focus on starting a PostgreSQL server on your PC. Here you will find a simple step-by-step PostgreSQL tutorial for Windows, seasoned with comprehensive illustrations for your convenience. Download and Install PostgreSQL – Step-by-Step Tutorial PostgreSQL is an open-source object-relational database management system (ORDBMS) based on Postgres 4.2. It supports most of the SQL standard and offers a variety of modern features. The software was developed at the Department of Computer Science, University of California, Berkeley. To start a PostgreSQL server on your personal machine, you will need to perform the initial setup. The process of setting up PostgreSQL boils down to three steps: downloading the PostgreSQL installer for Windows, the installation itself, and verifying the installation. Download the PostgreSQL Installer for Windows To begin with, you need to [download the installer for Windows](https://www.enterprisedb.com/downloads/postgres-postgresql-downloads). The PostgreSQL installer is available for download for free, and, as you might have noticed, it is available for Linux, macOS, and Windows. Different versions of the installer are also available; make sure to choose the desired one from the list and click the Download button. Allow the installer a couple of minutes to download, then open the file. Install PostgreSQL on Windows 1. You will be presented with the following wizard once you launch the PostgreSQL installer. Simply click Next to continue. 2.
You need to choose the directory where PostgreSQL will be installed. 3. You will see a list of the components that can be installed along with the main product. Depending on your preferences, select the components that you wish to install along with PostgreSQL. To get a closer look at what you are about to install, click the corresponding component and you will see a brief description of it. Once the desired components are selected, hit Next. 4. Next, choose a folder to store your data in. 5. At this step, you need to enter a password for the database superuser (postgres by default). 6. Now, enter the number of the port the server should listen on. 7. Here, choose the locale that will be used by the new database cluster. The list of all possible options is available in the drop-down menu. 8. This is the final checkpoint before the installation begins. Read the pre-installation summary carefully and make sure everything is configured correctly. Once you are ready to install PostgreSQL, hit Next. You will see that the wizard is ready to proceed with the installation. Finally, click Next again and allow the wizard to complete the process. 9. That’s it! PostgreSQL is now installed on your computer. Now, if you want to download and install additional tools, you can use Stack Builder. If you do not wish to do that, remove the tick from the corresponding box and click Finish. Note: the PostgreSQL server is configured to listen on local ports by default, meaning that it can only accept connections from clients running on the same machine. If you plan to access the server remotely, you will need to [enable remote connections to the postgres server](https://blog.devart.com/configure-postgresql-to-allow-remote-connection.html). Verify the Installation To verify the installation, you can connect to the PostgreSQL database using SQL Shell (psql), pgAdmin, or dbForge Studio.
Let us take a closer look at each of them. SQL Shell (psql) psql is a terminal-based front-end to PostgreSQL. It allows you to interact with PostgreSQL by typing in queries, executing them, and viewing the query results. To connect to a database using psql: 1. Launch SQL Shell (psql) on your computer. To do this, either open the PostgreSQL folder in the Start menu or type SQL Shell in the Windows search bar. 2. Enter the Server, Database, Port, Username, and Password, pressing Enter after each. If you press Enter without typing a value, the default value is filled in automatically; however, you will have to enter the password either way. 3. The following statement returns the current version of PostgreSQL: SELECT version(); Make sure to put a semicolon at the end of the query, otherwise you will get an error. 4. On pressing Enter, you will get the current PostgreSQL version on your system. pgAdmin pgAdmin is a GUI for PostgreSQL created to simplify PostgreSQL server administration. To connect to a database using pgAdmin, take these steps: 1. Find pgAdmin in the Windows Start menu and open it. 2. In the Browser, select Servers. Then click Objects, point to Create, and click Server. 3. In the dialog box that pops up, enter the server name and go to the Connection tab. 4. Before clicking the Save button, enter the hostname and the password for the postgres user. 5. After that, click Servers and expand the tree. By default, the PostgreSQL database is called postgres. 6. Go to the Tools menu and click Query Tool. 7. Enter the SELECT version(); query in the Query Editor and hit the Execute button. The result of the query will be displayed in the Data Output tab. dbForge Studio While pgAdmin is a standard tool that comes with the PostgreSQL installation, [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) is a competitive alternative to it.
dbForge Studio is a graphical user interface solution created to help you develop and manage PostgreSQL databases. The tool allows you to edit, import, and export data, and to create data reports, pivot tables, and master-detail relations. All this is wrapped in an intuitive, user-friendly interface that makes working with PostgreSQL databases a lot easier. You are always welcome to test dbForge Studio; with your convenience in mind, we have made [a 30-day trial version](https://www.devart.com/dbforge/postgresql/studio/download.html) available for free. To verify the PostgreSQL installation, launch dbForge Studio on your PC. 1. If you open the application for the first time, the Database Connection Properties dialog box opens automatically. Alternatively, go to the Database menu and click New Connection. 2. On the General tab, fill in the corresponding boxes with the database connection properties: Host – the name or IP address of the PostgreSQL server host; Port – the TCP/IP port to connect to the server; User – the name of the user account; Password – the password of the user account. 3. Once done, click Test Connection. If everything is set up correctly, you will see a confirmation message. 4. Having verified the connection, feel free to click Connect to save it. Conclusion In this article, we talked about installing PostgreSQL on your personal computer, starting the PostgreSQL server on Windows, and setting it up. We have also described the ways to add a postgres database and to verify the installation in different apps, including [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/). Should you require more information on Devart products, check out the other [tools for PostgreSQL](https://www.devart.com/dbforge/postgresql/).
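All three clients above run the same SELECT version(); sanity check, which returns a single human-readable string. If you script the verification step, you can pull the numeric server version out of that string. A minimal sketch in Python; the parse_pg_version helper and the sample string are illustrative and not part of any tool mentioned in this article:

```python
import re

def parse_pg_version(version_string: str) -> tuple[int, ...]:
    """Extract the numeric server version from the output of SELECT version().

    The string normally starts with 'PostgreSQL <major>.<minor> ...'.
    """
    match = re.match(r"PostgreSQL\s+(\d+(?:\.\d+)*)", version_string)
    if match is None:
        raise ValueError(f"unrecognized version string: {version_string!r}")
    return tuple(int(part) for part in match.group(1).split("."))

# A version string in the shape psql displays it (the exact build details vary):
sample = "PostgreSQL 13.4, compiled by Visual C++ build 1914, 64-bit"
print(parse_pg_version(sample))  # prints (13, 4)
```

The same parse works for older three-part version numbers such as "PostgreSQL 9.6.2 on x86_64-pc-linux-gnu", which yields (9, 6, 2).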
"} {"url": "https://blog.devart.com/dynamic-database-creation-in-entity-framework.html",
"product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Dynamic Database Creation in Entity Framework By [dotConnect Team](https://blog.devart.com/author/dotconnect) March 31, 2010 [2](https://blog.devart.com/dynamic-database-creation-in-entity-framework.html#comments) 6894 Entity Framework 4 RC allows you to create and drop databases in run-time using SSDL for DDL generation. Now ObjectContext has CreateDatabase(), DropDatabase(), and CreateDatabaseScript() methods. They appeared in Entity Framework v4 CTP for [Code Only](http://blogs.msdn.com/efdesign/archive/tags/CodeOnly/default.aspx) and only for SQLClient initially but later they became available for other EF-providers. In this article we describe implementation of these methods in Devart data providers. We are using [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) as a data provider in the following examples. [Northwind](http://www.devart.com/dotconnect/efquerysamples.zip) is used as a sample database. DDL generation We have supported functionality for DDL script generation not only for Entity Framework v4 but also for Entity Framework v1 in the following ADO.NET data providers: [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) . This functionality is useful when you have a ready EF-model, but don’t have a DDL script. You can use it for simple database deployment also. During database creation tables, primary keys, and foreign keys are specified. If database server supports Id autogeneration, than identity specification will be created for primary keys. 
The views, functions, and tables for the entities from SSDL having DefiningQuery are not created. Please note that fully qualified names will be generated for a table if the table has a Schema attribute. So if you don't want to be tied to a single schema, it is better to delete these attributes ([Entity Developer](http://www.devart.com/entitydeveloper/) allows you to edit the storage model of edml models conveniently). If your model was created by the EDM Wizard and the Visual Studio Entity Designer, you can edit it with the help of any XML editor.

**Entity Framework v4**

Use the following code to create a database:

```csharp
NorthwindEntityModel ctx = new NorthwindEntityModel();
ctx.CreateDatabase(); // The database structure was created on the server
```

To generate a database creation script, use the following code:

```csharp
NorthwindEntityModel ctx = new NorthwindEntityModel();
string ddlScript = ctx.CreateDatabaseScript();
// The script of database creation was generated
```

You can delete a database with the help of the following code:

```csharp
NorthwindEntityModel ctx = new NorthwindEntityModel();
ctx.DeleteDatabase();
```

Note: We don't support the new public bool DatabaseExists() method of ObjectContext because it is ambiguous.

**DeleteDatabaseScript**

ObjectContext has the CreateDatabaseScript() method, but it does not have a corresponding DropDatabaseScript() method, which would be required to obtain the database dropping script.
You can do it manually:

1. Add a reference to Devart.Data.Oracle.Entity.dll for .NET run-time v4.0 to the project.
2. Extend the ObjectContext partial class with the following method:

```csharp
using System.Data.EntityClient;
using System.Data.Metadata.Edm;
using Devart.Data.Oracle.Entity;

partial class NorthwindEntityModel {

  public string DeleteDatabaseScript()
  {
    EntityConnection entityConnection = this.Connection as EntityConnection;

    StoreItemCollection store = entityConnection.GetMetadataWorkspace().
        GetItemCollection(DataSpace.SSpace)
        as StoreItemCollection;

    OracleEntityProviderServices oracleEntityProviderServices
        = new OracleEntityProviderServices();

    return oracleEntityProviderServices.DeleteDatabaseScript(null, store);
  }
}
```

Use it at run time:

```csharp
NorthwindEntityModel ctx = new NorthwindEntityModel();
string ddlScript = ctx.DeleteDatabaseScript();
```

**Entity Framework v1**

You can generate DDL for Entity Framework v1 too. To do this, perform the following steps:

1. Add a reference to Devart.Data.Oracle.Entity.dll for .NET run-time v2.0 to the project.
2. Extend the ObjectContext partial class with the following methods:

```csharp
using System.Data.EntityClient;
using System.Data.Metadata.Edm;
using Devart.Data.Oracle.Entity;

partial class NorthwindEntityModel {

  public void CreateDatabase() {

    EntityConnection entityConnection = this.Connection as EntityConnection;

    StoreItemCollection store =
        entityConnection.GetMetadataWorkspace().GetItemCollection(DataSpace.SSpace)
        as StoreItemCollection;

    OracleEntityProviderServices oracleEntityProviderServices =
        new OracleEntityProviderServices();

    oracleEntityProviderServices.CreateDatabase(entityConnection.StoreConnection,
        null, store);
  }

  public void DeleteDatabase() {

    EntityConnection entityConnection = this.Connection as EntityConnection;

    StoreItemCollection store =
        entityConnection.GetMetadataWorkspace().GetItemCollection(DataSpace.SSpace)
        as StoreItemCollection;

    OracleEntityProviderServices oracleEntityProviderServices =
        new OracleEntityProviderServices();

    oracleEntityProviderServices.DeleteDatabase(entityConnection.StoreConnection,
        null, store);
  }

  public string CreateDatabaseScript() {

    EntityConnection entityConnection = this.Connection as EntityConnection;

    StoreItemCollection store =
        entityConnection.GetMetadataWorkspace().GetItemCollection(DataSpace.SSpace)
        as StoreItemCollection;

    OracleEntityProviderServices oracleEntityProviderServices =
        new OracleEntityProviderServices();

    return oracleEntityProviderServices.CreateDatabaseScript(null, store);
  }

  public string DeleteDatabaseScript() {

    EntityConnection entityConnection = this.Connection as EntityConnection;

    StoreItemCollection store =
        entityConnection.GetMetadataWorkspace().GetItemCollection(DataSpace.SSpace)
        as StoreItemCollection;

    OracleEntityProviderServices oracleEntityProviderServices =
        new OracleEntityProviderServices();

    return oracleEntityProviderServices.DeleteDatabaseScript(null, store);
  }
}
```

The functionality of the provider-specific DbProviderServices class is used for the DDL generation.
Here is a full list of the provider-specific classes and assemblies:

| Connector | Assembly | Namespace | DbProviderServices class name |
|---|---|---|---|
| [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) | Devart.Data.Oracle.Entity.dll | Devart.Data.Oracle.Entity | OracleEntityProviderServices |
| [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) | Devart.Data.MySql.Entity.dll | Devart.Data.MySql.Entity | MySqlEntityProviderServices |
| [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) | Devart.Data.PostgreSql.Entity.dll | Devart.Data.PostgreSql.Entity | PgSqlEntityProviderServices |
| [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) | Devart.Data.SQLite.Entity.dll | Devart.Data.SQLite.Entity | SQLiteEntityProviderServices |

[dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
2 COMMENTS

Gerhard Sommer, January 4, 2011 at 11:35 am:
When I try to create a database script using dotConnect for Oracle 6.0.70.0 and .NET 4, the result is an empty string. I am using the Fluent API of Entity Framework CTP5 to create the model. Then I get the ObjectContext using the IObjectContextAdapter interface on my DbContext. Best Regards, Gerhard Sommer

Devart, January 10, 2011 at 9:30 am:
Thank you for the report, we have already fixed this problem. The fix will be available in the nearest build. We plan to release the new build in a week or so.
Comments are closed."} {"url": "https://blog.devart.com/dynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) [What’s New](https://blog.devart.com/category/whats-new) Dynamics CRM OAuth 2.0 Authentication Support in Excel Add-ins 2.4 By [dotConnect Team](https://blog.devart.com/author/dotconnect) June 11, 2020 [0](https://blog.devart.com/dynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html#respond) 2394 Devart is glad to announce the release of [Excel Add-ins 2.4](https://www.devart.com/excel-addins/) with support for Dynamics CRM OAuth 2.0 authentication and minor improvements for Marketo and Shopify. The new version of Excel Add-in for Dynamics CRM allows users to connect to Dynamics CRM using OAuth 2.0 authentication. They can perform web login to Dynamics CRM instead of specifying their user id and password in the connection settings. This also means that when you store an OAuth 2.0 connection in workbooks or in Excel with the Allow saving password selected, in fact, an OAuth 2.0 refresh token is stored instead of user id and password. Feel free to [download the new versions of Devart Excel Add-ins](https://www.devart.com/excel-addins/universal-pack/download.html) , try the new functionality, and [leave feedback](https://www.devart.com/excel-addins/universal-pack/feedback.html?pn=Devart%20Excel%20Add-ins) ! Tags [dynamics crm](https://blog.devart.com/tag/dynamics-crm) [excel addins](https://blog.devart.com/tag/excel-addins) [what's new excel addins](https://blog.devart.com/tag/whats-new-in-excel-addins) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. 
They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fdynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html) [Twitter](https://twitter.com/intent/tweet?text=Dynamics+CRM+OAuth+2.0+Authentication+Support+in+Excel+Add-ins+2.4&url=https%3A%2F%2Fblog.devart.com%2Fdynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/dynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html&title=Dynamics+CRM+OAuth+2.0+Authentication+Support+in+Excel+Add-ins+2.4) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/dynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html&title=Dynamics+CRM+OAuth+2.0+Authentication+Support+in+Excel+Add-ins+2.4) [Copy URL](https://blog.devart.com/dynamics-crm-oauth-2-0-authentication-support-in-excel-add-ins-2-4.html) RELATED ARTICLES [ODBC](https://blog.devart.com/category/odbc) [Meet Our Newly Updated ODBC Solutions](https://blog.devart.com/meet-our-newly-updated-odbc-solutions.html) April 14, 2025 [Product Release](https://blog.devart.com/category/product-release) [dbForge Tools for SQL Server v7.1 Released: Extended Connectivity and Compatibility!](https://blog.devart.com/dbforge-tools-v7-1-released.html) April 9, 2025 [Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [Newly Updated Delphi Data Access Components With Support for RAD Studio 64-bit IDE, RAD Studio 12.3, and Lazarus 3.8](https://blog.devart.com/newly-updated-delphi-data-access-components-with-support-for-rad-studio-64-bit-ide-rad-studio-12-3-and-lazarus-3-8.html) March 27, 2025"} {"url": "https://blog.devart.com/dynamics-crm-oauth-2-0-support-and-other-improvements-in-ssis-components-1-13.html", "product_name": "Unknown", 
"content_type": "Blog", "content": "[SSIS Components](https://blog.devart.com/category/products/ssis-components) [What’s New](https://blog.devart.com/category/whats-new) Dynamics CRM OAuth 2.0 Support and Other Improvements in SSIS Components 1.13 By [dotConnect Team](https://blog.devart.com/author/dotconnect) June 15, 2020 [0](https://blog.devart.com/dynamics-crm-oauth-2-0-support-and-other-improvements-in-ssis-components-1-13.html#respond) 2313 Devart is glad to announce the release of [SSIS Data Flow Components 1.13](https://www.devart.com/ssis/) with support of OAuth 2.0 authentication for Dynamics CRM and a number of improvements for other cloud sources and databases. The new version of SSIS Data Flow Components for Dynamics CRM allows users to connect to Dynamics CRM using OAuth 2.0 authentication. They can perform web login to Dynamics CRM instead of specifying their user id and password in the connection settings. This also means that when you store an OAuth 2.0 connection in your SSIS package, in fact, an OAuth 2.0 refresh token is stored instead of user id and password. Besides, the new version of SSIS Components now support Dynamics CRM MultiSelect Option Set columns and PostgreSQL composite types. SSIS Data Flow Components for Magento now can obtain and load order items information for sales orders, and thus, allow you importing new sales orders into Magento 2.x. We have also improved web login process in SSIS Data Flow Components for BigQuery, allowing you to use system default browser. Additionally, SSIS Data Flow Components for Mailchimp now support the Opened table, providing access to the information about subscribers who opened a campaign email. You are welcome to [download](https://www.devart.com/ssis/universal-bundle/download.html) the updated version of our SSIS Data Flow Components and send [feedback](https://www.devart.com/ssis/universal-bundle/feedback.html?pn=Devart%20SSIS%20Data%20Flow%20Components) . 
"} {"url": "https://blog.devart.com/easily-connect-to-postgresql-via-odbc-drivers-setup-guide.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) Easily Connect to PostgreSQL via ODBC Drivers: Setup Guide By [Max Remskyi](https://blog.devart.com/author/max-remskyi) February 24, 2023

By using an [ODBC driver](https://blog.devart.com/category/products/odbc-drivers) to connect to a PostgreSQL database, developers can use a variety of programming languages and tools that support ODBC to interact with the database, rather than being limited to those with native support for PostgreSQL. Here is a guide on how to use the [PostgreSQL ODBC Driver](https://www.devart.com/odbc/postgresql/) with ease.

Contents
- Download the Devart ODBC driver for PostgreSQL
- How to Install the PostgreSQL ODBC driver
- How to Connect to PostgreSQL using the ODBC Driver
- List of Compatibility

**Download the Devart ODBC driver for PostgreSQL**

Devart is a third-party company that provides ODBC drivers for various databases, including PostgreSQL.
To download the Devart ODBC driver for PostgreSQL, follow these steps:

1. Go to the Devart website.
2. On the [page for the Devart ODBC driver for PostgreSQL](https://www.devart.com/odbc/postgresql/download.html), click Download.
3. Select the version of the driver that is compatible with your operating system and architecture (32-bit or 64-bit).
4. Click Download to start the download process.
5. Once the download is complete, run the installer and follow the prompts to install the driver.

After the installation is complete, you should be able to configure a connection to your PostgreSQL database using the Devart ODBC driver. Note: The steps and UI may change as the Devart website is updated, so it is better to check the website for the latest steps. You might also want to learn about [10 ODBC drivers for marketing, planning and collaboration services](https://blog.devart.com/10-new-odbc-drivers-for-marketing-planning-collaboration-services.html).

**How to Install the PostgreSQL ODBC driver**

The PostgreSQL ODBC driver is available for all major operating systems, including Windows, macOS, and Linux. The Devart ODBC driver for PostgreSQL supports all of them, and the driver can be used with a variety of applications and development environments. The PostgreSQL project also provides its official ODBC driver, which can be downloaded from the site and compiled for use on various operating systems. The official PostgreSQL ODBC driver supports Windows, macOS, and Linux, as well as many other operating systems, such as various versions of UNIX and BSD. It is important to note that the steps to install and configure the ODBC driver may vary depending on the operating system and the specific version of the driver that you are using.
It is always better to check the documentation provided by the driver developer or the database vendor to ensure that you are using the correct version of the driver and following the correct installation and configuration steps for your specific operating system. If you are deciding between OLE DB and ODBC, we suggest [this article](https://blog.devart.com/oledb-vs-odbc-which-driver-to-choose.html).

**Installation on Windows**

To install the Devart PostgreSQL ODBC driver on a [Windows operating system](https://docs.devart.com/odbc/postgresql/windows.htm), follow these steps:

1. Download the Devart ODBC driver for PostgreSQL from the Devart website. Make sure to select the version that is compatible with your operating system and architecture (32-bit or 64-bit).
2. Once the download is complete, double-click the installer file to start the installation process.
3. Follow the prompts to install the driver. You may be asked to agree to the terms of the license agreement and to specify the location where you want to install the driver.
4. Once the installation is complete, you should be able to configure a connection to your PostgreSQL database using the Devart ODBC driver.

To configure a connection, you can use the ODBC Data Source Administrator tool that comes with Windows (read more about configuration [here](https://blog.devart.com/configuring-an-odbc-driver-manager-on-windows-macos-and-linux.html)). This tool can be accessed by going to Start > Control Panel > Administrative Tools > Data Sources (ODBC). In the ODBC Data Source Administrator, you can add a new data source by clicking the Add button. From the list of available drivers, select the Devart PostgreSQL ODBC driver and then follow the prompts to enter the necessary connection details, such as the server name, port number, and database name. Once you have configured the connection, you can test it by clicking the Test button.
If the test is successful, you should be able to connect to your PostgreSQL database using the Devart ODBC driver. Here is a piece on [4 ways to test the connection](https://blog.devart.com/4-ways-to-test-an-odbc-connection.html).

**Installation on macOS**

1. Download the ODBC driver for PostgreSQL from the Devart website. Make sure to select the version that is compatible with your [operating system (macOS)](https://docs.devart.com/odbc/postgresql/macos.htm).
2. Once the download is complete, double-click the disk image file (.dmg) to mount it.
3. Double-click the installer package (.pkg) to start the installation process.
4. Follow the prompts to install the driver. You may be asked to agree to the terms of the license agreement and to specify the location where you want to install the driver.
5. Once the installation is complete, you should be able to configure a connection to your PostgreSQL database using the Devart ODBC driver.

To configure a connection, you can use the ODBC Administrator tool that comes with macOS, accessible via Applications > Utilities > ODBC Administrator. In the ODBC Administrator, you can add a new data source by clicking the Add button. From the list of available drivers, select the Devart PostgreSQL ODBC driver and then follow the prompts to enter the necessary connection details, such as the server name, port number, and database name. Once you have configured the connection, you can test it by clicking the Test button. If the test is successful, you should be able to connect to your PostgreSQL database using the Devart ODBC driver. We also recommend reading about [Salesforce ODBC connectors' advantages over cloud tools](https://blog.devart.com/salesforce-odbc-connectors-advantages-over-cloud-tools.html).

**Installation on Linux**

1. Download the ODBC driver for PostgreSQL from the Devart website.
Make sure to select the version that is compatible with your [operating system (Linux)](https://docs.devart.com/odbc/postgresql/linux.htm). Then:

2. Once the download is complete, extract the archive file to a directory of your choice.
3. Navigate to the directory where you extracted the archive.
4. Make the installer script, typically named install.sh, executable: `chmod +x install.sh`
5. Run the installer script with administrative privileges. On most Linux systems, you can do this with: `sudo ./install.sh`
6. Follow the prompts to install the driver. You may be asked to agree to the terms of the license agreement and to specify the location where you want to install the driver.
7. Once the installation is complete, you should be able to configure a connection to your PostgreSQL database using the Devart ODBC driver.

To configure a connection, you can use the unixODBC tools that come with most Linux distributions, including the odbcinst and isql commands. With the unixODBC tools, you add a new data source by editing the appropriate configuration files; the exact steps depend on your Linux distribution and the version of unixODBC that you are using. Once you have configured the connection, you can test it using the isql command. If the test is successful, you should be able to connect to your PostgreSQL database using the Devart ODBC driver.

**How to Connect to PostgreSQL using the ODBC Driver**

There are [several ways of connecting to PostgreSQL databases](https://blog.devart.com/connect-to-postgresql-database.html), using the standard psql utility and GUI tools. One more option is to connect via the Devart ODBC driver on different operating systems, including Windows, macOS, and Linux.
The Devart ODBC driver for PostgreSQL supports multiple platforms, so you can use the same driver to connect to PostgreSQL on different operating systems. To connect to PostgreSQL using the Devart ODBC driver, follow the installation instructions for your specific operating system, configure a connection to your PostgreSQL database, and then use that connection in your application to connect to the database and perform various database operations. Note: The exact steps for installation and connection configuration may vary depending on the version of the Devart ODBC driver and the operating system you are using; it is better to check the Devart website for the latest instructions.

**Windows Configuration**

To configure a connection to a PostgreSQL database using the Devart ODBC driver on Windows, follow these steps:

1. Open the ODBC Data Source Administrator tool by going to Start > Control Panel > Administrative Tools > Data Sources (ODBC).
2. Click the System DSN tab and then click the Add button to add a new data source.
3. From the list of available drivers, select the Devart PostgreSQL ODBC driver and then click Finish.
4. In the Create a New Data Source wizard, enter a name for the data source and then enter the connection details, such as the server name, port number, database name, username, and password.
5. Once you have entered the connection details, click the Test button. If the test is successful, you will see a message indicating that the connection succeeded.
6. Click OK to save the connection and close the wizard.

You can now use the data source you created in your application to connect to the PostgreSQL database.
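Instead of a configured DSN, applications can also pass the same properties in a single DSN-less ODBC connection string. Below is a minimal sketch of assembling one; the driver name and keyword spellings are illustrative assumptions, so check the Devart ODBC driver documentation for the exact names:

```python
# Sketch: building a DSN-less ODBC connection string from the same
# properties entered in the DSN wizard above. The keyword names
# (Driver, Server, Port, Database, User ID, Password) are illustrative
# assumptions -- verify them against the Devart driver documentation.

def build_connection_string(server, port, database, user, password):
    parts = {
        "Driver": "{Devart ODBC Driver for PostgreSQL}",
        "Server": server,
        "Port": port,
        "Database": database,
        "User ID": user,
        "Password": password,
    }
    # ODBC connection strings are semicolon-separated key=value pairs.
    return ";".join(f"{key}={value}" for key, value in parts.items())

print(build_connection_string("localhost", 5432, "postgres", "postgres", "secret"))
```

Such a string would typically be handed to an ODBC bridge such as pyodbc or to any ODBC-aware application, in place of a DSN name.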
You can read even more about configuration [here](https://blog.devart.com/installing-odbc-driver-and-creating-data-source-on-windows.html).

**Linux Configuration**

1. Install the unixODBC tools. On most Linux distributions, you can do this using your distribution's package manager.
2. Create an ODBC data source configuration file, usually located at /etc/odbc.ini, using a text editor.
3. In the configuration file, add the following information, replacing DataSourceName with a name for your data source and the other values with the connection details for your PostgreSQL database:

```
[DataSourceName]
Driver = Devart PostgreSQL ODBC Driver
Server = server_name
Port = port_number
Database = database_name
Username = username
Password = password
```

4. Save the ODBC data source configuration file.
5. Use the odbcinst command to register the Devart ODBC driver for PostgreSQL: `odbcinst -i -d -f /path/to/odbcinst.ini` (replace /path/to/odbcinst.ini with the path to the odbcinst.ini file that comes with the Devart ODBC driver for PostgreSQL).
6. Use the isql command to test the connection to your PostgreSQL database: `isql DataSourceName username password` (replace DataSourceName, username, and password with the appropriate values).
7. If the test is successful, you should be able to connect to your PostgreSQL database using the Devart ODBC driver.

Note: The steps and the commands may change as the Devart website is updated; it is better to check the website for the latest steps.

**macOS Configuration**

1. Install the unixODBC tools. On macOS, you can do this with the Homebrew package manager: `brew install unixodbc`
2. Create an ODBC data source configuration file.
This file is usually located at /usr/local/etc/odbc.ini. You can create it with a text editor.
3. In the configuration file, add the following:

[DataSourceName]
Driver = Devart PostgreSQL ODBC Driver
Server = server_name
Port = port_number
Database = database_name
Username = username
Password = password

4. Replace “DataSourceName” with a name for your data source, and replace the other values with the connection details for your PostgreSQL database.
5. Save the configuration file.
6. Use the odbcinst command to register the Devart ODBC driver for PostgreSQL:

odbcinst -i -d -f /path/to/odbcinst.ini

Replace “/path/to/odbcinst.ini” with the path to the odbcinst.ini file that ships with the Devart ODBC driver for PostgreSQL.
7. Use the isql command to test the connection to your PostgreSQL database:

isql DataSourceName username password

Replace “DataSourceName” with the name of the data source you created, and “username” and “password” with the appropriate values.
8. If the test succeeds, you can connect to your PostgreSQL database using the Devart ODBC driver.

List of Compatibility

The Devart ODBC driver for PostgreSQL provides [compatibility with third-party tools](https://www.devart.com/odbc/third-party-tools.html) and platforms.
Here is a partial list of supported tools:

Application Development Tools: Adobe ColdFusion, Embarcadero Delphi & C++Builder, FileMaker, Lazarus, Microsoft Visual FoxPro, Microsoft Visual Studio, Omnis Studio, PHP, PowerBASIC, Python

Database Management Tools: Aqua Data Studio, dbForge Studio, DBeaver, EMS SQL Management Studio, Informatica Cloud, RazorSQL, SQL Server Data Tools, SQL Server Management Studio, SQL Server Reporting Services

BI & Analytics Software: Alteryx, DBxtra, Dundas BI, IBM SPSS Statistics, MicroStrategy, Power BI, Qlik Sense, QlikView, RStudio, SAP Crystal Reports, SAS JMP, Tableau, TARGIT, TIBCO Spotfire

Office Software Suites: LibreOffice, Microsoft Access, Microsoft Excel, OpenOffice, StarOffice

Conclusion

Connecting to a PostgreSQL database using an ODBC driver is a relatively straightforward process, but the specific steps depend on your operating system and the ODBC driver you are using. With the Devart ODBC driver for PostgreSQL, you can connect to your PostgreSQL databases from a wide range of third-party tools and platforms, including [Microsoft Excel](https://www.devart.com/odbc/excel/), Microsoft Access, [Microsoft Power BI](https://www.devart.com/odbc/powerbi/), [Tableau](https://www.devart.com/odbc/tableau/), QlikView, [Oracle Data Integrator (ODI)](https://www.devart.com/odbc/oracle/), Informatica PowerCenter, and Talend. Overall, connecting to PostgreSQL via an ODBC driver is a convenient and flexible way to access your data, as long as you follow the correct steps for your environment.
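Before reaching for isql, the odbc.ini template from the Linux and macOS sections can be sanity-checked with a short script. This is a sketch; the DSN name, key names, and values are the placeholders used in this guide:

```python
import configparser

# Example odbc.ini content mirroring the template above (values are placeholders).
ODBC_INI = """[PostgresDSN]
Driver = Devart PostgreSQL ODBC Driver
Server = localhost
Port = 5432
Database = mydb
Username = postgres
Password = secret
"""

# Keys the connection cannot work without.
REQUIRED_KEYS = {"driver", "server", "port", "database"}

def check_dsn(ini_text, dsn):
    """Return the set of required keys missing from a DSN section."""
    cfg = configparser.ConfigParser()
    cfg.read_string(ini_text)
    present = {key.lower() for key in cfg[dsn]}
    return REQUIRED_KEYS - present

print(check_dsn(ODBC_INI, "PostgresDSN"))  # set() -> nothing is missing
```

On a real system you would read /etc/odbc.ini (or /usr/local/etc/odbc.ini on macOS) instead of the inline sample.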
Tags: [odbc](https://blog.devart.com/tag/odbc), [PostgreSQL](https://blog.devart.com/tag/postgresql), [setup](https://blog.devart.com/tag/setup). By [Max Remskyi](https://blog.devart.com/author/max-remskyi), DAC Team.

[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) Easy Magento Bulk Order Processing Using Excel and Add-in By [Victoria
Shyrokova](https://blog.devart.com/author/victorias) December 27, 2024

Receiving a flood of orders in your Magento store is exciting, but handling each one separately is not that great. With Magento bulk order processing, you can quickly process shipments, create invoices, update tracking information, and modify the status of several orders at once. This will save you time and help you manage high volumes of orders without worrying about manual errors. Plus, you can easily import and export your Magento bulk order data into a spreadsheet and adjust literally any part of the ordering process directly from Excel. When combined with a robust add-in, managing your inventory and fulfilling multiple orders boils down to just a few clicks. There are also several Excel formulas you can use to automate tasks like calculating shipping costs, applying discounts, and generating reports.

Table of contents
- How to streamline data from Magento to Excel
- Practical applications
- Key Excel formulas for inventory management
- Conclusion

How to streamline data from Magento to Excel

The Magento 2 order processing workflow is kind of clunky, but you can use plenty of tools to streamline the process. Magento offers a built-in import feature, but you need to create the CSV files manually, map the fields, and upload them into Magento. Exporting can be rather cumbersome too. You have to go into the Sales section, export the order data in CSV format, and then open that in Excel. If you’re working with large datasets or making frequent updates, this takes too much time. Unlike these standard methods, Excel add-ins allow real-time data synchronization and bulk editing from within Excel. That means you can quickly update bulk orders without the hassle of switching between applications.
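The manual CSV route can also be scripted. Below is a small Python sketch that summarizes a hypothetical Magento order export; the column names (increment_id, status, grand_total) are assumptions for illustration, since real exports carry many more fields:

```python
import csv
from io import StringIO

# Hypothetical sample of a Magento sales order export (columns vary by store).
SAMPLE = """increment_id,status,grand_total
000000101,pending,49.99
000000102,processing,120.00
000000103,pending,15.50
"""

def pending_total(csv_text):
    """Sum grand_total over orders still in 'pending' status."""
    reader = csv.DictReader(StringIO(csv_text))
    return round(sum(float(row["grand_total"]) for row in reader
                     if row["status"] == "pending"), 2)

print(pending_total(SAMPLE))  # 65.49
```

This kind of one-off glue script covers simple totals, but it is exactly the sort of manual work that a synchronized add-in approach avoids.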
Most of them include advanced features like customizable reporting and automated data updates. A few offer 30-day free trials to test the product before committing, such as the [Devart Excel Add-in](https://www.devart.com/excel-addins/magento/).

Setting up the Devart Excel Add-in

The Devart Excel Add-in for Magento lets you import and export data between your store and Excel with just a few clicks. You can work with products, orders, and categories like standard Excel sheets, and instantly push changes to your Magento store as soon as you’re finished. Plus, it lets you use SQL queries to pull out exactly the information you need. Installing and setting up the add-in is pretty simple. Here’s what you need to do, step by step.

Before you start:
- Verify that you’re using Microsoft Excel 2007 or newer (up to 2021).
- Close all open workbooks in Excel.
- Make sure you have admin rights on your computer to install the add-in.
- Download and install .NET Framework 4.0 or later, as well as Visual Studio 2010 Tools for Office Runtime.

Step 1: Download the Excel Add-in

Go to the Devart Excel Add-in for Magento page and click the download button. You will get a file named devartexcel.exe. Once downloaded, you should find it in the Downloads folder. If you’re using Chrome, just click the Downloads icon in the top right corner to open the file or locate the folder where it was saved.

Step 2: Run the installer

Next, double-click the downloaded installer file to launch the Setup Wizard. From here, simply follow the on-screen instructions. You’ll be asked to choose a destination folder, which is where the Devart Excel Add-in will be stored on your computer. Pick a spot where you have permission to save files, like your Documents folder, the Program Files directory, or your User folder. Then, you’ll have to select which components of the add-in you want to install. There are several options, but don’t worry: just focus on the Magento basics.
Check the box next to Adobe Commerce. If you plan to work with other data sources, you might also want to add the SQL Server and MySQL components. If you’re unsure, it’s best to go for the Full installation option in the dropdown menu. The [Devart Excel Add-in Universal Pack](https://www.devart.com/excel-addins/universal-pack/) doesn’t take up much space and gives you all the features and flexibility you might need in the future. When you’re done, click Next. To make sure everything installs correctly, don’t interrupt the process by closing windows or doing other things on your computer. Once the installation is finished, you’ll see a confirmation screen. Click Finish to close the Setup Wizard.

Step 3: Connect to your Magento store

With the Devart Add-in installed, you can now connect Excel to your Magento store. In the Devart tab, click the Get Data button to open the Import Data Wizard. In the wizard, choose Adobe Commerce as your data source. Next, fill in the rest of the fields with details about your Magento 2 store:
- Domain: your store’s URL.
- User ID: your Magento admin username.
- Password: your Magento admin password.

Don’t forget to check the Allow reuse connection in Excel checkbox so that you can access your connections later through Manage Connections on the Devart tab. This opens the Connections Editor, where you can tweak an existing connection in a snap, for example to refresh your login details or API keys.

Practical applications

So, how exactly can you use Excel and Devart’s add-in to streamline your Magento 2 order processing?

Importing bulk orders using Excel Add-ins

If you’ve connected your store and need to quickly import orders from Excel, go to the Devart tab and click Import Data. Then:
- Choose your order datasheet and select Sales Order as the entity to import.
- Switch to the Visual Query Builder to create and edit SQL queries without writing code.
You can pick database objects (e.g., tables) and their columns, apply filters, and set sorting options.
- Review your data in the Data Preview window before completing the import.
- Click Edit Mode to modify cells or add new rows by filling in the green-highlighted empty row or right-clicking to insert a new row.
- After making edits, hit Commit to save changes back to Magento or Rollback to discard them.
- Lastly, open your Magento admin panel and check the Orders section to make sure everything migrated correctly.

Managing inventory levels

To analyze your Magento inventory and spot low-stock items:
1. Configure your Magento settings under Stores > Configuration > Catalog > Inventory.
2. Go to the Devart tab, click Export Data, and select Product to export your product data into Excel.
3. Review the data for accuracy, ensuring you have columns for Product SKU, Reorder Point, Supplier Information, Lead Time, Cost Price, Selling Price, and Last Updated.
4. Use conditional formatting to highlight low stock levels: set cells to turn red when stock drops below the reorder point.

Key Excel formulas for inventory management

While the Devart Add-in simplifies Magento bulk order processing, you can combine it with Excel formulas to get even deeper insights and automate tasks. Here are some of the most useful:

SUM and SUMIFS: Use SUM to calculate total quantities, like =SUM(B2:B10). For more specific totals, use SUMIFS to filter by criteria, such as =SUMIFS(C2:C10, A2:A10, "Laptops", B2:B10, ">10"), which sums values in column C for "Laptops" rows with stock levels over 10.

COUNTIF and COUNTIFS: COUNTIF lets you count items meeting a condition, e.g., =COUNTIF(B2:B10, "<5") for stock below 5. For multiple conditions, use COUNTIFS, like =COUNTIFS(A2:A10, "Electronics", B2:B10, "<5").

MIN and MAX: Use MIN to find the lowest stock level and MAX to identify top-selling items. For example, =MIN(B2:B10) and =MAX(C2:C10).

SORT: This function can organize your data.
For instance, =SORT(A2:C10, 3, 1) sorts the range A2:C10 by the values in the third column (e.g., stock levels) in ascending order.

Conclusion

Magento bulk order processing doesn’t have to be a nightmare. The solution is a simple Excel spreadsheet and a reliable add-in that integrates seamlessly with both platforms. If you’re looking to streamline your workflow, [try the Devart Excel Add-in for Magento 2](https://www.devart.com/excel-addins/magento/). You’ll be able to import and export large datasets, customize your inventory management, and update your Magento store in real time directly from Excel, all within a few minutes.

Tags: [bulk order processing](https://blog.devart.com/tag/bulk-order-processing), [magento bulk order processing](https://blog.devart.com/tag/magento-bulk-order-processing). By [Victoria Shyrokova](https://blog.devart.com/author/victorias): I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Easy ways to quickly exclude objects from schema synchronization By [dbForge Team](https://blog.devart.com/author/dbforge) August 4, 2009
Many users of [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) ask us how to exclude all the objects, or a group of objects, from synchronization at once. It is not a problem. Take a look at the document with comparison results. In the middle of the grid, between the Source Objects and Target Objects columns, you can see two columns: one with check boxes (these let you exclude objects from the synchronization or include them), the other with the operations that will be applied to the Target objects during the synchronization.
Document with comparison results

4 ways to exclude all the objects from the synchronization:

1. Click the check box in the header of the column with check boxes. This clears the check boxes next to all the objects shown in the grid, and all of them will be excluded from the synchronization, no matter what operation the objects have. (To include all the objects, select the top check box again.)
Using the check box at the top of the grid
2. Highlight all the objects in the grid by pressing Ctrl+A and clear the check box next to any object. Note: you should expand all object groups in the grid before highlighting, otherwise collapsed objects won’t be highlighted and excluded from the synchronization, as they are filtered by the group. (To include the highlighted objects in the synchronization, select any of their check boxes.)
Excluding all the objects by using the check box at the header of the grid
3. Highlight the objects using Ctrl+A and use the Exclude Selection option from the right-click menu. (To include the objects, use the Include Selection option.)
Excluding all the objects by using the right-click menu
4. (not recommended) Clear the check boxes next to each object in the grid one by one.

5 ways to exclude a group of objects from the synchronization:

1.
Highlight the required objects holding the Ctrl or Shift keys, then clear any of the highlighted objects’ check boxes. (If you select any of these check boxes, the highlighted objects will be included in the synchronization.)
Excluding a group of objects by using the check box at the header of the grid
2. Highlight the objects using the Ctrl or Shift keys and use the Exclude Selection option from the right-click menu. (To include the objects, use the Include Selection option.)
Excluding a group of objects using a right-click menu
3. This way is good for excluding a group of objects that can be filtered by a set of symbols in their names. For example, suppose you would like to exclude all the objects containing the word “address” in the name. Enter address into the Filter field at the header of the grid. The grid will show all the objects matching this criterion. Select the check box at the header of the grid.
Excluding a group of objects filtered by a set of symbols in their names
4. This way is good for excluding a group of objects with the same status, for example Different objects (ones found in both Source and Target databases, but with differences in DDL). Click the Filter icon on the Comparison toolbar and select Different from the drop-down list. The grid will show only Different objects. Select the check box at the header of the grid.
Excluding a group of objects filtered by their status
5. This way combines the two previous ones. You filter the objects by their status (e.g., Only in Source, Only in Target, etc.) and by a set of symbols in the objects’ names. When the grid shows all the matching objects, select the check box at the header of the grid.

[Download a free 30-day evaluation version](https://www.devart.com/dbforge/sql/schemacompare/download.html) of dbForge Schema Compare for SQL Server and practice easy synchronization.
Tags: [Schema Compare](https://blog.devart.com/tag/schema-compare), [synchronize database](https://blog.devart.com/tag/synchronize-database). By [dbForge Team](https://blog.devart.com/author/dbforge).

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) Relationships in Entity Framework Core:
Complete Guide for .NET Developers By [Anna Bilchenko](https://blog.devart.com/author/annabil) February 24, 2025

[Entity Framework Core (EF Core)](https://www.devart.com/dotconnect/what-is-entity-framework-core.html) is a modern object-relational mapper (ORM) for .NET Core and .NET applications, enabling efficient database interaction while minimizing the need for raw SQL queries. A fundamental aspect of EF Core is defining entity relationships, ensuring data consistency and referential integrity in relational databases. This guide explores one-to-one, one-to-many, and many-to-many relationships in EF Core, covering their implementation, configuration, and best practices. Additionally, we will introduce Entity Developer, a powerful ORM modeling tool that simplifies relationship management, code generation, and schema synchronization for EF Core applications. By the end of this article, you’ll have a clear understanding of how to structure EF Core entity relationships effectively and how to leverage tools to optimize database design and management.

Contents
- Understanding Entity Relationships in EF Core
- One-to-One Relationships in EF Core
- Many-to-One Relationships in EF Core
- Many-to-Many Relationships in EF Core
- Advanced Relationship Management in EF Core
- Tools for Simplifying ORM Work in EF Core
- Conclusion

Understanding Entity Relationships in EF Core

In database design, entity relationships define how tables are connected; they keep data consistent and query execution efficient. EF Core relationships are established with the help of navigation properties and foreign keys, which enables seamless class-based access to relational data. EF Core supports the following main relationship types:

One-to-One (1:1) – Each item in one table connects to only one item in another table.
This link is created using a foreign key and an EF Core navigation property.

One-to-Many (1:N) – One item in a table can be linked to multiple items in another table. The connection is made using a foreign key in the related (dependent) table.

Many-to-One (N:1) – The inverse of One-to-Many, where multiple items in one table connect to a single item in another table.

Many-to-Many (N:N) – Multiple items in one table can be linked to multiple items in another table. A junction table containing foreign keys is used to store these connections.

Defining relationships with foreign keys and navigation properties is what allows EF Core to manage relational data productively while keeping queries simple.

One-to-One Relationships in EF Core

An EF Core One-to-One (1:1) relationship occurs when one entity is linked to exactly one related entity. For example, a User may have a UserProfile, where each user corresponds to a single profile.

When to use One-to-One relationships?
- Separating concerns (e.g., storing sensitive user details separately).
- Optimizing performance (e.g., moving rarely accessed data to another table).

Configuring One-to-One Relationships

1. By Convention: EF Core automatically detects one-to-one relationships when a navigation property is present in both entities.

```csharp
public class User
{
    public int UserId { get; set; }
    public string Username { get; set; }

    // Navigation property
    public UserProfile UserProfile { get; set; }
}

public class UserProfile
{
    public int UserProfileId { get; set; }
    public string Bio { get; set; }

    // Navigation property
    public User User { get; set; }
}
```

In this example:
- User has a primary key UserId.
- UserProfile has a primary key UserProfileId.
- UserProfile has a navigation property User, and EF Core will look for a foreign key property named UserId by convention.

2.
Using Data Annotations:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class User
{
    [Key]
    public int UserId { get; set; }
    public string Username { get; set; }

    // Navigation property
    public UserProfile UserProfile { get; set; }
}

public class UserProfile
{
    [Key]
    [ForeignKey("User")]
    public int UserProfileId { get; set; }
    public string Bio { get; set; }

    // Navigation property
    public User User { get; set; }
}
```

In this example:
- User has a navigation property UserProfile to access the related UserProfile entity.
- UserProfile has a navigation property User to access the related User entity.

3. DbContext with Fluent API Configuration:

```csharp
using Microsoft.EntityFrameworkCore;

public class ApplicationDbContext : DbContext
{
    public DbSet<User> Users { get; set; }
    public DbSet<UserProfile> UserProfiles { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<User>()
            .HasOne(u => u.UserProfile)
            .WithOne(up => up.User)
            .HasForeignKey<UserProfile>(up => up.UserProfileId);
    }
}
```

In this example:
- HasOne specifies that a User has one UserProfile.
- WithOne specifies that a UserProfile is associated with one User.
- HasForeignKey specifies that UserProfileId in UserProfile is the foreign key that establishes the relationship with User.

Usage of One-to-One Relationships

```csharp
using (var context = new ApplicationDbContext())
{
    var user = new User
    {
        Username = "johndoe",
        UserProfile = new UserProfile
        {
            Bio = "Software Developer"
        }
    };

    context.Users.Add(user);
    context.SaveChanges();
}
```

Many-to-One Relationships in EF Core

A Many-to-One (N:1) relationship is the inverse of a One-to-Many, where multiple entities reference a single parent entity. For instance, in an e-commerce system, multiple Orders belong to a single Customer, but each order is linked to only one customer. In EF Core, such associations model cases where one entity links to multiple dependents.
Configuring Many-to-One Relationships in EF Core

1. By Convention: EF Core automatically detects many-to-one relationships when an entity has a foreign key reference to another entity.

```csharp
public class Author
{
    public int AuthorId { get; set; }
    public string Name { get; set; }

    // Navigation property
    public List<Book> Books { get; set; }
}

public class Book
{
    public int BookId { get; set; }
    public string Title { get; set; }

    // Foreign key property
    public int AuthorId { get; set; }

    // Navigation property
    public Author Author { get; set; }
}
```

In this example:
- Author has a collection navigation property Books to access related Book entities.
- Book has a reference navigation property Author to access the related Author entity.

2. Using Data Annotations:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class Author
{
    [Key]
    public int AuthorId { get; set; }
    public string Name { get; set; }

    // Navigation property
    public List<Book> Books { get; set; }
}

public class Book
{
    [Key]
    public int BookId { get; set; }
    public string Title { get; set; }

    // Foreign key property
    [ForeignKey("Author")]
    public int AuthorId { get; set; }

    // Navigation property
    public Author Author { get; set; }
}
```

In this example:
- Author has a collection navigation property Books to access related Book entities.
- Book has a reference navigation property Author to access the related Author entity.

3. DbContext with Fluent API Configuration:

```csharp
using Microsoft.EntityFrameworkCore;

public class ApplicationDbContext : DbContext
{
    public DbSet<Author> Authors { get; set; }
    public DbSet<Book> Books { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Book>()
            .HasOne(b => b.Author)
            .WithMany(a => a.Books)
            .HasForeignKey(b => b.AuthorId);
    }
}
```

In this example:
- HasOne specifies that a Book has one Author.
- WithMany specifies that an Author can have many Books.
- HasForeignKey specifies that AuthorId in Book is the foreign key that establishes the relationship with Author.

Usage of Many-to-One Relationships

```csharp
using (var context = new ApplicationDbContext())
{
    var author = new Author
    {
        Name = "George Orwell",
        Books = new List<Book>
        {
            new Book { Title = "1984" },
            new Book { Title = "Animal Farm" }
        }
    };

    context.Authors.Add(author);
    context.SaveChanges();
}
```

Many-to-Many Relationships in EF Core

An EF Core Many-to-Many (N:N) relationship occurs when multiple entities are associated with multiple related entities. For example, Students can enroll in multiple Courses, and each course can have multiple students. Similarly, Products can belong to multiple Categories, and categories can contain multiple products.

Configuring Many-to-Many Relationships

1. Using a Join Table (Before EF Core 5.0): Before EF Core 5.0, a junction entity was required to model many-to-many relationships explicitly.

```csharp
public class Student
{
    public int StudentId { get; set; }
    public string Name { get; set; }

    // Navigation property to the join entity
    public List<StudentCourse> StudentCourses { get; set; }
}

public class Course
{
    public int CourseId { get; set; }
    public string CourseName { get; set; }

    // Navigation property to the join entity
    public List<StudentCourse> StudentCourses { get; set; }
}

public class StudentCourse
{
    public int StudentId { get; set; }
    public Student Student { get; set; }

    public int CourseId { get; set; }
    public Course Course { get; set; }
}
```

Explanation:
- Join Table: The StudentCourse entity serves as the join table, containing foreign keys to both Student and Course.
- Composite Key: In the Fluent API configuration, the HasKey method specifies that the combination of StudentId and CourseId serves as the composite primary key for the join table.
- Relationships: The HasOne and WithMany methods define the relationships between StudentCourse and the respective entities (Student and Course).
HasForeignKey specifies the foreign key properties in the join table.

2. Using Implicit Many-to-Many Support (EF Core 5.0+): EF Core automatically creates a join table when defining many-to-many relationships with navigation properties.

```csharp
public class Student
{
    public int StudentId { get; set; }
    public string Name { get; set; }

    // Navigation property
    public List<Course> Courses { get; set; }
}

public class Course
{
    public int CourseId { get; set; }
    public string CourseName { get; set; }

    // Navigation property
    public List<Student> Students { get; set; }
}
```

Explanation:
- Navigation Properties: Both Student and Course entities have navigation properties that are collections of the other entity type.
- Implicit Join Table: EF Core will automatically create a join table named StudentCourse (by default) with foreign keys to both Student and Course.
- Fluent API Configuration: The HasMany and WithMany methods in OnModelCreating can be used to configure the many-to-many relationship explicitly.

Usage of Many-to-Many Relationships

You can use these entities to perform database operations as follows:

```csharp
using (var context = new SchoolDbContext())
{
    var student = new Student
    {
        Name = "Alice",
        Courses = new List<Course>
        {
            new Course { CourseName = "Mathematics" },
            new Course { CourseName = "Physics" }
        }
    };

    context.Students.Add(student);
    context.SaveChanges();
}
```

Advanced Relationship Management in EF Core

Managing Referential Integrity and Cascade Delete

Entity Framework foreign keys enforce referential integrity, ensuring that related records remain consistent. When deleting a parent entity, cascade delete rules determine whether dependent entities are also removed. This is crucial when restructuring relationships or migrating data. EF Core supports three behaviors: Cascade: Deletes dependent entities automatically. Restrict: Prevents deletion if related entities exist.
SetNull: Sets foreign key values to NULL instead of removing the record.

Here is how you can configure cascade delete with the Fluent API:

using Microsoft.EntityFrameworkCore;

public class LibraryDbContext : DbContext
{
    public DbSet<Author> Authors { get; set; }
    public DbSet<Book> Books { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Book>()
            .HasOne(b => b.Author)
            .WithMany(a => a.Books)
            .HasForeignKey(b => b.AuthorId)
            .OnDelete(DeleteBehavior.Cascade);
    }
}

Managing cascade delete properly prevents orphaned records and keeps your data consistent. Compare EF Core with other [.NET ORM frameworks](https://blog.devart.com/best-postgresql-orm-in-dotnet.html) to learn which one suits you better.

Tools for Simplifying ORM Work in EF Core

Developers working with EF Core, NHibernate, and LinqConnect often need a visual tool to manage entity relationships efficiently. Entity Developer simplifies ORM design by letting developers define relationships visually instead of coding them manually, while dotConnect enhances database connectivity, providing high-performance integration for EF Core and other ORMs.

How Entity Developer Helps Manage EF Core Relationships

[Entity Developer](https://www.devart.com/entitydeveloper/) allows programmers to define One-to-One, One-to-Many, Many-to-One, and Many-to-Many relationships graphically, eliminating the need for manual configuration in code. It automatically generates the EF Core model from the visual schema, reducing errors and speeding up development across an entire project. .NET and C# programmers benefit from Entity Developer by optimizing ORM mapping, keeping data models consistent, and spending less time debugging relationship configurations. With its intuitive interface, it is an essential tool for handling complex database schemas efficiently.
Download a free trial of [Entity Developer](https://www.devart.com/entitydeveloper/download.html) to simplify EF Core relationship management.

dotConnect: A Universal Connector for EF Core and Other ORMs

dotConnect is a high-performance [ADO.NET provider](https://www.devart.com/dotconnect/entityframework.html) that enables EF Core applications to connect to PostgreSQL, MySQL, SQLite, Oracle, and more. dotConnect does not alter EF Core relationships; instead, it enhances database connectivity and query execution while improving overall application stability. With built-in optimizations for EF Core, NHibernate, and LinqConnect, dotConnect improves data access speed and reduces latency. If you need database communication that runs smoothly across multiple ORM frameworks, it is a reliable solution. Try [dotConnect](https://www.devart.com/dotconnect/) for free: it is the simplest way to experience fully optimized database connectivity for EF Core applications.

Conclusion

If your goal is to design efficient database schemas, you need a solid understanding of One-to-One, One-to-Many, and Many-to-Many relationships in EF Core. Properly configuring relationships using conventions, Data Annotations, and the Fluent API leads to stronger data integrity, better performance, and more maintainable code. Entity Developer's visual approach to relationship management simplifies ORM workflows, while dotConnect is the go-to solution for enhancing database connectivity and query execution. By applying EF Core best practices and leveraging advanced ORM tools, developers can build scalable, high-performance applications. Boost your EF Core database connectivity by getting started with [dotConnect](https://www.devart.com/dotconnect/) today.
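As a closing illustration, the join-entity configuration referenced in the Explanation of the pre-EF Core 5.0 approach (composite key via HasKey, relationships via HasOne/WithMany, foreign keys via HasForeignKey) can be sketched as follows. This is a minimal sketch assuming the Student, Course, and StudentCourse entities shown earlier; the parameterless WithMany() calls avoid assuming any particular collection navigation names, since the original configuration code is not shown in the article:

```csharp
using Microsoft.EntityFrameworkCore;

public class SchoolDbContext : DbContext
{
    public DbSet<Student> Students { get; set; }
    public DbSet<Course> Courses { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Composite primary key for the join table
        modelBuilder.Entity<StudentCourse>()
            .HasKey(sc => new { sc.StudentId, sc.CourseId });

        // StudentCourse -> Student side of the relationship
        modelBuilder.Entity<StudentCourse>()
            .HasOne(sc => sc.Student)
            .WithMany()
            .HasForeignKey(sc => sc.StudentId);

        // StudentCourse -> Course side of the relationship
        modelBuilder.Entity<StudentCourse>()
            .HasOne(sc => sc.Course)
            .WithMany()
            .HasForeignKey(sc => sc.CourseId);
    }
}
```

With EF Core 5.0+, this entire block becomes optional: the implicit many-to-many support generates an equivalent join table by convention.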
Tags [.NET Development](https://blog.devart.com/tag/net-development) [dotconnect](https://blog.devart.com/tag/dotconnect) [ef core](https://blog.devart.com/tag/ef-core) [entity developer](https://blog.devart.com/tag/entity-developer) [entity framework](https://blog.devart.com/tag/entity-framework) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [Anna Bilchenko](https://blog.devart.com/author/annabil) Always curious about how data flows and functions, I dive deep into the design and management of databases, from the first table sketch to fine-tuned performance.

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) EF Core Support Improvements in dotConnect for PostgreSQL 7.21 By [dotConnect Team](https://blog.devart.com/author/dotconnect) September 29, 2021

The new version of [Devart dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/) includes significantly improved support for Entity Framework Core. It both supports new data types and extends LINQ query translation capabilities. Additionally, we have improved Entity Framework Core Code-First Migrations support.

WHERE Condition Support for Index in Code-First Migrations

For Entity Framework Core 3 and 5, dotConnect for PostgreSQL now supports specifying a condition for an index. This allows creating [PostgreSQL partial indexes](https://www.postgresql.org/docs/current/indexes-partial.html).
Fluent mapping code:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Table>()
        .HasIndex(t => t.NumericColumn)
        .HasFilter("\"NumericColumn\" < 1000");
}

Code-First Migration code:

migrationBuilder.CreateIndex(
    name: "IX_Table_NumericColumn",
    table: "Table",
    column: "NumericColumn",
    filter: "\"NumericColumn\" < 1000");

SQL DDL:

CREATE INDEX "IX_Table_NumericColumn" ON "Table" ("NumericColumn") WHERE "NumericColumn" < 1000

LINQ to Entities Improvements

dotConnect for PostgreSQL now supports translation of the following LINQ features to SQL for both EF Core 3 and EF Core 5:

The static IsNullOrWhiteSpace() method of the String class
The static Today property and instance DayOfWeek and Ticks properties of the DateTime class
The following static methods of the Math class: Max(), Min(), Sqrt(), Log(), Log10(), Sin(), Cos(), Tan(), Asin(), Acos(), Atan()

For EF Core 5, more LINQ features can be translated to SQL:

The static Parse() method of the System.Net.IPAddress class
The static Parse() method of the System.Net.NetworkInformation.PhysicalAddress class
The following static methods of the MathF class: Abs(), Round(), Truncate(), Floor(), Ceiling(), Max(), Min(), Pow(), Sqrt(), Log(), Log10(), Sin(), Cos(), Tan(), Asin(), Acos(), Atan()

Uri Data Type Mapping

For Entity Framework Core 3 and 5, dotConnect for PostgreSQL now supports mapping the internet/intranet System.Uri type to the PostgreSQL ‘text’ data type.

public class Blog {
    public int Id { get; set; }
    public Uri Url { get; set; }
    public List<Post> Posts { get; set; }
}

CREATE TABLE "Blog" (
    "Id" serial NOT NULL,
    "Url" text NULL,
    PRIMARY KEY ("Id")
)

IPAddress and PhysicalAddress Data Type Mapping

For Entity Framework Core 5, dotConnect for PostgreSQL now supports mapping the network types System.Net.IPAddress and System.Net.NetworkInformation.PhysicalAddress to the PostgreSQL ‘inet’ and ‘macaddr’/’macaddr8’ data types.
public class AccessLog {
    public int Id { get; set; }
    public Uri Url { get; set; }
    public IPAddress IP { get; set; }
    public DateTime Timestamp { get; set; }
}

CREATE TABLE "AccessLog" (
    "Id" serial NOT NULL,
    "Url" text NULL,
    "IP" inet NULL,
    "Timestamp" timestamp NULL,
    PRIMARY KEY ("Id")
)

Dictionary Data Type Mapping

For Entity Framework Core 3 and 5, dotConnect for PostgreSQL now supports mapping the dictionary .NET types to the [HSTORE PostgreSQL data type](https://www.postgresql.org/docs/current/hstore.html). Mapping of the following types is supported:

Dictionary
SortedDictionary
ImmutableDictionary
ImmutableSortedDictionary

The most natural approach is mapping string to string, i.e. Dictionary<string, string>, SortedDictionary<string, string>, ImmutableDictionary<string, string>, or ImmutableSortedDictionary<string, string>, because the PostgreSQL HSTORE data type stores and returns data as a collection of key/value pairs of strings. The key must not be null, but the value can be null.

public class HstoreSample {
    public int Id { get; set; }
    public Dictionary<string, string> Dictionary { get; set; }
    public ImmutableDictionary<string, string> ImmutableDictionary { get; set; }
}

CREATE TABLE "HstoreSample" (
    "Id" serial NOT NULL,
    "Dictionary" hstore NULL,
    "ImmutableDictionary" hstore NULL,
    PRIMARY KEY ("Id")
)

Sometimes using non-string types can be necessary, so we have supported a number of primitive types as generic type arguments. The following .NET types can be used as both key and value: String, Byte, SByte, Int16, Int32, Int64, Single, Double, Decimal, DateTime, DateTimeOffset, Boolean. Any TKey/TValue combination is supported, in Dictionary, SortedDictionary, ImmutableDictionary, or any of the other supported dictionary types. It is not recommended, however, to use floating-point numeric types, like Single and Double, as a key, because of their imprecision. Nullable versions of these data types (int?, DateTime?, etc.) are not supported.
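As a sketch of the non-string mappings described above, an entity might combine several supported key/value types. The entity and property names here are illustrative, not taken from the original post:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entity using non-string HSTORE mappings. Keys and values
// are converted to and from strings by the provider, since HSTORE itself
// stores only text pairs.
public class MetricsSample {
    public int Id { get; set; }

    // int keys, double values - both serialized as text inside HSTORE
    public Dictionary<int, double> Readings { get; set; }

    // DateTime values are supported; DateTime? (nullable) would not be
    public SortedDictionary<string, DateTime> Checkpoints { get; set; }
}
```

Note that per the rules above, a floating-point key type such as Dictionary<double, string> would compile but is discouraged because of floating-point imprecision.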
LINQ queries support the following features for dictionaries:

1) Getting a value by the key. The following LINQ query:

var query = context.HstoreSample.Where(t => t.Dictionary["a"] == "first").ToList();

results in the following SQL:

SELECT * FROM "HstoreSample" WHERE "Dictionary" -> 'a' = 'first'

2) Checking for the presence of a key in the dictionary. LINQ:

var query = context.HstoreSample.Where(t => t.Dictionary.ContainsKey("a")).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE "Dictionary" ? 'a'

3) Checking if a key/value pair is present in the dictionary (only for ImmutableDictionary and ImmutableSortedDictionary). LINQ:

var query = context.HstoreSample.Where(t => t.ImmutableDictionary.Contains("a", "first")).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE "ImmutableDictionary" @> hstore('a', 'first')

4) Getting the number of key/value pairs in the dictionary. LINQ:

var query = context.HstoreSample.Where(t => t.Dictionary.Count <= 3).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE ARRAY_LENGTH(AKEYS("Dictionary"), 1) <= 3

5) Checking if the dictionary is empty (only for ImmutableDictionary and ImmutableSortedDictionary). LINQ:

var query = context.HstoreSample.Where(t => !t.ImmutableDictionary.IsEmpty).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE NOT (ARRAY_LENGTH(AKEYS("ImmutableDictionary"), 1) = 0)

6) Concatenation of HSTORE values (only for ImmutableDictionary and ImmutableSortedDictionary). LINQ:

var query = context.HstoreSample.Where(t => t.ImmutableDictionary.AddRange(t.AnotherDictionary).ContainsKey("a")).ToList();

SQL:

SELECT * FROM "HstoreSample" t WHERE "ImmutableDictionary" || "AnotherDictionary" ? 'a'

7) Conversion of a string to a dictionary: if a string column, parameter, or variable stores a valid HSTORE value, it can be converted to HSTORE, and you can perform the corresponding operations on it.
Such a value can be converted to any of the supported types: Dictionary, SortedDictionary, ImmutableDictionary, or ImmutableSortedDictionary. LINQ:

var query = context.TextTable.Where(t => ((Dictionary<string, string>)(object)t.TextColumn).ContainsKey("a")).ToList();

SQL:

SELECT * FROM "TextTable" WHERE CAST("TextColumn" AS hstore) ? 'a'

8) Conversion of Dictionary and SortedDictionary to ImmutableDictionary and ImmutableSortedDictionary. The following methods are available: .ToImmutableDictionary() and .ToImmutableSortedDictionary(). For example, let’s convert a Dictionary to the ImmutableDictionary type in order to use the .Contains() method. LINQ:

var query = context.HstoreSample.Where(t => t.Dictionary.ToImmutableDictionary().Contains("a", "first")).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE "Dictionary" @> hstore('a', 'first')

The following example converts a Dictionary to an ImmutableSortedDictionary in order to perform a comparison by value with an ImmutableSortedDictionary instance. LINQ:

var dictionary = new Dictionary<string, string>() { { "a", "first" }, { "c", "3rd" }, { "b", "second" } }.ToImmutableSortedDictionary();
var query = context.HstoreSample.Where(t => t.Dictionary.ToImmutableSortedDictionary() == dictionary).ToList();

SQL:

SELECT * FROM "HstoreSample" WHERE "Dictionary" = :p__dictionary_0

Conclusion

We are glad to bring you the updated dotConnect for PostgreSQL with these new features, and we don’t plan to stop there. Support for HSTORE mapping to Dictionary/SortedDictionary/ImmutableDictionary/ImmutableSortedDictionary can be improved further depending on your [feedback](https://www.devart.com/dotconnect/postgresql/feedback.html) and suggestions. We are also going to extend support for mapping more .NET types to get the most from the huge variety of PostgreSQL data types. Code-First Migrations has potential for improvement too.
Besides, supporting Entity Framework Core 6, with its new features and new .NET 6 types like DateOnly and TimeOnly, remains our highest-priority task.

Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [PostgreSQL](https://blog.devart.com/tag/postgresql) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [dotConnect Team](https://blog.devart.com/author/dotconnect) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Embrace Data Protection and Regulatory Requirements With SQL Complete Suggestions By [dbForge Team](https://blog.devart.com/author/dbforge) September 2, 2020

We are pleased to present the latest version of our superior solution for SQL Server database development, management, and administration: SQL Complete v6.6. The update is packed with improvements, and the major one is support for the ADD SENSITIVITY CLASSIFICATION command introduced in SQL Server 2019.

Meet data privacy standards with SQL Complete 6.6

The General Data Protection Regulation enforced by European privacy law obliges companies subject to the GDPR, whether they manage cloud-based or on-premises databases, to ensure that data in their database systems is aptly handled and protected according to GDPR principles. Following on this, SQL Server 2019 introduced the SENSITIVITY CLASSIFICATION feature, bound to enhance database security and establish compliance with GDPR and other privacy regulations.
ADD SENSITIVITY CLASSIFICATION command supported in SQL Complete 6.6

The Devart team strives to keep abreast of the times, so we have added support for the ADD SENSITIVITY CLASSIFICATION command to help our users achieve better data safeguarding and visibility. The SENSITIVITY CLASSIFICATION feature supported in SQL Complete 6.6 gives you the power to:

– make database data compliant with GDPR and other data protection standards
– achieve advanced data security
– control access to tables/columns containing vulnerable data
– monitor and alert on anomalous access to sensitive data

SQL Complete 6.6 allows you to easily classify database columns by prompting sensitivity labels that show the vulnerability of the data in a database column. With the tool’s suggestions, you can quickly and effortlessly tag columns according to their data sensitivity level. Apart from the sensitivity label, a column may have another attribute, Information Type, which provides additional granularity into the type of data stored in the column. Again, quick and comprehensive prompts by SQL Complete 6.6 significantly facilitate data classification. Quick discovery of classified columns in your database can play a pivotal role in database development and maintenance. In the suggestion window, SQL Complete 6.6 marks columns containing personal or confidential information according to GDPR with black or red circles, depending on the sensitivity rank.

CSV export settings

Following our users’ requests, we have added the possibility to configure data export to CSV files. Now you can tailor the data export to CSV options to suit your needs. The most beneficial additions are the ability to select a delimiter to separate data values and to specify the characters that will surround data values.

Query Result Grid accuracy

With the new option we have added to SQL Complete 6.6, it becomes possible to configure result grid accuracy by specifying the number of digits displayed after the decimal separator.
Now you can adjust the display of decimals in the grid to achieve higher accuracy in your calculations.

Seize the benefits of enhanced data discovery and classification with SQL Complete 6.6

Want to try the brand new version? Don’t forget to upgrade your active version of SQL Complete! [Upgrade now](https://www.devart.com/dbforge/sql/sqlcomplete/download.html).

Tags [SENSITIVITY CLASSIFICATION](https://blog.devart.com/tag/sensitivity-classification) [sql complete](https://blog.devart.com/tag/sql-complete) [what's new sql complete](https://blog.devart.com/tag/whats-new-sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge)

[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [What’s New](https://blog.devart.com/category/whats-new) Embrace New Connectivity Opportunities with dbForge Data Compare for PostgreSQL v3.4 By [dbForge Team](https://blog.devart.com/author/dbforge) August 26, 2021

The dbForge team is excited to release the new version of Data Compare for PostgreSQL v3.4 with enhanced connectivity capabilities. dbForge Data Compare for PostgreSQL is a powerful, fast, and easy-to-use tool for comparing and synchronizing PostgreSQL database data. The previous versions of Data Compare allowed connection to PostgreSQL and Amazon Redshift, and with this release, we are extending the list of supported connections.

What’s New

Connection to IBM Cloud using SSL

IBM Cloud, developed by IBM, is a suite of cloud computing services that includes both platform as a service (PaaS) and infrastructure as a service (IaaS). IBM Cloud ranks number 5 among the top cloud providers. dbForge Data Compare for PostgreSQL now allows connection to IBM Cloud, giving users more flexibility in managing their hybrid workloads.

Connection to Amazon Aurora

Amazon Aurora is a modern relational database service, compatible with MySQL and PostgreSQL.
It has been developed for the cloud and is managed by Amazon Relational Database Service (RDS). The popularity of Amazon Aurora is growing rapidly, and today many companies are moving their infrastructure to it. We are happy to announce that dbForge Data Compare for PostgreSQL now supports connection to Amazon Aurora, opening new opportunities for users to do smart business in the cloud. [Download a free 30-day trial](https://www.devart.com/dbforge/postgresql/datacompare/download.html) of Data Compare for PostgreSQL and unlock your business’s potential with connectivity to best-in-class cloud databases.

Tags [Connection to Amazon Aurora](https://blog.devart.com/tag/connection-to-amazon-aurora) [Connection to IBM Cloud](https://blog.devart.com/tag/connection-to-ibm-cloud) [dbForge Data Compare for PostgreSQL](https://blog.devart.com/tag/dbforge-data-compare-for-postgresql) [dbForge Team](https://blog.devart.com/author/dbforge)

[Events](https://blog.devart.com/category/events) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Empower Yourself With the Latest SSMS 19.2 and dbForge SQL Tools! By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) December 8, 2023

On November 13, 2023, Microsoft’s new SQL Server Management Studio (SSMS) 19.2 saw the light of day. You can [download it here](https://learn.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver16) and [read the release notes](https://learn.microsoft.com/en-us/sql/ssms/release-notes-ssms?view=sql-server-ver16) to get a detailed overview of the newly introduced features and enhancements. And if you still haven’t switched to this version, we must say it’s the one you will definitely want if you are dealing with the most modern versions of SQL Server, Azure SQL Managed Instance, or, say, Azure SQL Database.
You can install and run SSMS 19.2 alongside earlier versions (e.g., SSMS 18.x, SSMS 17.x, and SSMS 16.x), which can in turn be used if your SQL Server version happens to be older than 2014. Also note that by default, SSMS 19.2 is installed together with Azure Data Studio 1.47.0, and the installer doesn’t ask whether you need it or not. Yet, good and fast as it is, SSMS 19.2 may still not be an exhaustive solution for the power users of SQL Server. You can always enhance it even further with [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/), a collection of add-ins and standalone applications that level up the strengths of SSMS and fill in nearly all of the gaps you may think of. Here are the included tools: [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is the flagship of the bundle, a high-end add-in designed to enhance your SQL coding and make it error-free with context-aware SQL code completion, smart object suggestions, syntax validation, formatting, refactoring, and debugging. The punch it packs is strong enough to make your routine coding twice as fast and effective. [dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) is an add-in that helps you version-control database schemas and static table data, manage and track changes, view and resolve conflicts, and maintain the integrity of your databases. It is compatible with Git (GitHub, GitLab, and Bitbucket), Azure DevOps Server, Apache Subversion, TFVC, Mercurial, Perforce, and SourceGear Vault. [dbForge Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) is a tool that helps compare SQL Server database schemas, analyze differences, and synchronize them. It works with live databases, snapshots, script folders, backups, and remote repositories.
[dbForge Data Compare](https://www.devart.com/dbforge/sql/datacompare/) is a similar tool that helps detect and analyze table data discrepancies in live databases, backups, and script folders. It synchronizes databases and recovers damaged or missing data with just a few clicks. [dbForge Data Pump](https://www.devart.com/dbforge/sql/data-pump/) is an add-in for data import and export that supports 14 most essential data formats. It populates databases with external source data, streamlines data migration, and automates recurring scenarios via templates. [dbForge Query Builder](https://www.devart.com/dbforge/sql/querybuilder/) helps create, execute, and optimize SQL queries and statements visually, on easily understandable diagrams. A built-in SQL editor offers context-sensitive autocompletion and smart formatting. [dbForge Unit Test](https://www.devart.com/dbforge/sql/unit-test/) is an add-in that allows writing unit tests in regular T-SQL and running multiple tests at once directly from SSMS. It helps develop stable and reliable code that can be properly regression-tested at the unit level. [dbForge Data Generator](https://www.devart.com/dbforge/sql/data-generator/) contains 200+ smart generators of realistic, consistent test data with flexible configuration and preserved inter-column data dependencies. Data population can be performed in a single click. [dbForge Documenter](https://www.devart.com/dbforge/sql/documenter/) generates comprehensive searchable documentation in a matter of minutes, fully eliminating time-consuming manual work. Extensive customization helps tailor the process to any schedule and requirements. [dbForge Index Manager](https://www.devart.com/dbforge/sql/index-manager/) is an add-in that helps fix SQL index fragmentation effortlessly. It collects fragmentation statistics and detects databases that require maintenance within seconds. Indexes can be rebuilt and reorganized visually. 
[dbForge Search](https://www.devart.com/dbforge/sql/search/) is a free add-in that enables easy search for SQL objects, text, and data across multiple databases. It eliminates the need to browse the SSMS Object Explorer to find a required column name or a piece of text in a stored procedure. Extensive filtering options include wildcards to substitute any characters in the search string. [dbForge Monitor](https://www.devart.com/dbforge/sql/monitor/) is a free add-in that tracks server status and performance in real time. With a convenient analytical dashboard at hand, it is easy to pinpoint the origin of any bottleneck and proactively address it. [dbForge Event Profiler](https://www.devart.com/dbforge/sql/event-profiler/) is a free tool that helps capture and analyze any specified SQL Server events, which in turn helps to investigate and troubleshoot server load and stability issues. [dbForge SQL Decryptor](https://www.devart.com/dbforge/sql/sqldecryptor/) is a free tool for decrypting encrypted stored procedures, functions, triggers, and views simply by pointing and clicking. It also helps edit database objects and view scripts with syntax highlighting. [dbForge DevOps Automation](https://www.devart.com/dbforge/sql/database-devops/) is an integrated solution that unites dbForge SQL Tools into a consistent DevOps cycle. The entire solution simplifies the workflow, levels up routine coding and testing, and facilitates fast and safe CI/CD. Download SSMS 19.2 and enhance it with dbForge SQL Tools today! Without a doubt, SSMS 19.2 is an important release, if not revolutionary, and it’s a perfect IDE that will help you catch up with the latest versions of SQL Server and those to come in the near future.
The recipe for success is simple: [download SSMS 19.2](https://learn.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver16) and install it, then [download dbForge SQL Tools for a free trial](https://www.devart.com/dbforge/sql/sql-tools/download.html), install them, and give this killer combo a go for an entire month of free use. Take this time to explore their capabilities in full, and we bet you won’t be disappointed. Tags [dbForge SQL Tools](https://blog.devart.com/tag/dbforge-sql-tools) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [ssms](https://blog.devart.com/tag/ssms) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. "} {"url": "https://blog.devart.com/enabling-auto_close-is-a-bad-idea.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Enabling AUTO_CLOSE Is a Bad Idea? By [Sergey Syrovatchenko](https://blog.devart.com/author/sergeys) February 12, 2016 [0](https://blog.devart.com/enabling-auto_close-is-a-bad-idea.html#respond) 5064 From a personal perspective, allowing a production database to run with the AUTO_CLOSE option is not the best practice. Let me explain why you should not enable AUTO_CLOSE and the consequences of using this option. The other day, I had to look into the Error Log on a test server.
After a two-minute timeout, I saw a great number of messages stored in the log, and I decided to check the log size using xp_enumerrorlogs:

```sql
DECLARE @t TABLE (log_id INT PRIMARY KEY, last_log SMALLDATETIME, size INT)
INSERT INTO @t
EXEC sys.xp_enumerrorlogs

SELECT log_id, last_log, size_mb = size / 1048576.
FROM @t
```

```
log_id   last_log              size_mb
-------- --------------------- ---------------
0        2016-01-05 08:46:00   567.05288505
1        2015-12-31 12:53:00   1370.39249420
2        2015-12-18 11:32:00   768.46394729
3        2015-12-02 13:54:00   220.20050621
4        2015-12-02 13:16:00   24.04152870
5        2015-11-16 13:37:00   80.07946205
6        2015-10-22 12:13:00   109.33527946
```

As usual, on test servers, I don’t bother with the Error Log size, because each start of SQL Server triggers a cyclic rotation of the log files: the current errorlog is renamed to errorlog.1, an empty errorlog file is created, and the oldest one, errorlog.6, is deleted. When I need to clean the logs, sp_cycle_errorlog can be helpful. But before cleaning the logs, it occurred to me to see what interesting stuff is recorded there. I read the current log with the stored procedure xp_readerrorlog:

```sql
EXEC sys.xp_readerrorlog
```

And then I saw dozens of messages of this type: Starting up database '...'. On the one hand, there is nothing wrong with that. At each start, SQL Server opens data files and checks the boot page:

```
Starting up database '...'.
CHECKDB for database '...' finished without errors on ... (local time).
```

But after I filtered by the message of interest, the results made me curious:

```sql
DECLARE @t TABLE (log_date SMALLDATETIME, spid VARCHAR(50), msg NVARCHAR(4000))
INSERT INTO @t
EXEC sys.xp_readerrorlog 0, 1, N'Starting up database'

SELECT msg, COUNT_BIG(1)
FROM @t
GROUP BY msg
HAVING COUNT_BIG(1) > 1
ORDER BY 2 DESC
```

```
------------------------------------------------------ --------------------
Starting up database 'AUTOTEST_DESCRIBER'.             127723
Starting up database 'MANUAL_DESCRIBER'.               12913
Starting up database 'AdventureWorks2012'.             12901
```

A great number of such messages may result from the AUTO_CLOSE option set to ON. According to the online documentation, when you turn on the AUTO_CLOSE option, the database is shut down automatically and frees all its resources after the last user logs off. When a new connection is requested, the database automatically reopens… and so on ad infinitum. Some time ago, I read that in earlier versions of SQL Server, AUTO_CLOSE was a synchronous process, which could cause long delays at repeated opening and closing of database files. Starting with SQL Server 2005, the AUTO_CLOSE process became asynchronous, so that problem is partially gone now. But many issues with AUTO_CLOSE still remain. To optimize performance, SQL Server changes pages in the buffer cache and does not write these pages to disk after each modification. Instead, SQL Server creates a checkpoint, at which it writes the pages modified in memory, along with transaction log information, from memory to disk. When a database is shut down, a CHECKPOINT is automatically executed. Accordingly, the disk load may greatly increase with repeated database shutdowns. Moreover, each database shutdown flushes the procedure cache, so when the database reopens, the execution plans will have to be generated all over again. But what is even worse, the shutdown also flushes the buffer cache, which increases disk load for subsequent queries. What does Microsoft think about AUTO_CLOSE? “When AUTO_CLOSE is set ON, this option can cause performance degradation on frequently accessed databases because of the increased overhead of opening and closing the database after each connection. AUTO_CLOSE also flushes the procedure cache after each connection.” However, there are a couple of nuances.
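As a side note, the grouping that the xp_readerrorlog query above performs can also be reproduced offline on a saved copy of the errorlog. A minimal Python sketch (the log lines below are synthetic, for illustration only):

```python
import re
from collections import Counter

def count_startup_messages(log_lines):
    """Count 'Starting up database' messages per database, mirroring the T-SQL GROUP BY."""
    pattern = re.compile(r"Starting up database '([^']+)'")
    counts = Counter()
    for line in log_lines:
        match = pattern.search(line)
        if match:
            counts[match.group(1)] += 1
    # Most frequently reopened databases first, like ORDER BY 2 DESC
    return counts.most_common()

# Synthetic sample lines:
lines = [
    "2016-01-05 08:46:00 spid7s Starting up database 'AUTOTEST_DESCRIBER'.",
    "2016-01-05 08:46:01 spid7s Starting up database 'AUTOTEST_DESCRIBER'.",
    "2016-01-05 08:46:02 spid7s Starting up database 'master'.",
]
print(count_startup_messages(lines))  # [('AUTOTEST_DESCRIBER', 2), ('master', 1)]
```

A database that dominates such a ranking is the first candidate to check for AUTO_CLOSE.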
In SQL Server 2000 or any Express edition, when you create a new database, the AUTO_CLOSE option will be enabled by default:

```sql
USE [master]
GO

IF DB_ID('test') IS NOT NULL
    DROP DATABASE [test]
GO

CREATE DATABASE [test]
GO

SELECT is_auto_close_on
FROM sys.databases
WHERE database_id = DB_ID('test')
```

```
is_auto_close_on
----------------
1
```

However, if you look at the upside, such SQL Server Express behavior is easy to explain, because this edition sets a limit on RAM usage – a maximum of 1 GB. But if you later need to deploy a database using a script, it is better to be on the safe side and explicitly disable AUTO_CLOSE:

```sql
ALTER DATABASE [test] SET AUTO_CLOSE OFF
```

In the course of work, I’ve noticed one other interesting thing – when calling certain system functions or views, all databases with the AUTO_CLOSE option enabled will open:

```sql
USE [master]
GO

IF DB_ID('p1') IS NOT NULL
    DROP DATABASE [p1]
GO
CREATE DATABASE [p1]
GO
ALTER DATABASE [p1] SET AUTO_CLOSE ON
GO

IF DB_ID('p2') IS NOT NULL
    DROP DATABASE [p2]
GO
CREATE DATABASE [p2]
GO
ALTER DATABASE [p2] SET AUTO_CLOSE ON
GO

EXEC sys.xp_readerrorlog 0, 1, N'Starting up database ''p'
GO
```

```
LogDate                 ProcessInfo  Text
----------------------- ------------ ----------------------------------
2016-01-25 17:36:40.310 spid53       Starting up database 'p1'.
2016-01-25 17:36:41.980 spid53       Starting up database 'p2'.
```

We call p1:

```sql
WAITFOR DELAY '00:03'
GO
SELECT DB_ID('p1')
GO
EXEC sys.xp_readerrorlog 0, 1, N'Starting up database ''p'
```

But p2 “wakes up” as well:

```
LogDate                 ProcessInfo  Text
----------------------- ------------ ----------------------------------
2016-01-25 17:36:40.310 spid53       Starting up database 'p1'.
2016-01-25 17:36:41.980 spid53       Starting up database 'p2'.
2016-01-25 17:39:17.440 spid52       Starting up database 'p1'.
2016-01-25 17:39:17.550 spid52       Starting up database 'p2'.
```

And finally we get to the main point.
On a server, different users actively accessed metadata, making databases with AUTO_CLOSE enabled open, which, in turn, caused the Error Log growth. Preventive measures, by the way, are very simple:

```sql
DECLARE @SQL NVARCHAR(MAX)

SELECT @SQL = (
    SELECT '
ALTER DATABASE ' + QUOTENAME(name) + ' SET AUTO_CLOSE OFF WITH NO_WAIT;'
    FROM sys.databases
    WHERE is_auto_close_on = 1
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)')

EXEC sys.sp_executesql @SQL
```

All tests were implemented on Microsoft SQL Server 2012 (SP3) (KB3072779) – 11.0.6020.0 (X64). Conclusion It may seem logical to close a database that isn’t in use to release resources and improve performance. But, in fact, it harms your database far more than it helps. So, unless you are absolutely sure that this feature is essential for you, the best practice is to leave the AUTO_CLOSE setting OFF. Tags [SQL Server](https://blog.devart.com/tag/sql-server) [Sergey Syrovatchenko](https://blog.devart.com/author/sergeys) "} {"url": "https://blog.devart.com/enhanced-entity-framework-spatials-support-for-oracle-mysql-and-postgresql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Enhanced Entity Framework Spatials support for Oracle, MySQL and PostgreSQL By [dotConnect Team](https://blog.devart.com/author/dotconnect) July 18, 2013 [1](https://blog.devart.com/enhanced-entity-framework-spatials-support-for-oracle-mysql-and-postgresql.html#comments) 6328 Entity Framework v5 and v6 support spatial data types. They are represented as the two DbGeometry and DbGeography data types from System.Data.Entity.dll in .NET Framework 4.5 (Entity Framework 5) or from EntityFramework.dll (Entity Framework 6). We have already supported Entity Framework Spatials for the Oracle database (see [Using Entity Framework Spatials with Oracle Spatial and SharpMap](https://blog.devart.com/using-entity-framework-spatials-with-oracle-spatial-and-sharpmap.html)). In the new versions of our providers, we have improved Entity Framework Spatials support for Oracle and added spatials support for MySQL and PostgreSQL. Note for Users Who Upgrade dotConnect Providers from Earlier Versions Current Entity Framework Spatials mapping is implemented in dotConnect for Oracle v 8.3, dotConnect for MySQL v 8.3, and dotConnect for PostgreSQL v 7.3.
If you had an earlier version of any of these providers installed and upgraded to these or later versions, you need to reset the type mapping rules for the provider in Entity Developer in order to have the new mapping rules for spatial types added. To do it: From the Entity Developer submenu of the Visual Studio Tools menu, select Options. In the Options dialog box, expand the Entity Developer -> Servers Options node and select the Oracle, MySQL, or PostgreSQL options page, depending on the provider you have installed. On the selected options page, click the Reset button. The following type mapping rules will be added: Oracle: SDO_GEOMETRY (Server Type) -> Data.Spatial.DbGeometry (.NET Type). MySQL: curve, geometry, geometrycollection, linestring, multicurve, multilinestring, multipoint, multipolygon, multisurface, point, polygon, and surface (Server Types) -> Data.Spatial.DbGeometry (.NET Type). PostgreSQL: geography (Server Type) -> Data.Spatial.DbGeography (.NET Type); geometry (Server Type) -> Data.Spatial.DbGeometry (.NET Type). Note that when resetting type mapping rules, any customizations you have made earlier will be lost. If you don’t want to reset the type mapping rules, you may need to add the rules listed above manually to work with spatial types via Entity Framework. Common Functionality Spatial services and GIS libraries A set of spatial-specific configuration settings was implemented.
All these settings are optional except for the spatial service, which is used for reading spatial objects from the database, saving them to the database, and run-time support for the functionality of the DbGeometry/DbGeography classes. The following spatial services were implemented: NetTopologySuite spatial service; Well-Known Text (WKT) spatial service; Extended Well-Known Text (EWKT) spatial service; Well-Known Binary (WKB) spatial service; OracleObject spatial service (Devart dotConnect for Oracle only); old SharpMap v0.9 spatial service (Devart dotConnect for Oracle only, outdated and not recommended for use). The NetTopologySuite spatial service provides the richest functionality out of the box. Other spatial services can be used together with user or third-party GIS libraries if you implement the interaction between them. SharpMap The new version of the geospatial mapping library, SharpMap 1.1, is now supported. The new [SharpMap](http://sharpmap.codeplex.com/) release uses geometry types from NetTopologySuite instead of the old SharpMap-specific implementation from SharpMap v0.9, so we added support for the NetTopologySuite GIS library as well. Migration from SharpMap v0.9 to NetTopologySuite significantly extends the supported functionality of the DbGeometry/DbGeography classes. [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) retains support for the old geometry types of the SharpMap GIS library for compatibility, but this support is deprecated and not recommended for use in new projects. [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) and [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) don’t support SharpMap v0.9 at all; they only support the NetTopologySuite GIS library. NetTopologySuite Support for the [NetTopologySuite](http://code.google.com/p/nettopologysuite/) geospatial library is added to dotConnect for Oracle, MySQL, and PostgreSQL. Currently, version 1.13.2 is supported.
You need to deploy the following assemblies with the application in order to use NetTopologySuite together with SharpMap 1.1: SharpMap.dll, NetTopologySuite.dll, GeoAPI.dll, and PowerCollections.dll. You may enable the use of NetTopologySuite either in the application config file or directly in code (both approaches apply to dotConnect for Oracle with Entity Framework 6).
Or you may enable using NetTopologySuite in the application code:

```csharp
var config = OracleEntityProviderConfig.Instance;
config.SpatialOptions.SpatialServiceType = SpatialServiceType.NetTopologySuite;
```

NetTopologySuite functionality support is implemented in the following assemblies: Devart.Data.Oracle.Entity.Spatials.dll (dotConnect for Oracle), Devart.Data.MySql.Entity.Spatials.dll (dotConnect for MySQL), and Devart.Data.PostgreSql.Entity.Spatials.dll (dotConnect for PostgreSQL). You need to add a reference to the corresponding assembly to your project in order to use NetTopologySuite functionality in it. Increased Performance of Reading EWKT Representation of Spatial Data The new versions of our providers implement new Entity Framework provider options: config.SpatialOptions.AlwaysUseGeometryDefaultSrid – forces the provider to always use the config.SpatialOptions.GeometryDefaultSrid value instead of reading the actual SRID for the DbGeometry spatial from the database. The default value is false. config.SpatialOptions.AlwaysUseGeographyDefaultSrid – forces the provider to always use the config.SpatialOptions.GeographyDefaultSrid value instead of reading the actual SRID for the DbGeography spatial from the database. The default value is false. Enabling these options may be useful for the EWKT, NetTopologySuite, and SharpMap spatial services in case all the database objects have the same SRID value. The performance gain is most significant for Oracle and MySQL, as they don’t have built-in functions for retrieving EWKT values, and complex SQL statements are generated for retrieving these values by default. It also reduces traffic a bit for all databases, which is useful when materializing a large number of spatial objects. Database-Specific Functionality Oracle The support for some database-specific Oracle Spatial functions in Entity Framework is added.
SDO_FILTER, SDO_NN, SDO_RELATE, and SDO_WITHIN_DISTANCE. You can find information on these functions in the [Spatial Operators section of Oracle Spatial Developer’s Guide](http://docs.oracle.com/cd/E11882_01/appdev.112/e11830/sdo_operat.htm). These functions are supported in LINQ to Entities (via methods of the new Devart.Data.Oracle.Entity.OracleSpatialFunctions class) and in EntitySQL. For example, to create a LINQ to Entities query similar to the following SQL statement:

```sql
SELECT *
  FROM SPATIAL_TABLE s
 WHERE SDO_RELATE(s.SPATIAL_COLUMN, :spatial_parameter, 'mask=touch+coveredby') = 'TRUE'
```

you may use the following code:

```csharp
var spatialValue = DbGeometry.FromText("...");

var query = context.SpatialTable
    .Where(c => OracleSpatialFunctions.SdoRelate(c.SpatialColumn, spatialValue, "mask=touch+coveredby") == OracleSpatialFunctions.True)
    .ToList();
```

The following SQL statement is actually generated for this LINQ to Entities query:

```sql
SELECT
Extent1.ID,
(CASE WHEN Extent1.SPATIAL_COLUMN IS NULL THEN NULL ELSE 'SRID=' || NVL(Extent1.SPATIAL_COLUMN.SDO_SRID, '0') || ';' || SDO_UTIL.TO_WKTGEOMETRY(Extent1.SPATIAL_COLUMN) END) AS "SpatialColumn",
Extent1.NAME
FROM SPATIAL_TABLE Extent1
WHERE ((SDO_RELATE(Extent1.SPATIAL_COLUMN, SDO_GEOMETRY(:p__linq__0, :p__linq__0_srid), 'mask=touch+coveredby')) = 'TRUE')
```

MySQL MySQL provides only [basic spatial functionality](http://dev.mysql.com/doc/refman/5.6/en/spatial-extensions.html); however, it can be sufficient for some tasks. dotConnect for MySQL provides the best possible support for existing MySQL spatial features and additionally implements the calculation of distances on the surface of a spheroid, which is not supported in MySQL out-of-the-box.
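To give a feel for what "distance on the surface of a spheroid" involves, here is a great-circle (haversine) approximation in Python. It is a simplified sphere-based sketch for illustration only, not the provider's actual spheroid algorithm:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; a true spheroid model uses two axes

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km:
d = haversine_m(50.0, 30.0, 51.0, 30.0)
print(round(d / 1000, 1))  # 111.2
```

Dividing the meter result by the appropriate factor yields the other units the provider exposes (kilometers, miles, and so on).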
MySQL Database-specific Functions The following database-specific MBR-based MySQL functions are supported in Entity Framework: MBRContains, MBRDisjoint, MBREqual, MBRIntersects, MBROverlaps, MBRTouches, and MBRWithin. You can find information on MySQL MBR-based (Minimum Bounding Rectangle) spatial functions in the [Functions for Testing Spatial Relations Between Geometric Objects section of the MySQL 5.6 Reference Manual](http://dev.mysql.com/doc/refman/5.6/en/functions-for-testing-spatial-relations-between-geometric-objects.html). These functions are supported in LINQ to Entities (via methods of the new Devart.Data.MySql.Entity.MySqlSpatialFunctions class) and in EntitySQL. When using DbGeometry/DbGeography methods in LINQ to Entities, shape-based spatial functions are used for MySQL 5.6 and higher by default. If you want to use shape-based spatial functions for some cases and MBR-based spatial functions for others, use DbGeometry/DbGeography methods where shape-based spatial functions must be used, and MySqlSpatialFunctions methods where MBR-based spatial functions must be used. If you want to use only MBR-based spatial functions, you may configure the DbGeometry/DbGeography behaviour so that it always generates MBR-specific SQL by setting config.SpatialOptions.UseObjectShapeBasedFunctions to false. Calculating Distances for DbGeography Devart dotConnect for MySQL implements correct calculation of distances on the surface of a spheroid (i.e., on the Earth’s surface) for the Distance method of the DbGeography class. This behaviour can be customized with the config.SpatialOptions.GeographyDistanceUnit Entity Framework provider configuration option. With this option, you can specify the distance unit to return the result in.
The following units are available: Meter (default unit), Kilometer, Mile, Nautical mile, Yard, and Foot. PostgreSQL [PostGIS](http://postgis.net/) provides very rich spatial functionality, and we have done our best to provide support for all the main spatial features for Entity Framework. PostGIS version 2.0 (or higher) is required for working with Entity Framework Spatials. You can check the version by executing “select postgis_version()” in the database. PostgreSQL Database-specific Functions The following database-specific PostgreSQL functions are supported in Entity Framework: ST_AsGML, ST_AsLatLonText, ST_AsKML, ST_AsSVG, ST_AsX3D, ST_Affine, and ST_HausdorffDistance. You can find information on these PostGIS functions in the [PostGIS Special Functions Index](http://postgis.net/docs/manual-2.0/PostGIS_Special_Functions_Index.html). Calculating Distances and Areas for DbGeography The behaviour of the Distance, Length, and Area methods of the DbGeography class can be customized with the config.SpatialOptions.GeographyDistanceUnit and config.SpatialOptions.GeographyAreaUnit Entity Framework provider configuration options. With these options, you can specify the distance and area units to return the result in. The following distance units are available for config.SpatialOptions.GeographyDistanceUnit: Meter (default unit), Kilometer, Mile, Nautical mile, Yard, and Foot. The following area units are available for config.SpatialOptions.GeographyAreaUnit: Square meter (default unit), Square kilometer, Square mile, Square yard, Square foot, Acre, and Hectare. You may also enable the simple mode of distance and area calculations by setting the config.SpatialOptions.UseGeographySpheroidMeasurement option to False. This mode is faster but less precise. Demo We have prepared an updated demo project, based on the demo project from the previous article [Using Entity Framework Spatials with Oracle Spatial and SharpMap](https://blog.devart.com/using-entity-framework-spatials-with-oracle-spatial-and-sharpmap.html).
The updated sample works with Oracle, MySQL, and PostgreSQL. Demo project changes: Entity Framework 6 is used (the previous demo project used Entity Framework 5); SharpMap 1.1 Final and NetTopologySuite are used (previously, SharpMap v0.9 was used); the demo project can be opened with Visual Studio 2010 (the previous one only worked in Visual Studio 2012); the target framework is now .NET Framework 4.0 (previously, .NET Framework 4.5). For MySQL, the demo project does not calculate region areas on the surface of a spheroid correctly. For PostgreSQL, all demo project features work correctly. [Download DevartSharpMapDemo sources](https://blog.devart.com/wp-content/uploads/2013/07/DevartSharpMapDemo.EF6_.zip) Conclusion We are glad to provide Entity Framework support improvements in [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/), and [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) to our users. As for our future plans, further development of the spatial functionality of our Entity Framework providers will depend on the feedback of our users. Update Please note that spatial functionality has changed in Entity Framework 6.3 and later. For more information about these updates, please see our newer article: [https://blog.devart.com/entity-framework-6-3-and-net-core-3-support.html](https://blog.devart.com/entity-framework-6-3-and-net-core-3-support.html). Tags [entity framework](https://blog.devart.com/tag/entity-framework) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers.
They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. 1 COMMENT Seth @ Firebox Training August 27, 2013 At 12:15 am That was very informative. Keep up the good work.
Looking forward to more informative posts from your side. Comments are closed."} {"url": "https://blog.devart.com/enhancing-marketing-analytics-a-comprehensive-guide-to-connecting-hubspot-and-power-bi.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) Enhancing Marketing Analytics: A Comprehensive Guide to Connecting HubSpot and Power BI By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) August 23, 2024 [0](https://blog.devart.com/enhancing-marketing-analytics-a-comprehensive-guide-to-connecting-hubspot-and-power-bi.html#respond) 863 Understanding the nuances of marketing data is no longer a luxury but a strategic necessity. This is where the fusion of two powerful tools, HubSpot and Power BI, takes center stage, offering an integration that transforms raw data into meaningful intelligence. Marketing analytics is the compass guiding businesses toward well-informed decisions, and the synergy between HubSpot, a leading CRM platform, and Power BI, Microsoft’s robust business intelligence tool, is rewriting the rules of this game. This comprehensive guide explores the intricacies of connecting HubSpot and Power BI, focusing on leveraging Devart ODBC drivers to enhance the integration process. Join us on this journey as we unravel the potential behind merging the strengths of HubSpot and Power BI, helping marketers and analysts streamline their workflows and extract deeper insights. Let’s dive into enhanced marketing analytics and discover how this integration can reshape the way you understand and use your data.
Table of Contents Understanding the Need for Integration Getting Started: Setting the Foundation Connect HubSpot to Power BI: A Step-by-Step Guide Launch Power BI Get Data Choose Devart ODBC Driver Configure Connection Select Data Tables Load Data Transform and Visualize Optional: Schedule Data Refresh Key Features of HubSpot and Power BI Integration Optimizing Data Flow: Streamlining Your Analytics Pipeline Security and Compliance: Safeguarding Your Marketing Data Take the Next Step: Download Devart ODBC Drivers Understanding the Need for Integration The challenge often lies not in the scarcity of data but in its fragmentation across various platforms. Siloed data, scattered across different systems and tools, creates a bottleneck for marketers seeking a comprehensive view of their customer journey. The need for integration arises from the desire to break down these data silos and create a unified, coherent narrative. HubSpot, a robust CRM platform designed to streamline marketing efforts, holds a wealth of customer-centric data—from lead interactions to campaign performance. On the other hand, Power BI, armed with potent analytical capabilities, can transform this data into visualizations that tell a compelling story. By integrating HubSpot with Power BI, businesses bridge the gap between their marketing data and actionable insights. The synergy allows for a holistic understanding of customer behavior, enabling marketers to craft targeted strategies, identify trends, and measure the impact of their campaigns with unparalleled precision. Moreover, the integration eliminates the tedious manual processes involved in data aggregation. Instead of spending valuable time collating information from disparate sources, marketers can redirect their efforts toward interpreting the insights generated by the integration of HubSpot with Power BI.
Getting Started: Setting the Foundation Embarking on the journey of integrating HubSpot and Power BI requires a solid foundation, and at the heart of this foundation lies the deployment of ODBC drivers. These drivers serve as the conduit, seamlessly connecting the data-rich environment of HubSpot with the analytical prowess of Power BI. Installation and Configuration The first step is to install [Devart ODBC Drivers](https://www.devart.com/odbc/) . Navigate to the official Devart website and follow the installation instructions. This straightforward process lays the groundwork for the subsequent integration. Once installed, the drivers need to be configured. This involves specifying connection details such as server addresses, authentication credentials, and other parameters. This configuration step is pivotal, as it establishes the communication link between HubSpot and Power BI. Compatibility and Versatility Devart ODBC Drivers are designed with compatibility and versatility in mind. Whether you’re working with HubSpot’s diverse data sets or tapping into the analytical capabilities of Power BI, these drivers act as the universal translator, ensuring seamless communication between the two platforms. The compatibility extends beyond the surface, catering to the evolving needs of both HubSpot and Power BI users. Keeping the drivers up-to-date ensures your integration remains robust, leveraging the latest features and optimizations. Security Measures Security is a necessity if you’re dealing with sensitive data. The ODBC Drivers prioritize data protection by implementing robust security measures. Encryption protocols and secure authentication mechanisms guarantee your data traverses the integration pipeline securely. Navigating through the installation and configuration of ODBC Drivers might seem like a technical labyrinth, but fear not – we’ll guide you through each step. 
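The configuration step described above boils down to supplying a set of key=value pairs. As a minimal sketch (plain Python, outside of any BI tool), a DSN-less ODBC connection string is just those pairs joined with semicolons. The driver name and parameter keys below are illustrative placeholders, not necessarily the exact names the Devart driver expects; check the driver documentation for the real ones.

```python
# Minimal sketch: assembling a DSN-less ODBC connection string.
# The driver name and parameter keys are illustrative placeholders only.
def build_connection_string(params: dict) -> str:
    """Join key=value pairs into the semicolon-separated ODBC format."""
    return ";".join(f"{key}={value}" for key, value in params.items())

conn_str = build_connection_string({
    "DRIVER": "{Devart ODBC Driver for HubSpot}",  # assumed driver name
    "Domain": "your-account.hubspot.com",          # placeholder account
    "Authentication": "OAuth2",                    # placeholder method
})
print(conn_str)
```

The resulting string is what you would hand to any ODBC-aware client once the real parameter names are filled in.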
As we progress in this comprehensive guide, you’ll gain a hands-on understanding of how these drivers form the backbone of a seamless integration of HubSpot and Power BI. Stay tuned for the next segment, where we’ll delve into the subtleties of connecting HubSpot to Power BI. Connect HubSpot to Power BI: A Step-by-Step Guide Now that we’ve laid the groundwork with the installation and configuration of the [ODBC Driver for HubSpot](https://www.devart.com/odbc/hubspot/integrations/hubspot-powerbi-connection.html) , it’s time to embark on the pivotal stage of connecting HubSpot to Power BI. This step-by-step guide will navigate you through the integration process, ensuring a seamless flow of data and insights. Launch Power BI Begin by opening your Power BI Desktop application. If you don’t have it installed, download and install it from the official Power BI website. Get Data In Power BI, navigate to the Home tab and select Get Data. From the extensive list of data sources, locate and select ODBC. Choose Devart ODBC Driver A dialog will prompt you to select an ODBC data source. Here, select the Devart ODBC Driver that corresponds to your HubSpot configuration. Configure Connection Enter the necessary connection details, including the server address, database name, and authentication credentials. This information ensures that Power BI can establish a secure and reliable connection with your HubSpot CRM. Select Data Tables Once the connection is established, you’ll be presented with a list of available tables in your HubSpot database. We recommend conducting a test connection from the tool to make sure your data comes through correctly. Choose the specific tables that contain the data you want to analyze in Power BI. Load Data After selecting the relevant tables, click Load to initiate the data import process. Power BI will retrieve the chosen data from HubSpot, readying it for analysis. 
Transform and Visualize With the data loaded, Power BI offers a suite of transformation and visualization tools. Shape your data to suit your analytical needs, creating meaningful charts, graphs, and dashboards. Optional: Schedule Data Refresh To keep your analytics up-to-date, consider scheduling data refreshes. This ensures the latest information from HubSpot is consistently available in your Power BI reports. Congratulations! You’ve successfully connected HubSpot to Power BI using ODBC drivers. This integration opens the door to a wealth of analytical possibilities, allowing you to extract actionable insights from your HubSpot CRM data. In the next section, we’ll delve into the key features making this HubSpot and Power BI integration a game-changer for marketers and analysts alike. Key Features of HubSpot and Power BI Integration So, you’ve seamlessly connected HubSpot to Power BI. Now, let’s explore the key features of this powerful combination. Real-Time Data Updates The integration ensures your Power BI reports reflect the latest information from HubSpot in real time. Whether tracking campaign performance, lead interactions, or customer behavior, you can make decisions based on the most up-to-date data. Unified Customer View HubSpot, as a CRM platform, houses a plethora of customer-centric data. By integrating it with Power BI, you create a unified customer view that consolidates information from various touchpoints. This comprehensive view empowers your team to understand customer journeys. Customizable Dashboards Power BI’s robust visualization capabilities allow you to create customizable dashboards tailored to your specific needs. From monitoring lead generation metrics to visualizing sales performance, you have the flexibility to design dashboards that align with your business objectives. Advanced Analytics and AI Integration Power BI brings advanced analytics and AI capabilities to the forefront. 
Leverage machine learning algorithms and predictive analytics to uncover hidden patterns, identify trends, and make data-driven predictions to shape your marketing strategy. Campaign Performance Tracking Dive deep into the performance of your marketing campaigns. With HubSpot data seamlessly integrated into Power BI, you can track key metrics such as click-through rates, conversion rates, and ROI with precision. Data Governance and Security Devart ODBC Drivers, serving as the linchpin of this integration, prioritize data governance and security. Ensure compliance with regulations and industry standards, safeguarding sensitive marketing data throughout the integration pipeline. Scalability and Flexibility As your business grows, so does the volume of data. The HubSpot and Power BI integration is designed with scalability in mind. Adapt to evolving data needs and business requirements, ensuring that your analytics infrastructure remains robust and flexible. With all these features, the integration of HubSpot and Power BI transcends traditional boundaries, providing a dynamic platform for marketers and analysts. As we move forward, we’ll explore how to optimize data flow, delve into advanced analytics, and showcase real-world examples of successful implementations. Optimizing Data Flow: Streamlining Your Analytics Pipeline With the HubSpot and Power BI integration in place, the next critical step is optimizing the flow of data between these two powerhouses. An efficient data flow ensures your analytics remain dynamic, responsive, and aligned with the pace of your business. Here’s how to fine-tune and optimize the data flow. Define Data Synchronization Frequency Tailor the data synchronization frequency to match the tempo of your business operations. Consider the nature of your marketing activities—some businesses may benefit from real-time updates, while others may find periodic refreshes more suitable. Strike a balance that aligns with your analytics goals. 
Automate Data Refresh Power BI offers the option to automate data refreshes, ensuring your reports and dashboards are always up-to-date. Set up a schedule that aligns with your HubSpot data update patterns. This automation minimizes manual intervention, allowing your team to focus on extracting insights rather than data management. Achieve Optimal Query Performance Fine-tune your queries to optimize performance. Consider the specific data points you need for analysis and refine your queries accordingly. This not only enhances the speed of data retrieval but also reduces the strain on resources, contributing to a more efficient analytics pipeline. Perform Data Cleaning and Transformation Power BI provides robust tools for data cleaning and transformation. Ensure that your data is in pristine condition before it enters the analytics ecosystem. Address inconsistencies, remove duplicates, and transform raw data into a format that enhances analytical capabilities. Utilize Indexing and Partitions Apply indexing and partitioning to accelerate data retrieval. This is especially crucial when dealing with large datasets. Both enhance the efficiency of querying and reporting, resulting in a smoother analytics experience. Monitor and Fine-Tune Regularly monitor the performance of your analytics pipeline. Identify bottlenecks, assess resource utilization, and fine-tune configurations as needed. This proactive approach ensures that your integration remains resilient. Plan for Scalability Anticipate future data growth and build in room to scale. As your business expands, the volume of data will likely increase. Ensure that your analytics infrastructure is equipped to handle this growth without compromising performance or data integrity. 
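As a concrete illustration of the "remove duplicates" item above, here is a minimal sketch using only the Python standard library. The field names are invented; in a real pipeline you would typically perform this step in Power Query, but the logic is the same.

```python
# Sketch of the deduplication step: keep the first occurrence of each
# unique key combination. Rows are dicts, as they might arrive from an
# ODBC result set; the field names are made up for illustration.
def deduplicate(rows, key_fields):
    """Return rows with later duplicates (by key_fields) removed."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

leads = [
    {"email": "a@example.com", "source": "webinar"},
    {"email": "a@example.com", "source": "webinar"},   # duplicate
    {"email": "b@example.com", "source": "newsletter"},
]
clean = deduplicate(leads, ["email", "source"])
print(len(clean))  # 2
```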
Security and Compliance: Safeguarding Your Marketing Data The integration of HubSpot and Power BI, made possible with the [Devart ODBC Driver for HubSpot](https://www.devart.com/odbc/hubspot/) , prioritizes data protection and compliance, offering a robust framework to safeguard your valuable marketing insights. Encryption Protocols Devart ODBC Drivers employ industry-standard encryption protocols to secure data transmission between HubSpot and Power BI. This cryptographic layer ensures that sensitive information remains confidential, shielding it from unauthorized access during the integration process. Secure Authentication Mechanisms Authentication is the first line of defense against potential threats. Devart ODBC Drivers offer secure authentication mechanisms, ensuring that only authorized personnel can access and manipulate the integrated data. This layer of security provides an additional barrier against unauthorized intrusions. Regulatory Compliance Whether it’s GDPR, HIPAA, or other regulatory frameworks, the integration ensures that your marketing data is managed in accordance with the relevant guidelines. Regular Security Audits Conduct regular security audits to assess the robustness of your integrated analytics. Periodic evaluations help identify vulnerabilities and potential security gaps, allowing you to proactively address and fortify your data protection. Take the Next Step: Download Devart ODBC Drivers To unlock the full potential of your HubSpot and Power BI integration, it’s crucial to have the right tools at your disposal. Take the next step in transforming your marketing analytics with the powerful Devart ODBC Drivers. Your journey starts with a click. [Download Devart ODBC Drivers](https://www.devart.com/odbc/hubspot/download.html) and embark on a path where data becomes a strategic asset, driving your business toward success. 
Tags [Hubspot integration](https://blog.devart.com/tag/hubspot-integration) [odbc](https://blog.devart.com/tag/odbc) [odbc driver](https://blog.devart.com/tag/odbc-driver) [odbc driver for hubspot](https://blog.devart.com/tag/odbc-driver-for-hubspot) [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations."} {"url": "https://blog.devart.com/enjoy-creating-complex-queries-with-dbforge-query-builder-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Enjoy Creating Complex Queries with SQL Query Builder! By [dbForge Team](https://blog.devart.com/author/dbforge) May 18, 2010 [0](https://blog.devart.com/enjoy-creating-complex-queries-with-dbforge-query-builder-for-sql-server.html#respond) 4300 The Devart development team is glad to present [dbForge Query Builder for SQL Server](https://www.devart.com/dbforge/sql/querybuilder/) , a modern and affordable tool that will bring you confidence in creating queries of any complexity. With dbForge Query Builder, Devart continues its initiative to deliver efficient database experiences to everyone in the SQL Server world. Designed specifically for people who work with databases, dbForge Query Builder blends rich functionality and a simple user interface to provide: Visual query creation Users can quickly build any SELECT statements containing multiple tables, joins, conditions, and subqueries without typing any code. The state-of-the-art diagram visually presents all the elements of the query. Tables can be quickly added to a query via drag-and-drop. Using the powerful expression editor, the user can effortlessly control columns, aliases, and functions in one place, set composite WHERE and HAVING conditions, and quickly change grouping and ordering. The queries can be saved to a file. 
Advanced SQL handling To speed up coding and script development, dbForge Query Builder offers the [SQL editor](https://www.devart.com/dbforge/sql/studio/sql-editor.html) with the following features: Syntax highlighting during typing Automatic syntax check Code outlining with the ability to create user-defined outlining regions Code commenting in one click Incremental search Extended options for code formatting SQL history Script navigation with the Document Outline window Enhanced work with data dbForge Query Builder offers new ways to analyze and process retrieved data efficiently. To gain better insight into query results, users can group, filter, and sort data in the grid, view data rows as neat cards, print the selected data, display large volumes of data in paginated mode, and work with binary and long text data fields using LOB (Large Object) windows. The auto-search mode can save hours of precious time spent finding the required data. Quick control of the data update process is guaranteed with both cached and write-through update modes, and connection-level transactions. Customers can download a [free 30-day trial version](https://www.devart.com/dbforge/sql/querybuilder/download.html) and try the tool with their own databases. The Devart development team looks forward to your comments and suggestions on the [dbForge Query Builder feedback page](https://www.devart.com/dbforge/sql/querybuilder/feedback.html) . Learn how to use SQL Query Builder with the help of short videos [here](https://www.devart.com/dbforge/sql/querybuilder/resources.html) . 
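To make "queries of any complexity" tangible, the following sketch shows the kind of SELECT (multiple tables, a join, grouping, a HAVING condition) that the visual builder assembles for you on the diagram. The schema is invented, and the query runs against an in-memory SQLite database via Python's sqlite3 purely so the example is self-contained; the tool itself targets SQL Server.

```python
# Illustrative only: a tiny invented schema and the sort of multi-table
# query you would normally assemble visually instead of typing by hand.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# Join, grouping, and a composite HAVING condition in one statement.
rows = conn.execute("""
    SELECT c.name, COUNT(o.id) AS order_count, SUM(o.total) AS revenue
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.total) > 100
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Acme', 2, 350.0)]
```

Globex is filtered out by the HAVING clause because its order total (75.0) does not exceed 100.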
Tags [query builder](https://blog.devart.com/tag/query-builder) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/enjoy-the-updated-code-compare-now-compatible-with-visual-studio-2022.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) 
[What’s New](https://blog.devart.com/category/whats-new) Enjoy the Updated Code Compare, Now Compatible With Visual Studio 2022! By [dbForge Team](https://blog.devart.com/author/dbforge) May 20, 2024 [0](https://blog.devart.com/enjoy-the-updated-code-compare-now-compatible-with-visual-studio-2022.html#respond) 1279 Meet the newly updated [Devart Code Compare](https://www.devart.com/codecompare/) , a smart and flexible tool that provides you with the easiest possible way of comparing and merging source code, files, and folders. Now you can install and integrate it with Visual Studio 2022 and experience all of its capabilities right there! You can access Code Compare from the Tools menu > Code Compare . Key features of Code Compare File comparison First and foremost, you can use Code Compare to find differences between any blocks of text. The comparison window is divided into two panes—source and target—and the differences are shown with convenient highlighting. Source code comparison As for your source code, it can be compared just as effectively. In this respect, Code Compare supports lexical comparison of code in major programming languages, including C#, C++, Visual Basic, JavaScript, Java, and XML. You get syntax highlighting, matching of similar lines, and code outlining for collapsing and expanding regions of code. Here is a quick example of Code Compare showing a single difference between two similar SQL queries. It can be instantly merged both ways with a click on the corresponding blue arrow button. Folder comparison Besides blocks of text/code and files, Code Compare allows comparing and merging entire folders. Here, you get color coding for added, deleted, and modified files, filtering options for excluding or including certain file types, quick opening of individual files, and batch copying of files to another pane or to a selected folder. Version control integration We’ve saved the best for last. 
Code Compare helps you merge files and resolve conflicts in a variety of version control systems that support external comparison tools, including Git, Mercurial, Perforce, TFS, [and a few others](https://www.devart.com/codecompare/integration.html) . It has also never been easier to quickly resolve conflicts using tools like Sourcetree. Download Code Compare for a free 30-day trial today! Want to check it in action? [Download Code Compare for a free 30-day trial](https://www.devart.com/codecompare/download.html) , make sure that you’ve selected the Integrate into Visual Studio 2022 checkbox during the installation, and give it a go! Working with Code Compare is just as easy and simple as it sounds! Tags [Code Compare](https://blog.devart.com/tag/code-compare) [Visual Studio 2022](https://blog.devart.com/tag/visual-studio-2022) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/ensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Ensuring Quality of dbForge Schema Compare for SQL Server, v 2.0 By [dbForge Team](https://blog.devart.com/author/dbforge) February 2, 2010 [2](https://blog.devart.com/ensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html#comments) 3010 The most important characteristic of any software is its quality. Quality, according to the model specified in the ISO 9126 standard, consists of the following characteristics: Functionality Reliability Ease of use Efficiency Maintainability Portability Of course, users are not concerned with quality models; they simply tell us their suggestions and wishes, and these are the best possible description of any quality characteristic. Below is a list of the main aspects we paid special attention to when developing [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) in order to deliver software of the highest quality to our users. Dependencies. 
One of the key difficulties during object synchronization is caused by dependencies between and within objects. These dependencies are built based on the information received from the server, but frequently there is not enough of it. To address this, we implemented and tested a parser that looks for missing dependencies. Thanks to the parser, all existing dependencies between database objects are found. So users can be sure that dbForge Schema Compare for SQL Server will be able to synchronize any object regardless of how many objects reference it, how they reference it, and which objects it references. Describe queries optimization. After implementing support for all planned objects, we decided to study describe queries properly (these are the queries sent to the server to obtain object information) and, where possible, optimize their execution speed to avoid performing extra actions and to complete the necessary ones as quickly as possible. During this work, we found many defects and errors, and rewrote some queries or parts of them, which increased describe performance several times over. So users can be sure that dbForge Schema Compare for SQL Server does not perform any unnecessary actions and does not waste their precious time or server and desktop resources. Load Testing (Performance Testing). After query optimization, we set out to learn the full capabilities of dbForge Schema Compare and decided to perform load testing. For this testing, we created an application that could generate scripts for different objects in unlimited numbers according to a preset template. Thanks to the load testing, we can claim that dbForge Schema Compare for SQL Server can synchronize large amounts of data. So users can be sure that dbForge Schema Compare for SQL Server will cope with synchronizing large numbers of objects, comparing large backups, generating large synchronization scripts, etc. Autotesting. 
Because the amount of test data and the requirements for product quality were constantly growing, we decided to create autotests. To accomplish this, we created a special application that integrates into dbForge Schema Compare for SQL Server and performs preset actions automatically. For example, it creates specified databases, generates a synchronization script, and checks whether the script was generated according to the model. At present there are approximately 4,000 such autotests, and their number is constantly growing. These autotests are run for each build. So users can be sure that every build of dbForge Schema Compare they download is more reliable and stable than the last. Using users’ databases for testing. We have a set of databases sent by our users. Some users send us their own databases to help us reproduce a problem they encountered and fix the bug. We perform thorough application testing on users’ databases and make a new build with fixes for all the bugs, if any were found. So users can be sure that, when synchronizing their databases, all bugs found in dbForge Schema Compare for SQL Server will be fixed and the database will be synchronized successfully. Usability testing. The convenience of using dbForge Schema Compare for SQL Server also means a lot to us. That’s why we discuss ways of improving the UI and its elements. We often draw on the experience of Microsoft, a member of the Cambridge Usability Group that has its own laboratories for studying such problems and is an absolute leader in usability questions. The new interface is created according to Microsoft’s recommendations – for example, the arrangement of elements in wizards and dialog windows, the structure of the main and popup menus, etc. So users need minimal time to learn how to work with dbForge Schema Compare for SQL Server, and, of course, it looks very familiar to Microsoft Visual Studio users. Configuration testing. 
We performed configuration testing on operating systems of the Windows family (2000, XP, 2008 Server, Vista, 7) of different capacities (x32 and x64), and on different SQL Server versions (2000, 2005, 2008, including Express Edition). dbForge Schema Compare for SQL Server was also tested on different hardware – processors, monitors, etc. So users can be sure of the reliability of dbForge Schema Compare for SQL Server regardless of the hardware, the OS, its capacity and settings, and the server version and edition they are using. dbForge Schema Compare for SQL Server also works with large fonts, which is very important for users with disabilities. Help. This is the single document where you can find descriptions of all options and settings. A full and comprehensive product description is guaranteed by thorough testing performed by our specialists. For example, to get help on any comparison or synchronization option, it is enough to press F1 in the window where it is situated and find it in the help window that opens. So users can be sure that, owing to the full and comprehensive help system of dbForge Schema Compare for SQL Server, they will be able to find an answer to any question and learn the product functionality in the shortest possible time. Support. We appreciate users’ wishes, because we are working for them. No user wishes or suggestions are ignored. In dbForge Schema Compare for SQL Server v 2.0, all user wishes received since its first release were taken into account. So users can be sure that we appreciate and implement their wishes and look forward to receiving feedback to make dbForge Schema Compare for SQL Server even better. 
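The Dependencies section above explains that synchronization must respect references between objects. As an illustrative sketch (not the tool's actual algorithm), a topological sort over a made-up dependency map shows how a safe creation order falls out once all dependencies are known:

```python
# Sketch: once every reference between objects is discovered, ordering
# them so each object is created before anything that depends on it is
# a topological sort. Object names and dependencies are invented.
def creation_order(depends_on):
    """depends_on maps object name -> set of objects it references."""
    order, resolved = [], set()
    while len(order) < len(depends_on):
        progressed = False
        for obj, deps in depends_on.items():
            if obj not in resolved and deps <= resolved:
                order.append(obj)
                resolved.add(obj)
                progressed = True
        if not progressed:
            raise ValueError("circular dependency detected")
    return order

deps = {
    "dbo.Orders": {"dbo.Customers"},                         # FK reference
    "dbo.Customers": set(),
    "dbo.vw_TopCustomers": {"dbo.Customers", "dbo.Orders"},  # view
}
print(creation_order(deps))
```

Here the table with no dependencies comes first, then the table referencing it, then the view built on both.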
Tags [Schema Compare](https://blog.devart.com/tag/schema-compare) [SQL Server](https://blog.devart.com/tag/sql-server) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge) Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html) [Twitter](https://twitter.com/intent/tweet?text=Ensuring+Quality+of+dbForge+Schema+Compare+for+SQL+Server%2C+v+2.0&url=https%3A%2F%2Fblog.devart.com%2Fensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/ensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html&title=Ensuring+Quality+of+dbForge+Schema+Compare+for+SQL+Server%2C+v+2.0) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/ensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html&title=Ensuring+Quality+of+dbForge+Schema+Compare+for+SQL+Server%2C+v+2.0) [Copy URL](https://blog.devart.com/ensuring-quality-of-dbforge-schema-compare-for-sql-server-v-2-0.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025 2 COMMENTS Eduardo Quintana April 19, 2010\t\t\t\t\t\t At\t\t\t\t\t\t 3:37 pm Hi! I am a user of your superb SQL Squema Compare product. 
Today I received an email alerting me to the availability of a new version of this product (2.0.120). I have installed the new version, but I would like to know what is ‘new’ or ‘corrected’ in this version, and I could not locate a page on your site that shows this. Would you please inform me how to find this piece of information? Regards, Eduardo Quintana

.jp, April 20, 2010 at 2:37 pm: You can see the change list (product history) of dbForge Schema Compare for SQL Server here: [https://www.devart.com/dbforge/sql/schemacompare/revision_history.html](https://www.devart.com/dbforge/sql/schemacompare/revision_history.html)

Comments are closed.

Entity Developer 6.0 – New ORM Designer for Telerik Data Access
By [dotConnect Team](https://blog.devart.com/author/dotconnect), March 24, 2016

As Telerik announced a few months ago [in their blog](http://www.telerik.com/blogs/data-access-vnext-is-our-most-powerful-version-yet), they deprecated the visual designer and Visual Studio tools for their [Data Access ORM](http://www.telerik.com/data-access) (formerly OpenAccess ORM), a very popular and powerful data access framework. Since the Q2 2015 version, Data Access NuGet packages work only with code-only mapping.
The deprecated [Telerik Data Access Visual Designer](http://docs.telerik.com/data-access/deprecated/feature-reference/tools/visual-designer/developemnt-environment-wizards-dialogs-model-tools-designer-designer) was very popular among Telerik Data Access users, and there are many user posts and comments on various websites demanding a visual model designer for this ORM. Seeing these requests, Devart decided to support Telerik Data Access in our ORM designer, Entity Developer. Entity Developer is a powerful ORM designer for several popular ORM solutions, and now we are glad to announce that it supports Telerik Data Access too! Entity Developer provides full support for Fluent Data Access mapping: it supports all kinds of inheritance, structures, composite IDs, etc. Our designer supports all the familiar visual designer features Telerik Data Access users are accustomed to, and many more, and it can be used both as a seamlessly integrated Visual Studio add-in and as a standalone application.

Easy to Start, Easy to Use

To start using our designer, just create a new Devart Data Access model via the Create Model Wizard, or open an existing .rlinq model that was created with the deprecated Visual Designer from the Data Access Visual Studio integration. To create a Devart Data Access model for a Visual Studio project, right-click the project in Solution Explorer and choose Add -> New Item from the shortcut menu. In the displayed dialog, select Devart Telerik Data Access Model, then specify the model name in the Name box and click OK. Note: in the standalone Entity Developer application, click New Model… on the File menu to open the wizard. The Create Model Wizard is displayed. Follow the wizard steps to create a database connection, select the necessary tables, configure naming rules and other model settings, choose code generation templates and their options, and automatically download and install the Telerik.DataAccess.Fluent NuGet package.
In this wizard you can set the default parameters for the generated classes: default assembly and namespace, default schema, default identity generator, etc. You can also enable or disable automatic detection of many-to-many associations and TPT inheritances when creating the model. Entity Developer allows you to generate mapping containing all the schema details, or to preserve only the cross-database schema information in order to create database-independent models. Alternatively, you may disable storing database schema information in the mapping altogether. After the model is created, Entity Developer can immediately generate class and mapping files for Telerik Data Access, and the generated code is ready to use in your project. The model is saved as an Entity Developer for Telerik Data Access model file with the .daml extension.

To create a Devart Data Access model from an existing .rlinq file that was created with the Visual Designer from the Data Access Visual Studio integration, just open the .rlinq file with Entity Developer and save the resulting model to a separate .daml file. The .rlinq file is not modified; it is only used as a source for the Entity Developer model. All the model entities, their relations, and mapping are copied to the resulting Entity Developer model. After this, you only need to add the new .daml model file to your project instead of the .rlinq file of the deprecated Telerik Visual Designer model.

Generated Code Example

Entity Developer generates the model context class, the Fluent mapping class, and POCO classes for model entities.
For example, here is the generated code of the model context class (generic type arguments, stripped during page extraction, are restored from context):

```csharp
public partial class NorthwindDataAccessModel : OpenAccessContext
{
    private static BackendConfiguration backend = GetBackendConfiguration();
    private static MetadataSource metadataSource = new NorthwindDataAccessModelMetadataSource();
    private static string connectionString = @"NorthwindDataAccessModelConnectionString";

    public NorthwindDataAccessModel() :
        base(connectionString, backend, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(BackendConfiguration backendConfiguration) :
        base(connectionString, backendConfiguration, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(string connection) :
        base(connection, backend, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(string connection, string cacheKey) :
        base(connection, cacheKey, backend, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(string connection, MetadataSource metadataSource) :
        base(connection, backend, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(string connection, BackendConfiguration backendConfiguration, MetadataSource metadataSource) :
        base(connection, backendConfiguration, metadataSource)
    {
        OnCreated();
    }

    public NorthwindDataAccessModel(string connection, string cacheKey, BackendConfiguration backendConfiguration, MetadataSource metadataSource) :
        base(connection, cacheKey, backendConfiguration, metadataSource)
    {
        OnCreated();
    }

    public IQueryable<Category> Categories
    {
        get
        {
            return this.GetAll<Category>();
        }
    }

    public static BackendConfiguration GetBackendConfiguration()
    {
        BackendConfiguration backend = new BackendConfiguration();
        backend.Backend = "MsSql";
        backend.ProviderName = "System.Data.SqlClient";

        CustomizeBackendConfiguration(ref backend);
        return backend;
    }

    static partial void CustomizeBackendConfiguration(ref BackendConfiguration config);
    partial void OnCreated();
}
```

Generated fluent mapping code:

```csharp
public partial class NorthwindDataAccessModelMetadataSource : FluentMetadataSource
{
    protected override void SetContainerSettings(MetadataContainer container)
    {
        container.Name = "NorthwindDataAccessModel";
        container.DefaultNamespace = "WindowsFormsApplication1";
        container.DefaultMapping.NullForeignKey = true;
        OnSetContainerSettings(container);
    }

    protected override IList<MappingConfiguration> PrepareMapping()
    {
        List<MappingConfiguration> mappingConfigurations = new List<MappingConfiguration>();
        mappingConfigurations.Add(this.GetCategoryMappingConfiguration());

        OnPrepareMapping(mappingConfigurations);
        return mappingConfigurations;
    }

    #region Category Mapping

    public MappingConfiguration<Category> GetCategoryMappingConfiguration()
    {
        MappingConfiguration<Category> configuration = this.GetCategoryClassConfiguration();
        this.PrepareCategoryConfigurations(configuration);
        this.OnPrepareCategoryConfigurations(configuration);
        return configuration;
    }

    public MappingConfiguration<Category> GetCategoryClassConfiguration()
    {
        MappingConfiguration<Category> configuration = new MappingConfiguration<Category>();
        configuration.MapType(x => new { }).WithConcurencyControl(OptimisticConcurrencyControlStrategy.Changed).ToTable("dbo.Categories");
        return configuration;
    }

    public void PrepareCategoryConfigurations(MappingConfiguration<Category> configuration)
    {
        configuration.HasProperty(x => x.CategoryID).ToColumn(@"CategoryID").IsIdentity(KeyGenerator.Autoinc).WithOpenAccessType(OpenAccessType.Int32).HasColumnType("int").IsNotNullable().HasPrecision(10);
        configuration.HasProperty(x => x.CategoryName).ToColumn(@"CategoryName").WithOpenAccessType(OpenAccessType.Varchar).HasColumnType("nvarchar").IsNotNullable().HasLength(15).IsUnicode();
        configuration.HasProperty(x => x.Description).ToColumn(@"Description").WithOpenAccessType(OpenAccessType.Varchar).HasColumnType("nvarchar").IsNullable().IsUnicode();
        configuration.HasProperty(x => x.Picture).ToColumn(@"Picture").WithOpenAccessType(OpenAccessType.VarBinary).HasColumnType("varbinary").IsNullable();
    }

    partial void OnPrepareCategoryConfigurations(MappingConfiguration<Category> configuration);

    #endregion

    #region Extensibility Method Definitions

    partial void OnSetContainerSettings(MetadataContainer container);
    partial void OnPrepareMapping(List<MappingConfiguration> mappingConfigurations);

    #endregion
}
```

And here is the code generated for the Category model class:

```csharp
public partial class Category
{
    public Category()
    {
        OnCreated();
    }

    public virtual int CategoryID { get; set; }

    public virtual string CategoryName { get; set; }

    public virtual string Description { get; set; }

    public virtual byte[] Picture { get; set; }

    #region Extensibility Method Definitions

    partial void OnCreated();

    #endregion
}
```

Design Approaches

With Entity Developer you can use the [Model-First](https://www.devart.com/entitydeveloper/model-first.html) and [Database-First](https://www.devart.com/entitydeveloper/database-first.html) approaches to design your ORM model and generate code for it. It introduces new approaches for designing ORM models, boosts productivity, and facilitates the development of database applications. The Update From Database and Update To Database wizards detect all the database changes that can affect the model, e.g. created and deleted tables and views, their columns and foreign keys, column data type changes, etc. All changes are displayed in an easy-to-understand form, and you can select only a subset of the changes to apply. Entity Developer also includes the Generate Database Wizard, which generates a DDL script that creates the database tables.

Tweaking Configuration

Entity Developer for Data Access is also capable of generating the Data Access configuration.
You may tweak the Data Access configuration settings in the Model Settings dialog box and then generate the configuration code either to the App.config file of the project or directly to the context class code. You can even disable configuration generation and create the configuration yourself in the partial CustomizeBackendConfiguration(ref BackendConfiguration config) method. If the configuration is generated as context class code, it will look like the following:

```csharp
public static BackendConfiguration GetBackendConfiguration()
{
    BackendConfiguration backend = new BackendConfiguration();
    backend.Backend = "MsSql";
    backend.ProviderName = "System.Data.SqlClient";
    backend.Runtime.AllowReadAfterDelete = true;
    backend.Runtime.AllowReadAfterDispose = true;
    backend.Runtime.ClassBehavior = DataAccessKind.ReadWrite;

    CustomizeBackendConfiguration(ref backend);
    return backend;
}
```

If it is generated to the config file, the corresponding XML configuration section is added to App.config instead.
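For illustration, here is a minimal sketch of a user-side implementation of the partial CustomizeBackendConfiguration method. It assumes the generated NorthwindDataAccessModel context shown earlier; the Runtime properties used are the ones that appear in the generated configuration example, and the file itself is a hypothetical hand-written addition, not designer output.

```csharp
// Hedged sketch: a hand-written partial class file placed alongside the
// generated NorthwindDataAccessModel. The designer does not overwrite it.
public partial class NorthwindDataAccessModel
{
    // Implements the partial method declared in the generated context class,
    // so runtime behavior can be adjusted without touching generated code.
    static partial void CustomizeBackendConfiguration(ref BackendConfiguration config)
    {
        config.Runtime.AllowReadAfterDelete = true;
        config.Runtime.AllowReadAfterDispose = true;
    }
}
```

Because CustomizeBackendConfiguration is a partial method, leaving it unimplemented simply removes the call at compile time, so this hook costs nothing when unused.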
Rich Mapping Functionality

Entity Developer for Data Access supports almost all features of the Data Access Fluent Mapping API. It supports mapping an entity to several tables; one-to-many, one-to-one, and many-to-many associations; complex types; composite IDs; all kinds of inheritance hierarchies; enum types; etc. It provides wide support for different kinds of stored procedures and functions, allowing you to create, from the stored routines, both methods with no result or a scalar result and methods returning datasets as complex types or entities.

Queries and Data

When designing, and especially when debugging, a model, it is often necessary to view table data or fill tables with some test data. Additionally, it is very convenient to be able to check your model and mapping by querying data via the ORM. Entity Developer allows viewing and editing the data of tables, views, and model entities, and creating and executing LINQ queries against the model, eliminating the need for additional applications and reducing the time these operations take. You can see more details about working with queries and data in Entity Developer [here](https://www.devart.com/entitydeveloper/data.html).

Productivity

Entity Developer provides powerful features to automate or speed up common model editing operations. To accelerate the model design process, it provides wide drag-and-drop support. After you have established a database connection, you may drag database tables and views from the Database Explorer window to your model diagram to create classes for these tables with already defined mapping. Advanced model refactoring features go even further, allowing operations such as creating a TPC inheritance hierarchy from a group of classes, or extracting common properties from several classes into a complex type, to be performed almost instantly.
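To give a feel for querying the model via the ORM, here is a rough sketch of the kind of LINQ query you could run against the generated model, assuming the NorthwindDataAccessModel context and Category class from the earlier example. This is an illustrative assumption, not verbatim tool output.

```csharp
// Hedged sketch: querying the model through the generated context,
// the same kind of LINQ that can be composed in the designer's query window.
using (var context = new NorthwindDataAccessModel())
{
    var described =
        from c in context.Categories
        where c.Description != null
        orderby c.CategoryName
        select new { c.CategoryID, c.CategoryName };

    foreach (var category in described)
        Console.WriteLine("{0}: {1}", category.CategoryID, category.CategoryName);
}
```

The `using` block disposes the OpenAccessContext when the query is done, which is the usual pattern for short-lived contexts.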
The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

4 Comments

Aron, October 10, 2016 at 4:26 pm: Where is the link to the product/download/order page?

Jeffrey Monroe, November 16, 2016 at 7:17 pm: Thanks for taking up the gap left by Telerik in regards to the visual tools for Data Access. I do have two questions though: 1. Is there a setting to generate the SchemaUpdate method as referenced here: [http://docs.telerik.com/data-access/developers-guide/code-only-mapping/getting-started/how-to-create/fluent-mapping-getting-started-migrate-database](http://docs.telerik.com/data-access/developers-guide/code-only-mapping/getting-started/how-to-create/fluent-mapping-getting-started-migrate-database) 2. Is there a way to create additional indexes on the generated tables? Thanks

Muthu Kumar, June 6, 2017 at 5:45 pm: We have been using Telerik Data Access. Can we use your tool to migrate from Telerik OpenAccess?

Shalex, December 25, 2018 at 4:20 pm: 0. The product page is [https://www.devart.com/entitydeveloper/](https://www.devart.com/entitydeveloper/). 1. A predefined Data Access template for the Telerik Data Access Model (*.daml) includes the Generate Partial Class property (False by default). Set it to True; as a result, a partial class will be generated for each class of the model, in which you can add code that won’t be overwritten by the designer. Place your code in this partial class. 2. Creation of indexes is discussed at [https://forums.devart.com/viewtopic.php?t=29722](https://forums.devart.com/viewtopic.php?t=29722). 3.
To migrate from the Telerik Data Access Visual Designer, please open an existing .rlinq model in Entity Developer; our tool will convert it to the *.daml format used by Entity Developer.

Comments are closed.

Entity Developer 6.12 with More Data Types, Template Improvements, and More
By [dotConnect Team](https://blog.devart.com/author/dotconnect), September 29, 2021

The new release of [Entity Developer 6.12](https://www.devart.com/entitydeveloper/) mainly focuses on Entity Framework Core support improvements that are based on the feedback of our users. We will continue to work on these features, and some of them may also be ported to other ORMs in the future.

Entity Framework Core 6

Previously, EF Core 6 supported both .NET 5 and .NET 6 in its first preview versions. Previous Entity Developer versions supported EF Core 6, and you could design EF Core 6 models both in Entity Developer integrated into Visual Studio 2019 Preview and in the standalone and console versions. Currently, EF Core 6 has discontinued .NET 5 support and supports only .NET 6. .NET 6 support has also been removed from Visual Studio 2019 Preview and is currently available only in Visual Studio 2022; the latter doesn’t have a stable version yet and has Preview status. Entity Developer doesn’t yet support integration into Visual Studio 2022, so for now it supports EF Core 6 models only in the standalone and console versions.

Support for New Types

We continue extending the list of types supported in EF Core models.
The previous Entity Developer version introduced support for the IPAddress and PhysicalAddress types. The new 6.12 version adds Uri to the list of supported types (in the Other Types section). The Uri type is supported for EF Core 3, EF Core 5, and EF Core 6 models. EF Core providers that support this type usually map it to a string data type in the database (varchar, text, etc.). .NET 6 also introduced the new DateOnly and TimeOnly types, which can be more suitable in some cases than the more universal classic DateTime and TimeSpan types. Some EF Core 6 providers have started to support these types, so the new Entity Developer version supports them too. The DateOnly and TimeOnly types are now available in the Primitive Types section.

The list of supported types in Entity Developer has grown significantly over time, but the corresponding Type list in the Property Editor dialog box was small by default. It has been enlarged significantly in the new Entity Developer version, making selection of a property type more convenient.

(Screenshot: Selecting the property type)

Storing the Model Connection

Previously, a model connection string was always stored in one of the model files (specifically, in an .edps file). However, in a number of cases this approach is not suitable for security reasons. This applies to cases when the model is stored in a code repository (Git/Mercurial/SVN/etc.) and more people have access to this repository than should have access to the connection string. Previously, the only security feature allowing you to limit the sensitive information stored in the model was the Persist Security Info parameter: if it was set to False, the password was removed from the connection string when saving the model. This is not enough in many cases, so the new Entity Developer version offers additional security features. Now you can discard saving the connection string in the model file completely.
This offers the highest security, but can be excessive and inconvenient if you need to connect to the database often, for example, to sync model changes with the database or vice versa. That’s why we have added support for storing the connection string in Entity Developer storage for EF Core models. The connection string is still linked to the model, but it is stored separately, in the Entity Developer settings directory for the current user: %APPDATA%\Devart\Entity Developer\Storage\ Connection strings are stored in the following files: %APPDATA%\Devart\Entity Developer\Storage\.xml

As an alternative, you can also store the connection string in an environment variable of the current user. This alternative is suitable in many cases, but not universal: it may not fit cases with very long connection strings, or users who already have a lot of environment variables.

(Screenshot: Storing the model connection string)

As for our future plans for this feature, we are considering adding at least partial support for the ASP.NET Core User Secrets feature. This task has its own complications, because that feature is intended only for .NET Core applications and is closely associated with Visual Studio projects.

Code Generation Template Improvements

Repository and Unit of Work Template

The Repository and Unit of Work template has a new “Generate UnitOfWork Repository Properties” property for EF Core models (True by default). Before this property was introduced, the generated IUnitOfWork interface had only the Save() method, and obtaining repositories (which could be implemented in multiple ways) was left to the user.
```csharp
public partial interface IUnitOfWork : IDisposable
{
    void Save();
}
```

Now, if you set “Generate UnitOfWork Repository Properties” to True, the IUnitOfWork interface provides access to all repositories (generic type arguments, stripped during page extraction, are restored from context):

```csharp
public partial interface IUnitOfWork : IDisposable
{
    IRepository<Emp> Emps { get; }
    IRepository<Dept> Depts { get; }
    void Save();
}
```

Data Transfer Object Template

The Data Transfer Object (DTO) template now has the new “Validation Framework” and “Validation Error Messages” properties for all ORMs. The main EF Core template had them before, for generating the DataAnnotation attributes for model classes. Now you can easily generate them for DTO classes as well.

```csharp
public partial class DeptDto
{
    [Key]
    [Required()]
    public int Deptno { get; set; }

    [StringLength(14)]
    public string Dname { get; set; }

    [StringLength(13)]
    public string Loc { get; set; }

    public List<EmpDto> Emps { get; set; }
}
```

EF Core Template

One of the unique EF Core model features is support for enum types defined in the model, and generating code for them. However, when such enums are used as property types of entities and DTO classes, there may be a problem with the DTO template if output is generated to different projects. For example: the EF Core template generates everything to a project A; the Data Transfer Objects template generates DTO classes to another project B that doesn’t reference A; and the Data Transfer Objects template generates DTO converter classes to a project C, which references A and B. In this case, there is a need to specify output for enums: in the above case, we need to generate the enums in a new project D and add references to it in all the above projects. For this purpose, we have added the new “Enum Output” property to the EF Core template, which allows you to specify the output for generated enums.

Conclusion

Entity Developer development is heavily based on the [feedback](https://www.devart.com/entitydeveloper/feedback.html) of our users.
Please do not hesitate to share your use cases and suggestions, and we will try our best to provide a better user experience for you.
Entity Developer 6.7 with Console App, .NET Core Support Improvements and More
By [dotConnect Team](https://blog.devart.com/author/dotconnect), April 2, 2020

Devart is glad to announce the release of [Entity Developer 6.7](https://www.devart.com/entitydeveloper/), a visual ORM designer for a wide variety of ORMs. The new version comes with a console application for convenient automation of development processes, and it improves support for .NET Core development and third-party providers.

Console Application

The new 6.7 version comes with a console version of Entity Developer that allows you to automate your build and development processes and perform essential actions from the command line. This application, called ed.exe, is located in the Console subfolder of the Entity Developer application folder. For your convenience, the installer automatically adds this folder to the PATH environment variable, so you may call it from any folder without specifying the full path.
The console version of Entity Developer can:

- Generate a model from a database
- Validate a model
- Re-generate code from the model
- Generate a create database script from a model
- Generate an update database script from a model

Please try it and leave feedback. If you would like to see more features available via the console, email us or write on [our forum](https://forums.devart.com/viewforum.php?f=32), and we will consider extending the console application features.

Other Improvements

The new version of Entity Developer supports adding NHibernate models to Visual Studio projects targeting .NET Core or .NET Standard. Note that this requires NHibernate 5.1 or higher. Additionally, Entity Developer improves third-party provider support: when a third-party provider is added via NuGet, Entity Developer no longer requires it to be registered in the config file as before, and the standalone version of Entity Developer can now work with third-party providers from the NuGet package cache. A complete list of improvements can be found [here](https://www.devart.com/entitydeveloper/).
Entity Developer – EF Code First DbContext Template
By [dotConnect Team](https://blog.devart.com/author/dotconnect), May 17, 2011

Note: to use the DbContext template, Entity Framework 4.1 and Entity Framework 4.0 must be installed on your computer.

April 2011 saw the [release](http://blogs.msdn.com/b/adonet/archive/2011/04/11/ef-4-1-released.aspx) of a new version of [Entity Framework 4.1](http://msdn.microsoft.com/en-us/library/gg696172%28v=VS.103%29.aspx) that supports fluent mapping and the DbContext API. The latest version of [Devart Entity Developer](http://www.devart.com/entitydeveloper/) is extended with the DbContext template that enables the use of the new features in EF v4.1. Initially, fluent mapping was intended to be used in the Code-First (Code Only) approach. However, thanks to our new template, fluent mapping can now be used not only in the Code-First approach, but in the [Database-First](http://www.devart.com/entitydeveloper/database-first.html) and [Model-First](http://www.devart.com/entitydeveloper/model-first.html) approaches as well.

DbContext API

The new DbContext API is the major novelty in Entity Framework 4.1.
Being an ObjectContext wrapper, it is a more lightweight alternative to using [ObjectContext](http://msdn.microsoft.com/en-us/library/system.data.objects.objectcontext.aspx) directly, as in EF v1/EF v4. DbContext can be used both with fluent mapping and with XML mapping. We will not delve into the details of the new API here; for more information on DbContext, see the [MSDN](http://msdn.microsoft.com/en-us/library/system.data.entity.dbcontext%28v=vs.103%29.aspx) documentation and a number of publications on the [ADO.NET team blog](https://docs.microsoft.com/en-us/archive/blogs/adonet/).

In the Entity Developer template, you can define the following options of [DbContext configuration](http://msdn.microsoft.com/en-us/library/system.data.entity.infrastructure.dbcontextconfiguration%28v=vs.103%29.aspx):

- [AutoDetectChangesEnabled](http://blogs.msdn.com/b/adonet/archive/2011/02/06/using-dbcontext-in-ef-feature-ctp5-part-12-automatically-detecting-changes.aspx) – Determines whether automatic detection of changes in tracked entities is enabled. The default value is true.
- [ProxyCreationEnabled](http://blogs.msdn.com/b/adonet/archive/2011/02/02/using-dbcontext-in-ef-feature-ctp5-part-8-working-with-proxies.aspx) – Determines whether the framework creates instances of dynamically generated proxy classes whenever it creates an instance of an entity type. Note that even if proxy creation is enabled, proxy instances are only created for entity types that meet the requirements for being proxied. The default value is true.
- [ValidateOnSaveEnabled](http://blogs.msdn.com/b/adonet/archive/2010/12/15/ef-feature-ctp5-validation.aspx) – Determines whether tracked entities are validated automatically when SaveChanges() is called. The default value is true.

Fluent mapping

The DbContext template of Entity Developer contains the FluentMapping option that determines whether Entity Framework code-only mapping is included in the generated code or XML mapping from the .edml file is used.
The default value is false, which means that XML mapping is used. If FluentMapping is set to True, fluent mapping is generated instead. The generated mapping includes column facets, primary keys for tables, foreign keys, and complex types, and takes into account inheritance of all three types (TPH, TPT, and TPC) and simple entity splitting. Below is a brief example of generated mapping for a simple model consisting of three classes with Table Per Hierarchy (TPH) inheritance (the generic type arguments, stripped during publishing, are restored here from the region names):

```csharp
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    #region TphRoot

    modelBuilder.Entity<TphRoot>()
        .HasKey(p => new { p.Id })
        .Map(tph => {
            tph.Requires("ConditionColumn").HasValue("r");
            tph.ToTable("TPH_TABLE", "SCOTT");
        });
    // Properties:
    modelBuilder.Entity<TphRoot>()
        .Property(p => p.Id)
            .IsRequired()
            .HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity)
            .HasColumnType("int");
    modelBuilder.Entity<TphRoot>()
        .Property(p => p.Name)
            .HasMaxLength(16)
            .HasColumnType("VARCHAR2");

    #endregion

    #region TphChildA

    modelBuilder.Entity<TphChildA>()
        .Map(tph => {
            tph.Requires("ConditionColumn").HasValue("a");
            tph.ToTable("TPH_TABLE", "SCOTT");
        });
    // Property:
    modelBuilder.Entity<TphChildA>()
        .Property(p => p.PropertyA)
            .HasMaxLength(120)
            .HasColumnType("NVARCHAR2");

    #endregion

    #region TphChildB

    modelBuilder.Entity<TphChildB>()
        .Map(tph => {
            tph.Requires("ConditionColumn").HasValue("b");
            tph.ToTable("TPH_TABLE", "SCOTT");
        });
    // Property:
    modelBuilder.Entity<TphChildB>()
        .Property(p => p.PropertyB)
            .HasColumnType("BLOB");

    #endregion
}
```

Code-First Configuration Conventions

The DbContext template of Entity Developer contains the DisabledConventions option that
allows you to disable Code-First configuration conventions. The full Entity Framework Code-First list of conventions is quite large, so our DisabledConventions list does not include the various attribute conventions (for example, ColumnAttributeConvention, ComplexTypeAttributeConvention, and others) that users normally do not need to disable. One of the conventions that is disabled most often is [IncludeMetadataConvention](http://msdn.microsoft.com/en-us/library/system.data.entity.infrastructure.includemetadataconvention%28v=vs.103%29.aspx). This convention determines whether it is necessary to create/use/delete the “EdmMetadata” table that stores the model hash, which is used to check whether the model has changed since the database was created from it. The full list of Code-First configuration conventions and their descriptions is published on [MSDN](http://msdn.microsoft.com/en-us/library/system.data.entity.modelconfiguration.conventions%28v=VS.103%29.aspx).

Simultaneous Use of Several Databases

Previously, there was only one way to create an EF application that uses several databases, for example, Oracle + Microsoft SQL Server. First, we created several EF models, one model per DBMS. Then we put the following resources into the application: one CSDL, one MSL, and one SSDL for each database. Then, depending on the server, we changed the connection string that specified the different resources. Fluent mapping is a real alternative to the approach involving several EF models and different SSDL resources. Now it is enough to create one EF model in the Entity Developer application using the [Database-First](http://www.devart.com/entitydeveloper/database-first.html) or [Model-First](http://www.devart.com/entitydeveloper/model-first.html) approach. Following that, we remove the standard code generation template and enable the DbContext template.
In the template properties, we need to set FluentMapping = True and DatabaseIndependent = True. The DatabaseIndependent property determines whether database-specific data types are set when fluent mapping is generated. With the default value of False, the following code is generated:

```csharp
modelBuilder.Entity<Dept>()
    .Property(p => p.Dname)
        .HasColumnName("DNAME")
        .HasMaxLength(14)
        .HasColumnType("VARCHAR2");
modelBuilder.Entity<Dept>()
    .Property(p => p.NCharField)
        .HasColumnName("NCHAR_FIELD")
        .HasMaxLength(16)
        .HasColumnType("NCHAR");
```

When DatabaseIndependent is set to True:

```csharp
modelBuilder.Entity<Dept>()
    .Property(p => p.Dname)
        .HasColumnName("DNAME")
        .HasMaxLength(14)
        .IsUnicode(false);
modelBuilder.Entity<Dept>()
    .Property(p => p.NCharField)
        .HasColumnName("NCHAR_FIELD")
        .HasMaxLength(16)
        .IsFixedLength()
        .IsUnicode(true);
```

In the latter case, the EF provider is responsible for selecting a particular database-specific data type. The provider can be defined in several ways. For more information on how to define a provider-specific connection, see Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite.

Fluent Mapping Limitations

When using the Code-First development approach, remember that fluent mapping in Entity Framework 4.1 does not fully support all the capabilities provided by conventional XML mapping. Within fluent mapping, you can no longer use:

- stored procedures;
- compiled queries;
- complex entity splitting.

See also the following article on [MSDN: What’s Not Supported (Entity Framework 4.1)](http://msdn.microsoft.com/en-us/library/gg696165%28v=VS.103%29.aspx).

Besides, when dynamic database creation is used with the Code-First development approach, note that you cannot influence the name generation mechanism for:

- primary keys;
- foreign keys;
- indexes;
- constraints;
- sequences and triggers (Oracle only).
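Putting the multi-database scenario together, here is a minimal hedged sketch of how one fluently-mapped model might serve two databases. The MyEntities context name, the Depts set, and the connection strings are hypothetical names for illustration only, not generated output:

```csharp
// Hypothetical sketch: the same DbContext, generated with
// FluentMapping = True and DatabaseIndependent = True, is opened
// against two different DBMSes; each EF provider maps the
// database-independent facets to its own column types.
using (var oracleContext = new MyEntities(oracleConnectionString))
using (var sqlServerContext = new MyEntities(sqlServerConnectionString))
{
    var oracleDepts = oracleContext.Depts.ToList();
    var sqlServerDepts = sqlServerContext.Depts.ToList();
}
```

The design point is that only the connection string changes between the two databases; the model and mapping code are shared.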
See also

For more information on how to use fluent mapping (Code-First) in our EF providers, see Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite. You can also download samples there. In the latest version of our EF providers, we have implemented the possibility to configure the process of database creation and dropping more flexibly. The specifics of this configuration are described in the article New Features of Entity Framework Support in dotConnect Providers.

The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
Entity Framework 4 Release Candidate supported!
By [dotConnect Team](https://blog.devart.com/author/dotconnect), March 31, 2010

We have added support for the new functionality of Entity Framework 4, including the Entity Framework v4 Release Candidate, in [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/), [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/), and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/). In this article, we review the main newly supported capabilities in comparison with Entity Framework v1. Note that the new features of Entity Framework v4, such as Persistence Ignorance (POCO), self-tracking entities, Code Only, etc., which do not require support from provider writers, are not described here.

EF run-time

The new basic features of the Entity Framework v4 Release Candidate run-time help you create powerful database applications.

- DDL generation. You can now generate and execute a DDL script with the help of the CreateDatabase(), DropDatabase(), and CreateDatabaseScript() methods. You can use this approach if you have a model.
- New canonical functions. More than thirty new canonical functions have been added for different data providers. We have supported the new statistical and scalar functions.
- Translating String.StartsWith, String.EndsWith, and String.Contains to LIKE in LINQ to Entities. This functionality was successfully supported for both Entity Framework v4 and Entity Framework v1.
- Database-specific built-in functions in LINQ. Entity Framework v1 allows you to define database-specific functions in the provider manifest. These functions can be called with the help of Entity SQL. In Entity Framework v4, you can use these database-specific functions in LINQ with the help of a scalar method definition for every specific function.
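As a hedged illustration of the LIKE translation mentioned above (the Customers entity set and the context variable are hypothetical names used only for this sketch):

```csharp
// In LINQ to Entities, these string methods are translated to SQL LIKE:
var query = context.Customers
    .Where(c => c.Name.StartsWith("A")      // NAME LIKE 'A%'
             || c.Name.EndsWith("Ltd")      // NAME LIKE '%Ltd'
             || c.Name.Contains("soft"));   // NAME LIKE '%soft%'
```

The provider generates the LIKE pattern on the server side, so the filtering happens in the database rather than in memory.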
The classes representing aggregate and scalar-valued functions are described in the table below:

| Connector | Assembly | Scalar-valued functions | Aggregate functions |
|---|---|---|---|
| [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) | Devart.Data.Oracle.Entity.dll | OracleFunctions | OracleAggregateFunctions |
| [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) | Devart.Data.MySql.Entity.dll | MySqlFunctions | MySqlAggregateFunctions |
| [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) | Devart.Data.PostgreSql.Entity.dll | PgSqlFunctions | PgSqlAggregateFunctions |
| [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) | Devart.Data.SQLite.Entity.dll | SQLiteFunctions | SQLiteAggregateFunctions |

For more information about dynamic database creation and canonical functions, see the articles Dynamic Database Creation in Entity Framework and Entity Framework Canonical Functions.

Visual Studio

Model First. This feature was partially supported in VS 2010. We have written T4 templates for [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/), and [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) to work with it. So if you want to generate a DDL script, just select the appropriate T4 template in the model properties and define a connection string to the needed provider. The names of the DDL templates for the data providers are presented in the table below:

| Connector | DDL template name |
|---|---|
| [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) | Devart SSDLToOracle.tt |
| [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) | Devart SSDLToMySql.tt |
| [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) | Devart SSDLToPostgreSql.tt |

For more information about Model First, see the article [Model First with Entity Framework 4](http://blogs.msdn.com/adonet/archive/2009/11/05/model-first-with-the-entity-framework-4.aspx).
Entity Developer

Designing Entity Framework v4 models is supported in [Entity Developer](http://www.devart.com/entitydeveloper/), and code generation has been enhanced. ComplexTypes returned from stored procedures, FK Properties, and FK Associations are also supported.

See also:

- Entity Framework: SQL Generation Enhancements for IN Clause
- Dynamic Database Creation in Entity Framework for Oracle, MySQL, PostgreSQL and SQLite
Entity Framework 6.3 and .NET Core 3 Support

By [dotConnect Team](https://blog.devart.com/author/dotconnect), September 13, 2019

.NET Core 3

.NET Core was presented by Microsoft in 2016, but its 1.x versions had a limited set of features compared to the full .NET Framework. Since then, .NET Core has been drastically improved. .NET Core 2.0 covers a significant part of the full .NET Framework features and includes new functionality and significant performance optimizations. This year, the new .NET Core 3 is coming. It is currently available as [.NET Core 3.0 Preview 9](https://devblogs.microsoft.com/dotnet/announcing-net-core-3-0-preview-9/), and despite its preview status, Microsoft officially recommends it for full production use. Besides, Microsoft is signaling that the development of the full .NET Framework is frozen: version 4.8 will be the last, it will receive mostly fixes after the .NET Core 3.0 release, and .NET Core will become the recommended platform for all new applications immediately after release.
Meanwhile, the [Porting desktop apps to .NET Core](https://devblogs.microsoft.com/dotnet/porting-desktop-apps-to-net-core/) and [Port your code from .NET Framework to .NET Core](https://docs.microsoft.com/en-us/dotnet/core/porting/) articles have been posted on Microsoft resources. .NET Core 3 has so many new features that it is impossible to describe them all in one blog article. You can start learning about them by reading the [What’s new in .NET Core 3.0](https://docs.microsoft.com/en-us/dotnet/core/whats-new/dotnet-core-3-0) article. As for performance improvements, you can read the following articles:

- [Performance Improvements in .NET Core vs .NET Framework 4.7](https://devblogs.microsoft.com/dotnet/performance-improvements-in-net-core/)
- [Performance Improvements in .NET Core 2.1](https://devblogs.microsoft.com/dotnet/performance-improvements-in-net-core-2-1/)
- [Performance Improvements in .NET Core 3.0](https://devblogs.microsoft.com/dotnet/performance-improvements-in-net-core-3-0/)

Devart’s ADO.NET providers for Oracle, MySQL, PostgreSQL, and SQLite support all .NET Core versions, the previous 1.x and 2.x releases as well as the new .NET Core 3.0, and can be used on Windows, Linux, and macOS. They are available as the following NuGet packages:

- [Devart.Data.Oracle](https://www.nuget.org/packages/Devart.Data.Oracle/)
- [Devart.Data.MySql](https://www.nuget.org/packages/Devart.Data.MySql/)
- [Devart.Data.PostgreSql](https://www.nuget.org/packages/Devart.Data.PostgreSql/)
- [Devart.Data.SQLite](https://www.nuget.org/packages/Devart.Data.SQLite/)

We are currently studying which features to add to our providers for .NET Core 3. The features will be added based on the feedback of our users, so please visit [our forum](https://forums.devart.com/viewforum.php?f=41) or the feedback pages and post your feature requests there.

Entity Framework 6.3

Entity Framework 6 users stood apart from this rapid .NET Core development spurt.
Until recently, they could only migrate to .NET Core by migrating to [Entity Framework Core](https://docs.microsoft.com/en-us/ef/core/) at the same time. While Entity Framework Core has a decent degree of compatibility with classic Entity Framework 6, some Entity Framework 6 features are still not implemented, and some have significant differences in behavior and implementation (see the [Entity Framework Core and Entity Framework 6 comparison](https://docs.microsoft.com/en-us/ef/efcore-and-ef6/index)). Besides, fast .NET Core development causes a significant number of growing pains, where updates cause some previously working features to stop working (see [here](https://github.com/aspnet/EntityFrameworkCore/issues)). This holds back the migration of existing Entity Framework 6 projects to Entity Framework Core. Besides, the functionality differences make some developers choose the familiar Entity Framework 6 for new projects, too. The following diagram shows the NuGet.org download statistics and demonstrates that despite the rapid growth of interest in Entity Framework Core, Entity Framework 6 is still highly popular for use in new projects, and there are no signs of decline.

Entity Framework 6.3 is a real game-changer because it supports .NET Core 3 and opens a way both to migrate existing classic Entity Framework applications and to create new ones using .NET Core 3 with EF 6.3. Entity Framework 6.3 is currently at the pre-release stage of development ([Preview 9](https://devblogs.microsoft.com/dotnet/announcing-entity-framework-core-3-0-preview-9-and-entity-framework-6-3-preview-9/)), but it is already fully functional and can be used for migration to .NET Core 3.
With the appearance of Entity Framework 6.3, we announce the first major addition to the .NET Core related features of our ADO.NET providers for .NET Core 3: Entity Framework 6.3 support in our new NuGet packages:

- [Devart.Data.Oracle.EF6](https://www.nuget.org/packages/Devart.Data.Oracle.EF6/)
- [Devart.Data.MySql.EF6](https://www.nuget.org/packages/Devart.Data.MySql.EF6/)
- [Devart.Data.PostgreSql.EF6](https://www.nuget.org/packages/Devart.Data.PostgreSql.EF6/)
- [Devart.Data.SQLite.EF6](https://www.nuget.org/packages/Devart.Data.SQLite.EF6/)

These NuGet packages contain .NET Standard 2.1 assemblies compatible with the .NET Core 3 runtime. We tried hard to maintain all the existing features in order to ease the migration of existing projects to Entity Framework 6.3 and .NET Core 3. There are, however, some differences, described below, that must be considered during migration.

Provider Registration for Entity Framework 6.3

Registering an Entity Framework provider is different for Entity Framework 6.0 – 6.2 and for Entity Framework 6.3. Entity Framework 6.0 – 6.2 allows registering the Entity Framework provider in the app.config file of the application.
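For Entity Framework 6.0 – 6.2, such an app.config registration typically looks like the following hedged sketch. The invariant name, provider services type, and assembly name shown here are assumptions for illustration and should be verified against the dotConnect for Oracle documentation:

```xml
<entityFramework>
  <providers>
    <!-- Assumed invariant name and provider services type for dotConnect for Oracle -->
    <provider invariantName="Devart.Data.Oracle"
              type="Devart.Data.Oracle.Entity.OracleEntityProviderServices, Devart.Data.Oracle.Entity.EF6" />
  </providers>
</entityFramework>
```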
Entity Framework 6.3 does not use such registration. Instead, it uses two code-based approaches. The first one is to use the DbConfigurationType attribute. It is suitable only for DbContext. Here is an example for dotConnect for Oracle:

```csharp
[DbConfigurationType(typeof(Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration))]
public class MyContext : DbContext {

    // ...
}
```

The second way is to use the static SetConfiguration method of the DbConfiguration class. It can be used both with DbContext and ObjectContext. Here is an example for dotConnect for Oracle:

```csharp
public class MyContext : ObjectContext {

    static MyContext() {
        DbConfiguration.SetConfiguration(new Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration());
    }

    // ...
}
```

Both of the above examples use the OracleEntityProviderServicesConfiguration class, which suits most users. All our Entity Framework 6.3 providers have a similar class:

- Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration (dotConnect for Oracle)
- Devart.Data.MySql.Entity.MySqlEntityProviderServicesConfiguration (dotConnect for MySQL)
- Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServicesConfiguration (dotConnect for PostgreSQL)
- Devart.Data.SQLite.Entity.SQLiteEntityProviderServicesConfiguration (dotConnect for SQLite)

If the Entity Framework 6.3 provider configuration must be customized, you will need to create a descendant of the DbConfiguration class and set values via SetProviderFactory() and SetProviderServices() in it. Then use this custom class for provider registration.
Here is an example for dotConnect for Oracle:

```csharp
public class OracleConfiguration : DbConfiguration {

    public OracleConfiguration() {
        SetProviderFactory("Devart.Data.Oracle", Devart.Data.Oracle.OracleProviderFactory.Instance);
        SetProviderServices("Devart.Data.Oracle", Devart.Data.Oracle.Entity.OracleEntityProviderServices.Instance);
    }
}
```

Entity Framework Spatials Support for Entity Framework 6.3

Entity Framework Spatials are supported in dotConnect for Oracle, dotConnect for MySQL, and dotConnect for PostgreSQL. This support was implemented way back for Entity Framework 5 and Entity Framework 6 on the full .NET Framework. Most Entity Framework Spatials features supported for previous Entity Framework versions are supported for Entity Framework 6.3 and .NET Core 3 as well, with the following exceptions. The outdated SharpMap and the unsigned NetTopologySuite Entity Framework spatial services are not supported, because they don’t have .NET Core-compatible implementations. The signed NetTopologySuite is upgraded to version 1.15.3. However, since spatial features are not required by many Entity Framework users, our NuGet packages don’t reference the NetTopologySuite package. If you need to use NetTopologySuite, install its NuGet package manually via the Package Manager Console:

```
Install-Package NetTopologySuite -Version 1.15.3
```

Entity Framework spatial service registration has changed. Now it must be performed as early as possible, prior to Entity Framework 6.3 engine initialization. If you have code with Database.SetInitializer preceding the Entity Framework spatial service initialization that worked fine with Entity Framework 6.2, it won’t work with Entity Framework 6.3.
For example, the following code is correct for Entity Framework 6.2, but won’t work with Entity Framework 6.3:

```csharp
public class MyContext : DbContext {

    static MyContext() {
        Database.SetInitializer<MyContext>(null);

        var config = OracleEntityProviderConfig.Instance;
        config.SpatialOptions.SpatialServiceType = SpatialServiceType.NetTopologySuiteSigned;
        config.SpatialOptions.GeographyDistanceUnit = DistanceMeasurementUnit.Kilometer;
        config.SpatialOptions.GeographyAreaUnit = AreaMeasurementUnit.SquareKilometer;
    }
}
```

For Entity Framework 6.3, Database.SetInitializer must be called later, like in the following example:

```csharp
public class MyContext : DbContext {

    static MyContext() {
        var config = OracleEntityProviderConfig.Instance;
        config.SpatialOptions.SpatialServiceType = SpatialServiceType.NetTopologySuiteSigned;
        config.SpatialOptions.GeographyDistanceUnit = DistanceMeasurementUnit.Kilometer;
        config.SpatialOptions.GeographyAreaUnit = AreaMeasurementUnit.SquareKilometer;

        Database.SetInitializer<MyContext>(null);
    }
}
```

[Here](https://blog.devart.com/wp-content/uploads/2019/09/DevartNetTopologySuiteDemo.EF6_.NetCore.zip) you can download sample projects working with spatial data using dotConnect for Oracle, PostgreSQL, and MySQL. To test these projects, create the necessary database objects using the DatabaseInitializationScript.sql file from the archive subfolder for the corresponding dotConnect provider, for example, \Devart dotConnect for Oracle Demo\DatabaseInitializationScript.sql. Then replace the connection string in the project with the actual connection string to the corresponding database. For example, for Oracle, the connection string to replace is in the \Devart dotConnect for Oracle Demo\OracleModel.cs file.

Entity Developer

Of course, Entity Framework 6.3 is now completely supported in our popular and powerful ORM designer, [Entity Developer](https://www.devart.com/entitydeveloper/).
It is supported for our providers, [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](https://www.devart.com/dotconnect/mysql/), [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/), and [dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/), as well as for Microsoft SqlClient for SQL Server. And now you can select the Devart Entity Model project item when adding an item to a .NET Core project in Microsoft Visual Studio 2019 ([Visual Studio 2019 version 16.3 Preview](https://visualstudio.microsoft.com/vs/preview/) or higher is recommended). All the Entity Framework 6 templates for C# and VB.NET have been updated to support .NET Core 3.0 and loading a connection string from the appsettings.json file. Here is the list of updated templates:

- DbContext
- EntityObject
- POCO Entity
- Self-Tracking Entity

The following properties were added to these templates:

- Json File Base Path – Specifies the base path for file-based providers when a connection string from the appsettings.json file is used. The special reserved value %CurrentDirectory% means that the current working directory of the application is used as the base path.
- Include Environment Variables – Determines whether the context configuration will be extended with environment variables when a connection string from the appsettings.json file is used.

Visual Studio 2019 Support

Visual Studio 2019 support is greatly improved in our products. Now the installers of all our ADO.NET providers and Entity Developer fully support integration both into the release version of Visual Studio 2019 (ver 16.0/16.1/16.2) and into preview versions (ver 16.3 Preview). You can even integrate our products into both the release and preview instances if both are installed simultaneously. Besides, our products now work with the Optimize rendering for screens with different pixel densities option turned on.
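As a hedged illustration of the appsettings.json connection-string loading mentioned in the template changes above, a minimal sketch of such a file; the connection string name and its value are assumptions for this example only, not output of the templates:

```json
{
  "ConnectionStrings": {
    "MyContext": "Server=orcl;User Id=scott;Password=tiger;"
  }
}
```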
Conclusion

We will continue to extend the functionality of our ADO.NET providers for Oracle, MySQL, PostgreSQL, and SQLite and add features for .NET Core 3 based on user feedback. We are going both to extend legacy feature support to ease the migration of existing full .NET Framework projects and to implement new features to improve support of the Windows, Linux, and macOS platforms. Besides, we plan to further improve Entity Developer, both as a Visual Studio add-in and as a standalone application, to make the development of .NET Core projects more efficient.
"} {"url": "https://blog.devart.com/entity-framework-6-support-for-oracle-mysql-postgresql-sqlite-and-salesforce.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework 6 Support for Oracle, MySQL, PostgreSQL, SQLite, DB2 and Salesforce By [dotConnect Team](https://blog.devart.com/author/dotconnect) January 17, 2013 [24](https://blog.devart.com/entity-framework-6-support-for-oracle-mysql-postgresql-sqlite-and-salesforce.html#comments) 14887

Article was updated on December 5th, 2013.

Entity Framework 6 Alpha 2 support is implemented in Devart ADO.NET providers: [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](https://www.devart.com/dotconnect/mysql/), [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/), [dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/), [dotConnect for DB2](https://www.devart.com/dotconnect/db2/) and [dotConnect for Salesforce](https://www.devart.com/dotconnect/salesforce/). You need to download the corresponding Entity Framework NuGet package to use it in your applications. You can read about Entity Framework 6 features in the corresponding [MSDN articles](https://blogs.msdn.microsoft.com/adonet/2012/12/10/ef6-alpha-2-available-on-nuget/).
- New Assemblies
- Code-First Migrations
- Entity Framework Spatials Support
- Provider Registration
- Config File Registration
- Registration in entityFramework Section
- DbProviderFactory Registration
- Registration Examples
- Registration of Entity Framework Code-First Migrations
- Code-based Registration
- New Registration Classes
- Code-based Registration Examples
- Migrating Projects from Entity Framework v4/v5
- Entity Developer
- Code-First or Entity Data Model Designer
- Obsolete Workarounds
- Using Non-Unicode Strings
- Using Non-LOB Strings
- Using DATE fields instead of TIMESTAMP
- Samples
- Conclusion

New Assemblies

In order to support Entity Framework 6, we have included new assemblies, compiled under .NET Framework 4.0, in the installation packages of our providers. We have added the revision number "6" to the assembly versions to easily distinguish them from the corresponding Entity Framework v4/v5 assemblies. For example, if a Devart.Data.Xxx.Entity.dll assembly for Entity Framework v4/v5 has the version number "7.4.147.0", the corresponding Entity Framework 6 assembly will have the version "7.4.147.6". The Entity Framework 6 support assemblies are installed to the GAC and to the provider installation folder (by default, %Program Files%\Devart\dotConnect\Xxx\Entity\EF6, where Xxx can be 'Oracle', 'MySQL', 'PostgreSQL', 'SQLite', 'DB2', or 'Salesforce').

Code-First Migrations

With the release of Entity Framework 6, there is no longer a need to place the Code-First Migrations functionality in a separate assembly, so there are no more Devart.Data.Xxx.Entity.Migrations.dll assemblies. The Code-First Migrations functionality is integrated into the main Devart.Data.Xxx.Entity.dll Entity Framework assembly.

Entity Framework Spatials Support

Entity Framework Spatials are currently supported only in Devart dotConnect for Oracle. Spatials support was first introduced in Entity Framework 5 on .NET Framework 4.5.
Now, with the release of Entity Framework 6, we support Entity Framework Spatials for .NET Framework 4.0 as well. There is also an additional Devart.Data.Oracle.Entity.SharpMap.dll assembly, compiled for .NET Framework 4.0, to support the SharpMap library. For more details on Spatials support, see the "[Using Entity Framework Spatials with Oracle Spatial and SharpMap](https://blog.devart.com/using-entity-framework-spatials-with-oracle-spatial-and-sharpmap.html)" blog article.

Provider Registration

Entity Framework provider registration has changed in Entity Framework 6. In earlier versions of Entity Framework, it was enough to register a DbProviderFactory in the machine.config or application config file. Now you need to perform some additional actions. You can register an Entity Framework provider in two ways: config file registration or code-based registration.

Config File Registration

Registration in entityFramework Section

After you install the EntityFramework 6 NuGet package, registration lines are added to your application config file automatically.
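The generated XML listing is not reproduced here. Based on the standard behavior of the EntityFramework 6 NuGet package (a sketch, not Devart-specific; the EF assembly version token depends on the installed package), the added section typically looks like this:

```xml
<!-- Added automatically by the EntityFramework NuGet package -->
<configSections>
  <section name="entityFramework"
           type="System.Data.Entity.Internal.ConfigFile.EntityFrameworkSection, EntityFramework, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
           requirePermission="false" />
</configSections>
<entityFramework>
  <defaultConnectionFactory type="System.Data.Entity.Infrastructure.SqlConnectionFactory, EntityFramework" />
</entityFramework>
```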
You need to remove the defaultConnectionFactory registration and add the Entity Framework provider registration by registering it in the entityFramework section. Here is an example registering Devart dotConnect for Oracle.
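The original XML listing is not reproduced here. A sketch of such a registration, consistent with the '7.4.147.6' version and the provider-services class names mentioned elsewhere in this article (the invariant name and PublicKeyToken are assumptions inferred from other listings on this page), might look like:

```xml
<entityFramework>
  <providers>
    <!-- EF6 provider services for dotConnect for Oracle -->
    <provider invariantName="Devart.Data.Oracle"
              type="Devart.Data.Oracle.Entity.OracleEntityProviderServices, Devart.Data.Oracle.Entity, Version=7.4.147.6, Culture=neutral, PublicKeyToken=09af7300eec23701" />
  </providers>
</entityFramework>
```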
Note: don't forget to replace '7.4.147.6' with the actual Entity Framework provider assembly version.

DbProviderFactory Registration

When our provider is installed, it automatically places a provider registration record in the DbProviderFactories section of the global machine.config file, so there is no need to duplicate it in the local application config file while the application is developed and tested locally. However, to deploy your application, you need to register the Entity Framework provider in the DbProviderFactories section. Here is an example registering Devart dotConnect for Oracle.
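The original XML listing is not reproduced here. A sketch of a DbProviderFactories registration for dotConnect for Oracle, using the '7.4.147.0' version referenced below (the factory class name and PublicKeyToken are assumptions inferred from other listings on this page):

```xml
<system.data>
  <DbProviderFactories>
    <!-- remove any machine.config entry first to avoid duplicates on deployment -->
    <remove invariant="Devart.Data.Oracle" />
    <add name="dotConnect for Oracle"
         invariant="Devart.Data.Oracle"
         description="Devart dotConnect for Oracle"
         type="Devart.Data.Oracle.OracleProviderFactory, Devart.Data.Oracle, Version=7.4.147.0, Culture=neutral, PublicKeyToken=09af7300eec23701" />
  </DbProviderFactories>
</system.data>
```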
Note: don't forget to replace '7.4.147.0' with the actual provider assembly version.

Registration Examples

The registration is similar for the other dotConnect providers. Here is an example registering all our Entity Framework providers – for Oracle, MySQL, PostgreSQL, SQLite, DB2 and Salesforce.
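The full combined listing is not reproduced here. A sketch of the pattern (version numbers are placeholders; each provider follows the assembly/class naming from the table in the New Registration Classes section below, with the provider-services class in place of the ...Configuration class):

```xml
<entityFramework>
  <providers>
    <provider invariantName="Devart.Data.Oracle"
              type="Devart.Data.Oracle.Entity.OracleEntityProviderServices, Devart.Data.Oracle.Entity, Version=7.4.147.6, Culture=neutral, PublicKeyToken=09af7300eec23701" />
    <provider invariantName="Devart.Data.MySql"
              type="Devart.Data.MySql.Entity.MySqlEntityProviderServices, Devart.Data.MySql.Entity, Version=8.0.0.6, Culture=neutral, PublicKeyToken=09af7300eec23701" />
    <!-- analogous <provider> entries for Devart.Data.PostgreSql, Devart.Data.SQLite,
         Devart.Data.DB2, and Devart.Data.Salesforce, each pointing at the matching
         Devart.Data.Xxx.Entity assembly -->
  </providers>
</entityFramework>
```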
Registration of Entity Framework Code-First Migrations

To register a migration SQL generator for Code-First Migrations, use the SetSqlGenerator method of the DbMigrationsConfiguration descendant for Entity Framework 6.

```csharp
internal sealed class Configuration : DbMigrationsConfiguration<MyContext> {
    public Configuration() {
        AutomaticMigrationsEnabled = false;
        this.SetSqlGenerator(OracleConnectionInfo.InvariantName, new OracleEntityMigrationSqlGenerator());
    }
    // ...
}
```

Code-based Registration

An Entity Framework provider can be registered with the special DbConfigurationType attribute or with the static SetConfiguration method of the new DbConfiguration class. You need to pass a specially configured DbConfiguration descendant to this method.

New Registration Classes

To ease code-based registration, we have included new DbConfiguration descendants in our Entity Framework providers:

| Provider | Assembly | Class Name |
| --- | --- | --- |
| Devart dotConnect for Oracle | Devart.Data.Oracle.Entity.dll | Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration |
| Devart dotConnect for MySQL | Devart.Data.MySql.Entity.dll | Devart.Data.MySql.Entity.MySqlEntityProviderServicesConfiguration |
| Devart dotConnect for PostgreSQL | Devart.Data.PostgreSql.Entity.dll | Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServicesConfiguration |
| Devart dotConnect for SQLite | Devart.Data.SQLite.Entity.dll | Devart.Data.SQLite.Entity.SQLiteEntityProviderServicesConfiguration |
| Devart dotConnect for DB2 | Devart.Data.DB2.Entity.dll | Devart.Data.DB2.Entity.DB2EntityProviderServicesConfiguration |
| Devart dotConnect for Salesforce | Devart.Data.Salesforce.Entity.dll | Devart.Data.Salesforce.Entity.SalesforceEntityProviderServicesConfiguration |

Code-based Registration Examples

dotConnect for Oracle Entity Framework provider registration with the DbConfigurationType attribute (can be used for DbContext only):
```csharp
[DbConfigurationType(typeof(Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration))]
public class MyContext : DbContext {

    // ...
}
```

dotConnect for Oracle Entity Framework provider registration with the static SetConfiguration method of the new DbConfiguration class (can be used for both DbContext and ObjectContext):

```csharp
public class MyContext : ObjectContext {

    static MyContext() {
        DbConfiguration.SetConfiguration(new Devart.Data.Oracle.Entity.OracleEntityProviderServicesConfiguration());
    }

    // ...
}
```

If you have installed our Entity Framework provider, machine.config already contains the provider registration record in the DbProviderFactories section, so you don't need to duplicate it in the local application config file while the application is developed and tested locally. You can use only code-based registration at development time. However, when you deploy your project, you need to register the Entity Framework provider in the DbProviderFactories section of the application config file, as described above in the DbProviderFactory Registration section.

Migrating Projects from Entity Framework v4/v5

To migrate your project from Entity Framework v4/v5 to Entity Framework v6, perform the following steps:

- Install EntityFramework 6.
For example, you can do it with the following command in the Package Manager Console: `Install-Package EntityFramework -Pre`
- Remove references to the old assemblies, specific to Entity Framework v4/v5, from the project: System.Data.Entity.dll, Devart.Data.Xxx.Entity.dll, Devart.Data.Xxx.Entity.Migrations.dll (for Oracle, MySQL, PostgreSQL, or SQLite), Devart.Data.Oracle.Entity.SharpMap.dll (for Oracle).
- Add references to the new, Entity Framework 6 compatible assemblies to the project: Devart.Data.Xxx.Entity.dll, Devart.Data.Oracle.Entity.SharpMap.dll (for Oracle, if you use Oracle Spatials and SharpMap).
- Register the Entity Framework provider in one of the ways described above.

Entity Developer

For Entity Developer users, migration is very simple – just change the Entity Framework Version option to "Entity Framework 6" in Model Settings. After this, close the model and open it in Entity Developer again. Entity Developer will generate valid code, compatible with Entity Framework 6. Currently, if you use standalone Entity Developer (not the one integrated into Visual Studio), you need to manually add the EntityFramework.dll assembly to the GAC of .NET Framework 4.0 (4.5) or place this assembly in the directory where Entity Developer is installed. This is not strictly required; however, some Entity Developer features may not work correctly for Entity Framework 6 models without it. We plan to overcome this limitation in future releases.

Code-First or Entity Data Model Designer

See [Microsoft recommendations](https://archive.codeplex.com/?p=entityframework) on migration to Entity Framework 6.

Obsolete Workarounds

Entity Framework 6 introduces custom conventions, including [lightweight conventions](https://archive.codeplex.com/?p=entityframework).
This makes the following Devart dotConnect for Oracle properties obsolete and not recommended:

- config.CodeFirstOptions.UseNonUnicodeStrings
- config.CodeFirstOptions.UseNonLobStrings
- config.CodeFirstOptions.UseDateTimeAsDate

These settings were workarounds for configuring property mapping when the explicit mapping definition is incomplete. Here are examples of how to replace these options in the way recommended in Entity Framework 6.

Using Non-Unicode Strings

A simple lightweight convention can be used instead of the config.CodeFirstOptions.UseNonUnicodeStrings property.

```csharp
public class MyContext : DbContext {

    protected override void OnModelCreating(DbModelBuilder modelBuilder) {

        modelBuilder
            .Properties()
            .Where(p => p.PropertyType == typeof(string))
            .Configure(p => p.IsUnicode(false));
    }

    // ...
}
```

This convention makes all string properties non-Unicode.

Using Non-LOB Strings

Instead of the config.CodeFirstOptions.UseNonLobStrings property, you can use the following lightweight convention.

```csharp
public class MyContext : DbContext {

    protected override void OnModelCreating(DbModelBuilder modelBuilder) {

        modelBuilder
            .Properties()
            .Where(p => p.PropertyType == typeof(string) &&
                        p.GetCustomAttributes(typeof(MaxLengthAttribute), false).Length == 0)
            .Configure(p => p.HasMaxLength(2000));
    }

    // ...
}
```

This convention sets MaxLength to 2000 characters for all string properties that don't have MaxLength set explicitly with the MaxLengthAttribute attribute.

Using DATE fields instead of TIMESTAMP

You can use the following simple lightweight convention instead of the config.CodeFirstOptions.UseDateTimeAsDate property.
```csharp
public class MyContext : DbContext {

    protected override void OnModelCreating(DbModelBuilder modelBuilder) {

        modelBuilder
            .Properties()
            .Where(p => p.PropertyType == typeof(DateTime))
            .Configure(p => p.HasPrecision(0));
    }

    // ...
}
```

This convention sets zero precision for all DateTime properties. This makes Oracle use the DATE type instead of TIMESTAMP(7), because DATE does not store fractions of seconds.

Samples

Here you can download the archive with the Entity Framework 6 version of the Code-First samples: [CrmDemo.EF6CodeFirst_.zip](https://blog.devart.com/wp-content/uploads/2013/01/CrmDemo.EF6CodeFirst_.zip). In this archive you can find the full version of the above examples that use EF Code-First for each ADO.NET provider:

- Devart dotConnect for Oracle
- Devart dotConnect for MySQL
- Devart dotConnect for PostgreSQL
- Devart dotConnect for SQLite
- Devart dotConnect for DB2

as well as for the standard Microsoft .NET Framework Data Provider for SQL Server (SqlClient).

Conclusion

We are glad to provide the new Entity Framework provider functionality – support for Entity Framework 6 – to our users. We hope that you will not have any difficulties migrating to Entity Framework 6; in any case, we are glad to help our users if they have any trouble with migration. As for our future plans for Entity Framework provider development, they are determined largely by your feedback and suggestions via the product feedback pages, [forum](https://forums.devart.com/viewforum.php?f=30) and [UserVoice](https://devart.uservoice.com/forums/105163-ado-net-entity-framework-support).

This article was updated on December 5th, 2013. Information on working with Entity Framework 6 and dotConnect for DB2 was added.

Update from January 4th, 2017: please note that the Entity Framework related assemblies in dotConnect data providers were renamed, and their versioning was changed. See more details [here](https://www.devart.com/news/2016/dotconnects-ef7.html#assembly_name_change).
Tags [code-first migrations](https://blog.devart.com/tag/code-first-migrations) [entity framework](https://blog.devart.com/tag/entity-framework) [dotConnect Team](https://blog.devart.com/author/dotconnect)
24 COMMENTS

Free Naught America clips, March 30, 2013 at 10:18 am
I'll immediately snatch your rss feed as I can't find your e-mail subscription link or newsletter service. Do you have any? Kindly let me know so that I can subscribe. Thanks.

Marina Nastenko, April 24, 2013 at 6:50 am
An option of subscribing via e-mail has been added; at the top of the page on the right, you can find an e-mail subscription box.

Steven Engesl, September 18, 2013 at 11:14 am
I'm migrating an asp.net mvc application using Devart for MySQL from EF5 to EF6 RC1. I followed this post to do that and I got an error. When I try to register the provider in the config file, it complains that it is an unknown tag. The only way I can get it to work is to remove the tag from the config file and keep SetSqlGenerator() in the constructor of the Configuration class of EF Migrations. Am I doing something wrong here? Regards, Steven.

Shalex, November 1, 2013 at 2:18 pm
Registration of migrationSqlGenerator via *.config was available in the pre-release versions of EF6, but EF6 RTM doesn't support this. As a workaround, please specify the migration SQL generator with the SetSqlGenerator method of the DbMigrationsConfiguration descendant for Entity Framework 6.

Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite, December 2, 2013 at 3:41 pm
[…] You can find the updated version of the samples for Entity Framework 6 in this blog article: Entity Framework 6 Support for Oracle, MySQL, PostgreSQL, SQLite and Salesforce.
[…]

Ibrahim Shaib, January 22, 2014 at 7:10 pm
Hello, we are a team of three .NET developers who have developed several applications using Entity Framework. We are interested in buying your product for Entity Framework 6 Oracle support. The question we have is: for each of the five editions (Developer, Professional, Standard, Mobile, and Express), how many systems (development machine/server) can we install it on? Your response will help our decision making. Thank you. Ibrahim Shaib.

Seva Zaslavsky, January 29, 2014 at 10:14 pm
If you see the error "Your project references the latest version of Entity Framework; however, an Entity Framework database provider compatible with this version could not be found for your data connection. …" while trying to generate an .edmx from a database using EF 6, make sure that you've built the project that contains your .edmx *before* trying to generate the .edmx in the current configuration (Release|Any CPU, Debug|Any CPU, etc.).

Seva Zaslavsky, January 30, 2014 at 5:14 pm
Additional information on code-based EF provider registration appears here: [http://go.microsoft.com/fwlink/?LinkId=260883](http://go.microsoft.com/fwlink/?LinkId=260883)

Ron Sheppard, May 13, 2014 at 4:58 pm
Does the Express edition support Entity Framework 6 out of the box? I can't seem to find the Devart.Data.Oracle.Entity assembly anywhere.

Carlos, June 22, 2014 at 4:48 pm
Hi, I'm trying to use Code-First and I get an error; the code is enclosed. Thank you.
```csharp
public class MyBdContext : DbContext {
    public MyBdContext() : base("DemoEF6") { }
    public MyBdContext(DbConnection connection) : base(connection, true) { }
    public DbSet<Persona> Personas { get; set; }
}

[Table("persona", Schema = "public")]
public class Persona {
    [Key]
    [Column("idpersona")]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int IdPersona { get; set; }
    [Column("apellidos")]
    public string Apellidos { get; set; }
    [Column("nombres")]
    public string Nombres { get; set; }
}
```

Carlos, June 22, 2014 at 4:49 pm
An exception of type 'System.InvalidOperationException' occurred in mscorlib.dll but was not handled in user code. Additional information: The 'Instance' member of the Entity Framework provider type 'Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServices, Devart.Data.PostgreSql.Entity, Version=7.3.135.0, Culture=neutral, PublicKeyToken=09af7300eec23701' did not return an object that inherits from 'System.Data.Entity.Core.Common.DbProviderServices'. Entity Framework providers must inherit from this class and the 'Instance' member must return the singleton instance of the provider. This may be because the provider does not support Entity Framework 6 or later; see [http://go.microsoft.com/fwlink/?LinkId=260882](http://go.microsoft.com/fwlink/?LinkId=260882) for more information.

David Penn, August 22, 2014 at 8:53 pm
This simply does not work in EF6. I followed your steps at [https://www.devart.com/dotconnect/oracle/articles/tutorial_ef.html](https://www.devart.com/dotconnect/oracle/articles/tutorial_ef.html). After Step 6, you get a greyed-out option for EF6. I have put the provider in the app.config as specified. I then even put the DbProviderFactories section in, just in case. It's pretty frustrating, downloading a demo, following what should be simple steps, and not even being able to get it to work.
Daniel Lucazeau, April 4, 2015 at 12:53 pm
In the PostgreSQL example, MyDbContextSeeder is never called. How can we buy a product with a non-running sample?

dotConnect Team, June 10, 2015 at 12:36 pm
To Ibrahim Shaib: the dotConnect product is licensed per developer (not per workstation or server). Every licensed developer is entitled to install dotConnect for Oracle on any number of workstations, provided the product is used only by him for the sole purposes of development, testing, and deployment.

To Seva Zaslavsky: thank you for your note. We will add this information to our EF tutorial. JIC: we recommend using Entity Developer (Devart Entity Model, *.edml) instead of EDM Designer (ADO.NET Entity Data Model, *.edmx) because it is adjusted for working with Oracle and has advanced functionality: [https://www.devart.com/entitydeveloper/ed-vs-edm.html](https://www.devart.com/entitydeveloper/ed-vs-edm.html). Additionally, Entity Developer adds the registration of the EF6 provider in app.config automatically.

To Ron Sheppard: Entity Framework support is available only in the Professional and Developer editions of dotConnect for Oracle: [https://www.devart.com/dotconnect/oracle/editions.html](https://www.devart.com/dotconnect/oracle/editions.html).

To Carlos: it looks like you did not update the provider version in \CrmDemo.EFCodeFirst.PostgreSql\app.config. You can check your current version via the Tools > PostgreSQL > About menu. Please note that the revision number of the provider in the entityFramework section is *.6 (7.3.135.6), but it should be *.0 (7.3.135.0) in DbProviderFactories.

To David Penn: please refer to Seva Zaslavsky's note. You should rebuild your project after adding the *.config entries and before running the EDM Wizard.
JIC: we recommend using Entity Developer (Devart Entity Model, *.edml) instead of EDM Designer (ADO.NET Entity Data Model, *.edmx) because it is adjusted for working with Oracle and has advanced functionality: [https://www.devart.com/entitydeveloper/ed-vs-edm.html](https://www.devart.com/entitydeveloper/ed-vs-edm.html). Additionally, Entity Developer adds the registration of the EF6 provider in app.config automatically.

dotConnect Team, June 10, 2015 at 12:37 pm
To Daniel Lucazeau: please compare the original \CrmDemo.EFCodeFirst\CrmDemo.EFCodeFirst.PostgreSql\MyPgSqlContext.cs file with the one in your modified project. You have changed System.Data.Entity.Database.SetInitializer(new MyDbContextDropCreateDatabaseAlways()); to Database.SetInitializer(new MyDbContextDropCreateDatabaseAlways()); That is exactly the reason for the issue you have encountered.

Jose Carlos, July 31, 2015 at 12:53 am
How do I do the code-based registration when it comes to Postgres? There is only one example, with Oracle, and I couldn't find a way to replicate it with dotConnect for Postgres. I tried this: [DbConfigurationType(typeof(Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServices))] public class MyContext : DbContext { // … } and then (this one does not compile): public class MyContext : ObjectContext { static MyContext() { DbConfiguration.SetConfiguration(new Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServices()); } // … } How do I do that? I really need it…

dotConnect Team, July 31, 2015 at 1:32 pm
To Jose Carlos: please use [DbConfigurationType(typeof(Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServicesConfiguration))] instead of [DbConfigurationType(typeof(Devart.Data.PostgreSql.Entity.PgSqlEntityProviderServices))]. If this doesn't help, specify the exact text of the error you have encountered. A small test project will be appreciated.
Armando, January 28, 2016 at 1:32 pm
Hi, I installed the latest version of dotConnect for SQLite, downloaded your demo code, updated EF to 6.1.3, and updated the version number in the app.config as below. I got this error: "Additional information: The 'Instance' member of the Entity Framework provider type 'Devart.Data.SQLite.Entity.SQLiteEntityProviderServices, Devart.Data.SQLite.Entity, Version=5.3.563.0, Culture=neutral, PublicKeyToken=09af7300eec23701' did not return an object that inherits from 'System.Data.Entity.Core.Common.DbProviderServices'. Entity Framework providers must inherit from this class and the 'Instance' member must return the singleton instance of the provider. This may be because the provider does not support Entity Framework 6 or later; see [http://go.microsoft.com/fwlink/?LinkId=260882](http://go.microsoft.com/fwlink/?LinkId=260882) for more information." Please help, I want to buy your libraries to create a database-agnostic application that works with SQL Server, SQLite and Oracle.

dotConnect Team, February 17, 2016 at 5:34 pm
To Armando: please check your app.config; it should contain the registration entries for dotConnect for SQLite.
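Only a fragment of the XML from this answer survived. Following the registration pattern described in the article and the version numbers cited in the reply (a sketch; the type names mirror those in the error message above, and the factory class name is an assumption), the intended entries are presumably:

```xml
<entityFramework>
  <providers>
    <provider invariantName="Devart.Data.SQLite"
              type="Devart.Data.SQLite.Entity.SQLiteEntityProviderServices, Devart.Data.SQLite.Entity, Version=5.3.563.6, Culture=neutral, PublicKeyToken=09af7300eec23701" />
  </providers>
</entityFramework>
<system.data>
  <DbProviderFactories>
    <add name="dotConnect for SQLite"
         invariant="Devart.Data.SQLite"
         description="Devart dotConnect for SQLite"
         type="Devart.Data.SQLite.SQLiteProviderFactory, Devart.Data.SQLite, Version=5.3.563.0, Culture=neutral, PublicKeyToken=09af7300eec23701" />
  </DbProviderFactories>
</system.data>
```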
Replace 5.3.563 here with your current version of dotConnect for SQLite. Please note that the revision number of the provider in the entityFramework section is 6 (5.3.563.6), but it should be 0 (5.3.563.0) in DbProviderFactories.

Rudi, February 23, 2016 at 4:02 pm
Hi Devart team, our software team may integrate your SQLite provider in our software, so I started to investigate your product. I downloaded the latest provider for SQLite (in my case Devart.Data.SQLite.Entity Version=5.3.592.0) and tried to execute your CrmDemo.EFCodeFirst example using EF6 (6.0.1). When I execute the demo, I experience the same problem as Armando does: The 'Instance' member of the Entity Framework provider type 'Devart.Data.SQLite.Entity.SQLiteEntityProviderServices, Devart.Data.SQLite.Entity, Version=5.3.592.0, Culture=neutral, PublicKeyToken=09af7300eec23701' did not return an object that inherits from 'System.Data.Entity.Core.Common.DbProviderServices'. Entity Framework providers must inherit from this class and the 'Instance' member must return the singleton instance of the provider. This may be because the provider does not support Entity Framework 6 or later; see [http://go.microsoft.com/fwlink/?LinkId=260882](http://go.microsoft.com/fwlink/?LinkId=260882) for more information. Do you have any solution or advice on how I can solve my problem? Kind regards, Rudi.

dotConnect Team, February 23, 2016 at 5:07 pm
To Rudi: we are sorry, because of some technical troubles some recent comments were not displayed in our blog. Please see our answer to Armando above your comment.

Hassan, April 15, 2016 at 4:28 pm
Hi Devart team, I downloaded "dotConnect for PostgreSQL 7.4 Professional Trial" from your website. I get the following error when I try to add an ADO.NET Entity Data Model to my project (Console Application, in VS 2013).
"Your project references the latest version of Entity Framework; however, an Entity Framework database provider compatible with this version could not be found for your data connection. …" I am using Entity Framework 6, installed through the Package Manager Console with the command "Install-Package EntityFramework -Version 6.0.0", and installed "dotConnect Express for PostgreSQL" from NuGet packages. I also rebuilt before setting the app.config file. My app.config is: Please help me.

Shalex, December 25, 2018 at 3:35 pm
1. We recommend using Entity Developer (the Devart Entity Model item, *.edml) instead of EDM Designer (the ADO.NET Entity Data Model item, *.edmx) because it is adjusted for working with dotConnect providers and has advanced functionality: [https://www.devart.com/entitydeveloper/ed-vs-edm.html](https://www.devart.com/entitydeveloper/ed-vs-edm.html). Additionally, Entity Developer adds the registration of the EF6 provider in app.config automatically and offers a designer (Database First / Model First) for EF Core.
2. In the case of using EDM Designer (the ADO.NET Entity Data Model item, *.edmx), the *.config entry should be: Replace 7.11.1278 here with your current version of dotConnect for PostgreSQL.

Shalex, December 25, 2018 at 3:44 pm
Sorry, the XML entry for *.config is not displayed in my previous answer. Please contact us via [https://www.devart.com/company/contactform.html](https://www.devart.com/company/contactform.html) for assistance.
Comments are closed."} {"url": "https://blog.devart.com/entity-framework-and-ideablade-notice.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) [How To](https://blog.devart.com/category/how-to) Entity Framework and IdeaBlade By [dotConnect Team](https://blog.devart.com/author/dotconnect) March 26, 2010 [3](https://blog.devart.com/entity-framework-and-ideablade-notice.html#comments) 3476

Our active user, Simon Kingaby, has described in his own blog the experience of developing web-based enterprise applications with the help of the following technologies: [Devart dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/), [Entity Developer](http://www.devart.com/entitydeveloper/), Oracle 11 Database, Entity Framework, IdeaBlade DevForce, Microsoft Silverlight 3, Prism 2, and Unity. You can view these articles via the following links:

- [IdeaBlade DevForce – Model Setup Walk-through – Background](http://onemanwenttomow.wordpress.com/2010/02/25/ideablade-devforce-model-setup-walk-through-background/)
- [IdeaBlade DevForce – Model Setup Walk-through – Step 1: The Entity Framework Project](http://onemanwenttomow.wordpress.com/2010/03/02/)

Because of the increasing popularity of IdeaBlade among our users, we plan to consider ways to improve its support. First, we will implement complete support for custom attributes. Currently, serialization/deserialization of the conceptual model elements is provided only for some of them. Serialization of the conceptual model elements will be supported completely, and [Entity Developer](http://www.devart.com/entitydeveloper/) will show and edit them in the next versions. These changes will be useful for users of third-party tools which save additional information to the XML model as custom attributes.
It will also be useful for users who want to customize code generation – they can use attribute values in the code generation templates. [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. 3 COMMENTS Ward Bell — March 29, 2010
at 4:31 pm I am the VP of Technology at IdeaBlade. We are grateful to the community and to Devart for their interest and support. We are working with Devart to make the modeling experience as seamless and productive as possible. TSCOTT — August 26, 2011 at 12:03 pm I am very interested in using both the Devart SQLite connector and IdeaBlade. Any update on the compatibility of these two products? [Shalex](http://www.devart.com) — September 2, 2011 at 4:14 am We plan to investigate and improve the compatibility of dotConnect products with IdeaBlade, as we mentioned in the article, but there is no timeframe at the moment. Comments are closed."} {"url": "https://blog.devart.com/entity-framework-canonical-functions.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework Canonical Functions By [dotConnect Team](https://blog.devart.com/author/dotconnect) February 10, 2010 [1](https://blog.devart.com/entity-framework-canonical-functions.html#comments) 5718 Article updated on 3/4/2010. This article can be useful for programmers who want to develop cross-database applications using canonical functions. [Entity Framework Canonical Functions](http://msdn.microsoft.com/en-us/library/bb738626(VS.100).aspx) are a set of functions supported by all Entity Framework providers; they are translated to the corresponding data source functionality of each provider. The tables below contain information about these functions as supported by the Devart products.
The tables below cover the following DBMS: [MySQL](http://www.devart.com/dotconnect/mysql/) , [PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , [Oracle](http://www.devart.com/dotconnect/oracle/) , [SQLite](http://www.devart.com/dotconnect/sqlite/) , SQL Server 2005, and SQL Server 2008. Please note that the “EF version” column contains the number of the Entity Framework version in which the function is defined.

Aggregate Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| Avg | v1 |
| BigCount * | v1 |
| Count | v1 |
| Max | v1 |
| Min | v1 |
| StDev | v1 |
| StDevP | v4 |
| Sum | v1 |
| Var | v4 |
| VarP | v4 |

\* Only SQL Server has the aggregate function COUNT_BIG(expr). This function is compiled to the ordinary COUNT(expr) in other databases. You can read more about Aggregate Canonical functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738514(VS.100).aspx) .

Math Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| Abs | v1 |
| Ceiling | v1 |
| Floor | v1 |
| Power | v4 |
| Round(value) | v1 |
| Round(value, digits) | v4 |
| Truncate | v4 |

You can read more about Math Canonical functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738542(VS.100).aspx) .

String Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| Concat | v1 |
| Contains | v4 |
| EndsWith | v4 |
| IndexOf | v1 |
| Left | v1 |
| Length | v1 |
| LTrim | v1 |
| Replace | v1 |
| Reverse * | v1 |
| Right | v1 |
| RTrim | v1 |
| Substring | v1 |
| StartsWith | v4 |
| ToLower | v1 |
| ToUpper | v1 |
| Trim | v1 |

\* The Reverse function is supported with PostgreSQL 9.1 and higher. You can read more about String Canonical functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738534(VS.100).aspx) .
Date and Time Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| AddNanoseconds * | v4 |
| AddMicroseconds | v4 |
| AddMilliseconds | v4 |
| AddSeconds | v4 |
| AddMinutes | v4 |
| AddHours | v4 |
| AddDays | v4 |
| AddMonths | v4 |
| AddYears | v4 |
| CreateDateTime | v4 |
| CreateDateTimeOffset | v4 |
| CreateTime | v4 |
| CurrentDateTime | v1 |
| CurrentDateTimeOffset | v4 |
| CurrentUtcDateTime | v1 |
| Day | v1 |
| DayOfYear | v4 |
| DiffNanoseconds | v4 |
| DiffMilliseconds | v4 |
| DiffMicroseconds | v4 |
| DiffSeconds | v4 |
| DiffMinutes | v4 |
| DiffHours | v4 |
| DiffDays | v4 |
| DiffMonths | v4 |
| DiffYears | v4 |
| GetTotalOffsetMinutes | v1 |
| Hour | v1 |
| Millisecond | v1 |
| Minute | v1 |
| Month | v1 |
| Second | v1 |
| TruncateTime | v4 |
| Year | v1 |

\* Some DBMS can store nanoseconds in dates, but the .NET type [DateTime](http://msdn.microsoft.com/en-us/library/system.datetime(VS.100).aspx) doesn’t allow nanosecond storage; it can recognize only 100-nanosecond intervals. In the dotConnect for SQLite implementation, AddNanoseconds does not have sufficient accuracy to store the count of individual nanoseconds and can only store an integer count of 100-nanosecond intervals. Thus, if AddNanoseconds(224) is called, only 200 nanoseconds are added to the value in the database. You can read more about Date and Time Canonical functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738563(VS.100).aspx) .

Bitwise Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| BitWiseAnd | v1 |
| BitWiseNot | v1 |
| BitWiseOr | v1 |
| BitWiseXor | v1 |

You can read more about Bitwise Canonical functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738567(VS.100).aspx) .

Other Canonical Functions

| Canonical Function Name | EF version |
| --- | --- |
| NewGuid * | v1 |

You can read more about this group of functions in [MSDN](http://msdn.microsoft.com/en-us/library/bb738544(VS.100).aspx) .

\* For PostgreSQL, several methods for generating GUID are supported.
With dotConnect for PostgreSQL, you can select the method to use with the config.QueryOptions.NewGuidGenerationMethod property (see the [Query Options](http://www.devart.com/dotconnect/postgresql/docs/?QueryOptions.html) topic of the dotConnect for PostgreSQL documentation). In conclusion, please note that the functionality of different DBMS varies. For that reason, only some of the functions can be implemented through standard routines and SQL statements. Sometimes the same function can return different values, since its accuracy differs between DBMS. This is especially true for mathematical and aggregate functions. This article was updated on 3/4/2010. All of the tables in the article were updated, because the support for canonical functions in the latest version of dotConnect for SQLite had been extended to include four statistical functions (StDev, StDevP, Var, VarP) and forty-one scalar functions.
1 COMMENT Canonical Purpose “EntityFunctions.TruncateTime” doesn’t occur in MYSQL | CodersDiscuss.com — June 3, 2014 at 10:24 am […] TruncateTime is backed by MySQL.
[…] Comments are closed."} {"url": "https://blog.devart.com/entity-framework-code-first-migrations-support-for-oracle-mysql-postgresql-and-sqlite.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Code-First Migrations support for Oracle, MySQL, PostgreSQL and SQLite By [dotConnect Team](https://blog.devart.com/author/dotconnect) February 13, 2012 [1](https://blog.devart.com/entity-framework-code-first-migrations-support-for-oracle-mysql-postgresql-and-sqlite.html#comments) 10837 The Devart ADO.NET Entity Framework providers – [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) – implement support for Entity Framework 4.3 Beta 1 (Entity Framework Code-First Migrations). To use it in your applications, you need to download and install the corresponding Entity Framework NuGet package first. You can read about the Entity Framework Code-First Migrations functionality in the corresponding [MSDN](http://blogs.msdn.com/b/adonet/archive/2012/02/09/ef-4-3-released.aspx) articles.
- Code-First Migrations Features
- Database Initialization Strategies
- EdmMetadata and __MigrationHistory Tables
- dbo
- ColumnTypeCasingConvention
- Configuration and Deployment of Entity Framework Provider
- New Assembly
- Registration of Code-First Migrations SQL Generator
- EntityFramework.dll Assembly Versions
- Migration Operation Support
- Additional Database-specific Customization of Migrations
- Creating Tables
- Creating Columns
- Creating Indexes
- Modifying Columns
- Dropping Tables
- Ideas for Future

Code-First Migrations Features

Entity Framework Code-First Migrations continues the development of the Entity Framework Code-First functionality from Entity Framework 4.1 and 4.2, extending dynamic database creation and deletion with the possibility of dynamic database schema modification (adding new columns/tables/foreign keys, creating and modifying existing database objects). Devart Entity Framework providers have supported basic Entity Framework Code-First functionality for a long time. This support was described in the following blog articles: Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite – a general article on the Code-First functionality with an example of creating an application; and Entity Developer – EF Code First DbContext Template – an article that describes how to ease application development with [Entity Developer](http://www.devart.com/entitydeveloper/) – a powerful visual Entity Framework model designer included in our providers. The information in these articles applies to Entity Framework Code-First Migrations development as well; just some new features were added. We won’t duplicate the information from those articles here, and we recommend taking a look at them if you have not read them earlier.
Database Initialization Strategies

When using the Code-First functionality of Entity Framework 4.1 and 4.2, only three database initialization strategies are available: DropCreateDatabaseAlways, DropCreateDatabaseIfModelChanges, and CreateDatabaseIfNotExists. They allow deleting and re-creating the entire database schema. Entity Framework 4.3 adds the MigrateDatabaseToLatestVersion migrations database initializer, which updates the database schema to the latest Entity Framework model version automatically.

EdmMetadata and __MigrationHistory Tables

Entity Framework 4.3 does not use the “EdmMetadata” table, so, unlike Entity Framework 4.1 or 4.2, you don’t need to remove the IncludeMetadataConvention convention explicitly if you don’t want to use this table. However, there is a new table, “__MigrationHistory”, in Entity Framework 4.3, and you cannot enable or disable its presence in the database with a convention.

dbo

In Entity Framework 4.1 and 4.2, the problem of automatic mapping of database objects to the dbo schema could be solved easily. If you needed the EdmMetadata table, IgnoreSchemaName had to be used; otherwise, you could remove IncludeMetadataConvention and specify the schema name for each class explicitly when mapping classes. In Entity Framework 4.3 there is the __MigrationHistory table, which cannot be excluded, and this is why an attempt to create the “dbo”.”__MigrationHistory” table occurs. So there is only one solution for Entity Framework 4.3 – always use the IgnoreSchemaName mode (set it in a constructor or static constructor of your DbContext descendant or in the OnModelCreating method):

```csharp
var config = Devart.Data.Oracle.Entity.Configuration.OracleEntityProviderConfig.Instance;
config.Workarounds.IgnoreSchemaName = true;
```

ColumnTypeCasingConvention

This case is specific to dotConnect for Oracle; users of the dotConnect providers for MySQL, PostgreSQL and SQLite may skip it.
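The setting above can be applied once for the whole application from a static constructor. A minimal sketch, assuming dotConnect for Oracle and hypothetical MyContext/Blog class names:

```csharp
using System.Data.Entity;
using Devart.Data.Oracle.Entity.Configuration;

// Blog and MyContext are hypothetical names used only for illustration.
public class Blog
{
    public int BlogId { get; set; }
    public string Name { get; set; }
}

public class MyContext : DbContext
{
    static MyContext()
    {
        // Runs once, before any context instance is created,
        // so the workaround applies to every MyContext in the process.
        OracleEntityProviderConfig.Instance.Workarounds.IgnoreSchemaName = true;
    }

    public DbSet<Blog> Blogs { get; set; }
}
```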
You should explicitly remove ColumnTypeCasingConvention for a DbContext in its overridden OnModelCreating method to use dotConnect for Oracle with Code-First (Entity Framework 4.1 and 4.2):

```csharp
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Conventions.Remove<ColumnTypeCasingConvention>();

    // ...
}
```

This is not enough for Entity Framework 4.3 Beta 1, because an internal DbContext is created in the Code-First Migrations engine, and this DbContext is not aware of the removed conventions of the user’s one. That’s why we implemented a new Entity Framework provider configuration parameter – ColumnTypeCasingConventionCompatibility. This parameter allows you to avoid explicitly turning off the ColumnTypeCasingConvention for each DbContext when working with Devart dotConnect for Oracle. Just set it to true once in a constructor or static constructor of your DbContext descendant or in the OnModelCreating method:

```csharp
var config = Devart.Data.Oracle.Entity.Configuration.OracleEntityProviderConfig.Instance;
config.Workarounds.ColumnTypeCasingConventionCompatibility = true;
```

Entity Framework 4.1 and 4.2 Code-First users can also use this option instead of removing the convention explicitly.

Configuration and Deployment of Entity Framework Provider

New Assembly

A new assembly was added to our providers; you need to add a reference to this assembly to projects using Code-First Migrations:

| Provider | Assembly |
| --- | --- |
| Devart dotConnect for Oracle | Devart.Data.Oracle.Entity.Migrations.dll |
| Devart dotConnect for MySQL | Devart.Data.MySql.Entity.Migrations.dll |
| Devart dotConnect for PostgreSQL | Devart.Data.PostgreSql.Entity.Migrations.dll |
| Devart dotConnect for SQLite | Devart.Data.SQLite.Entity.Migrations.dll |

When deploying these projects, you should deploy this additional assembly as well.
Registration of Code-First Migrations SQL Generator

Registration of the Code-First Migrations SQL generator is performed in the Configuration class constructor – the descendant of the DbMigrationsConfiguration class, which is added to a project when executing the Enable-Migrations command in the Package Manager Console. Specify the using directive:

```csharp
using Devart.Data.Oracle.Entity.Migrations;
```

If the user’s DbContext stores its connection string in the application config file under the same name as the DbContext class, you need to register only the SQL generator for the specific provider:

```csharp
public Configuration()
{
    this.SetSqlGenerator(OracleConnectionInfo.InvariantName,
        new OracleEntityMigrationSqlGenerator());

    // ...
}
```

If the DbContext uses another way to create a connection, then you need to specify both the SQL generator and the connection string:

```csharp
public Configuration()
{
    var connectionInfo = OracleConnectionInfo.CreateConnection(
        "User Id=SCOTT;Password=TIGER;Server=ORA;");
    this.TargetDatabase = connectionInfo;
    this.SetSqlGenerator(connectionInfo.GetInvariantName(),
        new OracleEntityMigrationSqlGenerator());

    // ...
}
```

These registrations configure only Code-First Migrations. You still need to register the ADO.NET provider in the global machine.config file or in a local application config file. See the Deployment topic in the documentation of the corresponding provider for registration examples.

EntityFramework.dll Assembly Versions

Our Devart.Data.(Xxx).Entity.Migrations.dll assemblies are built with the current release version of the EntityFramework.dll assembly, which is installed with Entity Framework by the following NuGet package command:

Install-Package EntityFramework

The current EntityFramework.dll version is 4.3.1. This is the version we build our assemblies with. Users of the current Entity Framework Code-First Migrations version should have no problems.
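For context, this registration plugs into the standard Code-First Migrations workflow in the Package Manager Console. The commands below are the stock EF commands; the migration name is illustrative:

```powershell
Enable-Migrations               # adds the Configuration class to the project
Add-Migration CreateBlogTable   # scaffolds a migration from pending model changes
Update-Database -Verbose        # applies pending migrations, printing the generated SQL
```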
However, users of the previous 4.3.0 version, those who have installed an earlier version of Entity Framework Code-First Migrations on purpose with, for example, the following command

Install-Package EntityFramework -Version 4.3.0.0

and users of Entity Framework beta versions, installed with the command

Install-Package EntityFramework -IncludePrerelease

may face an assembly version conflict, because our assembly requires EntityFramework.dll of exactly the 4.3.1 version. This problem can easily be solved by specifying a binding redirect in the user’s config file. For example, for binding the 4.3.0.0 version to the current 4.3.1.0, you need to add the following lines to the config file:

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="EntityFramework"
                        publicKeyToken="b77a5c561934e089" culture="neutral" />
      <bindingRedirect oldVersion="4.3.0.0" newVersion="4.3.1.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```

Migration Operation Support

Entity Framework Code-First Migrations offer many operations that can be performed with database objects. However, not all of them can be supported for every DBMS. The following list shows these operations; the footnote markers indicate DBMS-specific limitations in the dotConnect providers:

- Table operations: CreateTable, DropTable, RenameTable, MoveTable *
- Column operations: AddColumn, AlterColumn ** ***, DropColumn ***, RenameColumn ***
- Constraint operations: AddPrimaryKey ***, DropPrimaryKey ***, AddForeignKey ****, DropForeignKey ***
- Index operations: CreateIndex, DropIndex, RenameIndex *****
- SQL operations: Sql

\* The MoveTable operation is not supported because of the IgnoreSchemaName workaround.
\** The ALTER COLUMN statement that changes the column type cannot be executed for a LOB column (CLOB/NCLOB/BLOB) in Oracle. See details in the next section.
\*** The ALTER TABLE statement is very limited in SQLite (see [here](http://www.sqlite.org/lang_altertable.html) ). Most of the table modification operations are impossible to implement.
\**** The ADD FOREIGN KEY constraint operation for a foreign key consisting of a single column is supported for the case when a new column for the foreign key is created.
\***** Index renaming is supported starting from MySQL 5.7. For MySQL 5.6 or lower, delete and recreate the necessary index instead. We cannot automatically drop/recreate the index (with 5.6 or lower) because Code-First Migrations don’t provide the necessary metadata to the underlying EF provider: we have only the names of the old and new indexes, but the corresponding columns are not known.

Additional Database-specific Customization of Migrations

Migrations can be customized. Our first release with Entity Framework Code-First Migrations support includes customizations for Oracle only. Our customizations allow specifying the tablespace for tables, creating expression-based indexes, virtual columns, etc. We are considering supporting such functionality for other databases if their users are interested in it.

Creating Tables

The CreateTable operation can be customized with the CreateTableConfiguration class. It has the following public properties:

- Tablespace – specifies the TABLESPACE for the table.
- Unlogged – if set to True, an unlogged table is created. False by default.

If the primary key is specified as database-generated (the Identity option), then a sequence and an insert trigger will be generated for this table to make this column autoincrement. You can specify custom names for this sequence and trigger instead of the auto-generated default ones with the following CreateTableConfiguration class properties:

- IdentityTriggerName – specifies the name of the trigger.
- IdentitySequenceName – specifies the name of the sequence.

The sequence and the insert trigger will be generated only if the primary key is specified as database-generated (the Identity option).
An example of the generated migration code before the customization:

```csharp
public partial class CreateBlogTable : DbMigration
{
    public override void Up()
    {
        CreateTable(
            "Blogs",
            c => new {
                BlogId = c.Int(nullable: false, identity: true),
                Name = c.String(unicode: false),
            })
            .PrimaryKey(t => t.BlogId);
    }

    public override void Down()
    {
        DropTable("Blogs");
    }
}
```

The example of the migration code with the customization code added:

```csharp
using Devart.Data.Oracle.Entity.Migrations;

public partial class CreateBlogTable : DbMigration
{
    public override void Up()
    {
        CreateTable(
            "Blogs",
            c => new {
                BlogId = c.Int(nullable: false, identity: true),
                Name = c.String(unicode: false),
            },
            anonymousArguments: new CreateTableConfiguration() {
                Tablespace = "MY_TABLESPACE"
            })
            .PrimaryKey(t => t.BlogId);
    }

    public override void Down()
    {
        DropTable("Blogs");
    }
}
```

Creating Columns

The AddColumn operation can be customized with the AddColumnConfiguration class. It has the following public properties:

- NotNullConstraintName – the name of the NOT NULL constraint. By default, an unnamed NOT NULL constraint is created.
- CheckConstraintName – the name of the check constraint.
- CheckConstraintExpression – the expression of the check constraint (for example, “COLUMN_NAME IN (‘A’, ‘B’, ‘C’)”).
- VirtualColumnExpression – the expression for the virtual column (for example, “FIRST_COLUMN * SECOND_COLUMN – THIRD_COLUMN”).

Creating Indexes

The CreateIndex operation can be customized with the CreateIndexConfiguration class. It has the following public properties:

- IsBitmap – determines whether to create a bitmap index. False by default.
- Reversed – determines whether to create a reverse ordered index. False by default.
- Unsorted – determines whether to create an unsorted index. False by default.
- Unlogged – determines whether to create an unlogged index. False by default.
- Expression – specifies the expression for the expression-based (function-based) index.
- Tablespace – specifies the TABLESPACE for the index.

Modifying Columns

Oracle has a set of limitations on column modification operations. It is not allowed to modify the data type of CLOB/NCLOB/BLOB columns, even for an empty table. It is also not allowed to specify the NOT NULL constraint for a column that already has one, or to specify that a column is nullable if it is already nullable. We have made a workaround for the latter limitation by generating more complex DDL; however, there is no workaround for the LOB column modification limitation, because it is impossible to determine whether the user wants to change the data type. That’s why any ALTERs for LOB columns are not allowed by default. However, the user may customize the ALTER COLUMN behaviour with the AlterColumnConfiguration class by passing a configured instance of this class to the AlterColumn method of the migration. The AlterColumnConfiguration class has the following public properties:

- LobModificationAllowed – determines whether to allow LOB column modification. The data type is not modified, only other column parameters. False by default.
- ErrorCatchingForNullableAltering – determines whether to generate additional DDL code for catching and processing errors that occur in case of unsuccessful modification of the NULL/NOT NULL constraint for columns that already have the nullability applied by this modification. True by default.
- DataTypeAlteringEnabled – determines whether to modify the data type. True by default.
- DefaultValueAlteringEnabled – determines whether to modify the default value. True by default.
- NullableAlteringEnabled – determines whether to modify the NULL/NOT NULL constraint. True by default.
An example of the generated migration code before the customization:

```csharp
public partial class AlterColumn : DbMigration
{
    public override void Up()
    {
        AlterColumn("Blogs", "Title", c => c.String(nullable: false));
    }

    public override void Down()
    {
        AlterColumn("Blogs", "Title", c => c.String(nullable: false));
    }
}
```

The example of the migration code with the customization code added:

```csharp
using Devart.Data.Oracle.Entity.Migrations;

public partial class AlterColumn : DbMigration
{
    public override void Up()
    {
        AlterColumn("Blogs", "Title", c => c.String(nullable: false),
            anonymousArguments: new AlterColumnConfiguration() {
                LobModificationAllowed = true,
            });
    }

    public override void Down()
    {
        AlterColumn("Blogs", "Title", c => c.String(nullable: false));
    }
}
```

Dropping Tables

If the primary key is specified as database-generated (the Identity option), then a sequence and an insert trigger are generated for the table to make this column autoincrement. When deleting such a table, the trigger is deleted automatically, but the sequence should be deleted explicitly. Devart dotConnect for Oracle tries to delete a sequence with the default name automatically. However, if the sequence name was customized with the CreateTableConfiguration class, you need to specify the custom sequence name in the DropTableConfiguration class for the DropTable operation. The DropTableConfiguration class has the following public properties:

- IdentitySequenceName – the sequence name.
- IdentitySequenceDroppingEnabled – determines whether the identity sequence should be deleted. When not specified, dotConnect for Oracle tries to drop the sequence safely when dropping the table, ignoring errors if there is no such sequence. If this property is set to false, the sequence is not deleted. If it is set to true, dotConnect for Oracle attempts to drop the sequence and does not ignore errors.
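Following the pattern of the CreateTable example earlier in the article, a DropTable migration with a customized sequence name might look like the sketch below; the migration class name and the sequence name MY_BLOG_SEQ are invented for illustration:

```csharp
using Devart.Data.Oracle.Entity.Migrations;

// DropBlogTable and MY_BLOG_SEQ are hypothetical names.
public partial class DropBlogTable : DbMigration
{
    public override void Up()
    {
        // Drop the table together with its custom-named identity sequence.
        DropTable(
            "Blogs",
            anonymousArguments: new DropTableConfiguration() {
                IdentitySequenceName = "MY_BLOG_SEQ"
            });
    }
}
```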
Ideas for Future

We are considering implementing the following features in the future.

As for additional database-specific customizations of migrations:

- Support database-specific customizations for MySQL, PostgreSQL and SQLite.
- Extend the implementation for Oracle.
- Implement setting some options with the global configuration options, to avoid having to specify them for all migrations.

As for the Entity Framework Code-First Migrations support in Entity Developer:

- The possibility to generate migrations with the powerful and flexible Update To Database Wizard.
- The possibility to specify database-specific customizations for entities/database objects at design time in Entity Developer without typing code.

Comment on these suggestions and add your own in the comments here, on [our forum](https://forums.devart.com/viewforum.php?f=41) , or on [our UserVoice forum](http://devart.uservoice.com/forums/105163-ado-net-entity-framework-support) if you want these features to be implemented in future versions of our products.
1 COMMENT Entity Framework 6 Support for Oracle, MySQL, PostgreSQL, SQLite and Salesforce — November 8, 2013 at 7:21 pm […] Registration of Code-First Migrations SQL Generator section of our Entity Framework Code-First Migrations support for Oracle, MySQL, PostgreSQL and […] Comments are closed."} {"url": "https://blog.devart.com/entity-framework-code-first-support-for-oracle-mysql-postgresql-and-sqlite.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite By [dotConnect Team](https://blog.devart.com/author/dotconnect) March 1, 2011 [20](https://blog.devart.com/entity-framework-code-first-support-for-oracle-mysql-postgresql-and-sqlite.html#comments) 11747 April 2011 saw the release of a new version of Entity Framework, 4.1; this blog article and the code samples in it have been updated correspondingly to match the new features that are now available. The latest versions of Devart [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) contain the most recent version of the Code-First (Code Only) support that was added in Entity Framework 4.1.
One of the primary advantages of the Code-First approach is a higher level of abstraction from the database and the capability to develop applications using the domain-driven design (DDD) approach. This article does not provide detailed examples of every aspect of the Code-First (Code Only) approach; for detailed information, see the series of publications on the [ADO.NET team blog](https://docs.microsoft.com/en-us/archive/blogs/adonet/) or the [MSDN](http://msdn.microsoft.com/en-us/library/gg696172%28v=VS.103%29.aspx) documentation. This article covers the following:

- Peculiarities of Dynamic Database Creation and Dropping Implementation
- Database Creation
- Database Existence Check
- Database Dropping
- Code-First Sample
- Entity Creation
- DbContext Creation, Mapping Customization
- Database Filling
- Database Call
- DefaultConnectionFactory
- Creating Entity Framework Model with Fluent Mapping in Designer
- Code-First for All Databases

Peculiarities of Dynamic Database Creation and Dropping Implementation
The Code-First approach allows you to describe your model using plain .NET objects and map it to existing database objects through .NET code, without verifications and pre-initializations. You can also choose one of the database initialization strategies ([DropCreateDatabaseAlways](http://msdn.microsoft.com/en-us/library/gg679506%28v=vs.103%29.aspx), [DropCreateDatabaseIfModelChanges](http://msdn.microsoft.com/en-us/library/gg679604%28v=VS.103%29.aspx), and [CreateDatabaseIfNotExists](http://msdn.microsoft.com/en-us/library/gg679221%28v=VS.103%29.aspx)), which create or drop database tables dynamically. Let’s consider the main features of dynamic database creation and dropping.

Database Creation
Database creation means creating tables and the relationships between them.
For Oracle databases, if the primary key is specified as database-generated with the DatabaseGeneratedOption.Identity option, a sequence and an insert trigger are generated for the table to make the column auto-increment. Please note that, by default, the database creation mechanism does not create storage for tables. For example, if you work with an Oracle or PostgreSQL database, the corresponding schema must exist in the database before you create database objects; when MySQL is used, the corresponding database must already exist. The Oracle user must have the privileges to create/drop/alter tables, sequences, and triggers in the corresponding schemas. The connection string for SQLite should contain the “FailIfMissing=false;” parameter to create an empty database when opening a connection. Additionally, the EF providers allow configuring the process of creating and deleting database objects, including the DeleteDatabaseBehaviour setting, which accepts three values (four for PostgreSQL): ModelObjectsOnly (the default), AllSchemaObjects, and Schema. In the latter case, an entire schema is created/deleted. See below for more information on DeleteDatabaseBehaviour. dotConnect for PostgreSQL allows an additional value, Database, which enables deleting/creating an entire PostgreSQL database.

Database Existence Check
The database existence verification checks that at least one required object exists in the database. For MySQL, PostgreSQL, and SQLite, the existence of at least one table is verified; additionally, for Oracle, the existence of sequences is verified.

Database Dropping
Depending on the value of the DeleteDatabaseBehaviour configuration option, the deletion behavior differs:
ModelObjectsOnly – only the tables and sequences (in Oracle) that model objects are mapped to are deleted. This is the default behavior.
AllSchemaObjects – all tables (and the corresponding sequences used for auto-increment columns in Oracle) are deleted from the schemas or databases that model objects were mapped to.
Schema – the entire schema (database) is deleted. If the model contains objects from other schemas, those schemas (databases) are deleted as well.
Note: it is not possible to delete the user that is currently connected in Oracle. Thus, to call the DeleteDatabase() method, the user must have grants to delete users and must not own any database objects that model objects are mapped to. For more information on how to configure the Code-First behavior, see the section Database Script Generation Settings in New Features of Entity Framework Support in dotConnect Providers.

Code-First Sample
Let’s consider a simple example of implementing and using the Code-First approach with a DBMS other than SQL Server. For this example, we shall use the [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) data provider; comments for other providers can be found in the code samples.

Entity Creation
Let’s define two related entities: Product and ProductCategory. [Common](http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations.aspx) and [additional](http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations%28v=VS.103%29.aspx) Data Annotations attributes can be used for mapping customization. We have used them to define the auto-increment column in the primary key and the non-nullable columns; you can also use these attributes to set a MaxLength limitation on a string field. Similarly, the same can be done with fluent mapping in the OnModelCreating method of DbContext (this is shown in the next chapter). Besides, we shall not set the table schema name explicitly using the Table attribute – we shall define it for all tables later using our own convention.
Since the entity properties that represent navigation to related entities are defined with the virtual modifier, dynamic proxies are used for the POCO classes, which enables Lazy Loading. If you are not going to use Lazy Loading, or do not want dynamic proxies, do not define these properties with the virtual modifier and use explicit Eager Loading (the Include method) instead.

```csharp
//[Table("Product", Schema = "TEST")]
public class Product
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long ProductID { get; set; }

    [Required]
    [MaxLength(50)]
    public string ProductName { get; set; }

    public string UnitName { get; set; }
    public int UnitScale { get; set; }
    public long InStock { get; set; }
    public double Price { get; set; }
    public double DiscontinuedPrice { get; set; }

    public virtual ProductCategory Category { get; set; }
}

//[Table("ProductCategory", Schema = "TEST")]
public class ProductCategory
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long CategoryID { get; set; }

    [Required]
    [MaxLength(20)]
    public string CategoryName { get; set; }

    public virtual ProductCategory ParentCategory { get; set; }
    public virtual ICollection<Product> Products { get; set; }
}
```

DbContext Creation, Mapping Customization
Let us define a [DbContext](http://msdn.microsoft.com/en-us/library/system.data.entity.dbcontext%28v=vs.103%29.aspx) descendant, which is an [ObjectContext](http://msdn.microsoft.com/en-us/library/system.data.objects.objectcontext.aspx) wrapper. We can turn conventions on and off and map classes to database objects using fluent mapping in the OnModelCreating method of DbContext.
```csharp
public class MyDbContext : DbContext
{
    public MyDbContext()
        : base() {
    }

    public MyDbContext(DbConnection connection)
        : base(connection, true) {
    }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        /*-------------------------------------------------------------
         ColumnTypeCasingConvention should be removed for dotConnect for Oracle.
         This option is obligatory only for SqlClient.
         Turning off ColumnTypeCasingConvention isn't necessary
         for dotConnect for MySQL, PostgreSQL, and SQLite.
        -------------------------------------------------------------*/

        modelBuilder.Conventions.Remove<ColumnTypeCasingConvention>();

        /*-------------------------------------------------------------
         If you don't want to create and use the EdmMetadata table
         for monitoring the correspondence between the current model
         and the table structure created in the database,
         turn off IncludeMetadataConvention:
        -------------------------------------------------------------*/

        modelBuilder.Conventions.Remove<IncludeMetadataConvention>();

        /*-------------------------------------------------------------
         In the sample above we defined the autoincrement primary key
         columns and the non-nullable columns using DataAnnotation
         attributes. The same can be done with fluent mapping:
        -------------------------------------------------------------*/

        //modelBuilder.Entity<Product>().HasKey(p => p.ProductID);
        //modelBuilder.Entity<Product>().Property(p => p.ProductID)
        //    .HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity);
        //modelBuilder.Entity<Product>().Property(p => p.ProductName)
        //    .IsRequired()
        //    .HasMaxLength(50);
        //modelBuilder.Entity<ProductCategory>().HasKey(p => p.CategoryID);
        //modelBuilder.Entity<ProductCategory>().Property(p => p.CategoryID)
        //    .HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity);
        //modelBuilder.Entity<ProductCategory>().Property(p => p.CategoryName)
        //    .IsRequired()
        //    .HasMaxLength(20);
        //modelBuilder.Entity<Product>().ToTable("Product", "TEST");
        //modelBuilder.Entity<ProductCategory>().ToTable("ProductCategory", "TEST");

        //-------------------------------------------------------------//
    }

    public DbSet<Product> Products { get; set; }
    public DbSet<ProductCategory> ProductCategories { get; set; }
}
```

Database Filling
Let us consider descendants of all available strategies – DropCreateDatabaseAlways, DropCreateDatabaseIfModelChanges, and CreateDatabaseIfNotExists – with additional data-filling functionality after database creation:

```csharp
public class MyDbContextDropCreateDatabaseAlways
    : DropCreateDatabaseAlways<MyDbContext>
{
    protected override void Seed(MyDbContext context)
    {
        MyDbContextSeeder.Seed(context);
    }
}

public class MyDbContextDropCreateDatabaseIfModelChanges
    : DropCreateDatabaseIfModelChanges<MyDbContext>
{
    protected override void Seed(MyDbContext context)
    {
        MyDbContextSeeder.Seed(context);
    }
}

public class MyDbContextCreateDatabaseIfNotExists
    : CreateDatabaseIfNotExists<MyDbContext>
{
    protected override void Seed(MyDbContext context)
    {
        MyDbContextSeeder.Seed(context);
    }
}

public static class MyDbContextSeeder
{
    public static void Seed(MyDbContext context)
    {
        context.ProductCategories.Add(new ProductCategory()
        {
            CategoryName = "prose"
        });
        context.ProductCategories.Add(new ProductCategory()
        {
            CategoryName = "novel"
        });
        context.ProductCategories.Add(new ProductCategory()
        {
            CategoryName = "poem",
            ParentCategory =
                context.ProductCategories.Local.Single(p => p.CategoryName == "novel")
        });
        context.ProductCategories.Add(new ProductCategory()
        {
            CategoryName = "fantasy",
            ParentCategory =
                context.ProductCategories.Local.Single(p => p.CategoryName == "novel")
        });

        context.Products.Add(new Product()
        {
            ProductName = "Shakespeare W. Shakespeare's dramatische Werke",
            Price = 78,
            Category =
                context.ProductCategories.Local.Single(p => p.CategoryName == "prose")
        });
        context.Products.Add(new Product()
        {
            ProductName = "Plutarchus. Plutarch's moralia",
            Price = 89,
            Category =
                context.ProductCategories.Local.Single(p => p.CategoryName == "prose")
        });
        context.Products.Add(new Product()
        {
            ProductName = "Harrison G. B. England in Shakespeare's day",
            Price = 540,
            Category =
                context.ProductCategories.Local.Single(p => p.CategoryName == "novel")
        });
        context.Products.Add(new Product()
        {
            ProductName = "Corkett Anne. The salamander's laughter",
            Price = 5,
            Category =
                context.ProductCategories.Local.Single(p => p.CategoryName == "poem")
        });
    }
}
```

Database Call
In this example, we have turned on [OracleMonitor](http://www.devart.com/dotconnect/oracle/docs/?DbMonitor.html) to view the executed DDL and DML statements in the [Devart dbMonitor](http://www.devart.com/dbmonitor/) application. Keep in mind that OracleMonitor is suitable for testing and debugging but should be limited in production environments, since monitoring can decrease the performance of your application. By default, when a class is not explicitly mapped using the [Table attribute](http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations.tableattribute%28v=vs.103%29.aspx) (see the Product and ProductCategory class definitions above) or fluent mapping (the [ToTable method](http://msdn.microsoft.com/en-us/library/gg679488%28v=VS.103%29.aspx)), Code-First uses the default mapping, which is suitable for SQL Server but ineligible for other databases: tables get the “dbo” prefix, so table names look like “dbo.TableName”.
If [IncludeMetadataConvention](http://msdn.microsoft.com/en-us/library/system.data.entity.infrastructure.includemetadataconvention%28v=vs.103%29.aspx) is turned off, it is enough to define the Table attribute for all entities or to use .ToTable() in fluent mapping. If IncludeMetadataConvention is turned on, the “dbo”.“[EdmMetadata](http://msdn.microsoft.com/en-us/library/system.data.entity.infrastructure.edmmetadata%28v=vs.103%29.aspx)” table is used, and the schema name cannot easily be changed. It is also necessary to explicitly set the schema name for the table that represents a many-to-many association. To avoid the schema-name issue, we use the Entity Framework provider configuration to switch off schema name generation.

```csharp
class Program
{
    static void Main(string[] args)
    {
        Devart.Data.Oracle.OracleMonitor monitor
            = new Devart.Data.Oracle.OracleMonitor() { IsActive = true };

        //--------------------------------------------------------------
        // Use the capability for configuring the behavior of the EF provider:
        Devart.Data.Oracle.Entity.Configuration.OracleEntityProviderConfig config =
            Devart.Data.Oracle.Entity.Configuration.OracleEntityProviderConfig.Instance;
        // Switch off schema name generation while generating DDL and DML scripts:
        config.Workarounds.IgnoreSchemaName = true;
        //--------------------------------------------------------------

        /*--------------------------------------------------------------
         You can set up a connection string for DbContext in different ways.
         It can be placed into the app.config (web.config) file.
         The connection string name must be identical to the DbContext
         descendant name.

         After that, you create a context instance, and the connection
         string is picked up automatically:
            MyDbContext context = new MyDbContext();
        ---------------------------------------------------------------*/

        /*--------------------------------------------------------------
         It is also possible to create an instance of the provider-specific
         connection and pass it to the context constructor, as we do in this
         application. That allows us to use the StateChange connection event
         to change the Oracle current schema on its occurrence. Thus, we can
         connect as one user and work on a schema owned by another user.
        ---------------------------------------------------------------*/
        DbConnection con = new Devart.Data.Oracle.OracleConnection(
            "Data Source=ora1020;User Id=scott;Password=tiger;");
        con.StateChange += new StateChangeEventHandler(Connection_StateChange);

        /*--------------------------------------------------------------
         You can choose one of the database initialization
         strategies or turn off initialization:
        --------------------------------------------------------------*/
        System.Data.Entity.Database.SetInitializer
            (new MyDbContextDropCreateDatabaseAlways());
        /*System.Data.Entity.Database.SetInitializer
            (new MyDbContextCreateDatabaseIfNotExists());
        System.Data.Entity.Database.SetInitializer
            (new MyDbContextDropCreateDatabaseIfModelChanges());
        System.Data.Entity.Database.SetInitializer(null);*/
        //--------------------------------------------------------------

        /*--------------------------------------------------------------
         Let's create MyDbContext and execute a database query.
         Depending on the selected database initialization strategy,
         database tables may be dropped/created and filled with seed data.
        ---------------------------------------------------------------*/
        using (MyDbContext context = new MyDbContext(con))
        {
            var query = context.Products.Include("Category")
                .Where(p => p.Price > 20.0)
                .ToList();

            foreach (var product in query)
                Console.WriteLine("{0,-10} | {1,-50} | {2}",
                    product.ProductID, product.ProductName,
                    product.Category.CategoryName);

            Console.ReadKey();
        }
    }

    // On connection opening, we change the current schema to "TEST":
    static void Connection_StateChange(object sender, StateChangeEventArgs e) {

        if (e.CurrentState == ConnectionState.Open) {
            DbConnection connection = (DbConnection)sender;
            connection.ChangeDatabase("TEST");
        }
    }
}
```

DefaultConnectionFactory
In addition to the two ways of specifying the connection string described in the previous section, here we review one more: implicit specification of the connection string through [Database.DefaultConnectionFactory](http://msdn.microsoft.com/en-us/library/system.data.entity.database.defaultconnectionfactory%28v=vs.103%29.aspx). For that purpose, you need a provider-specific class that implements the [IDbConnectionFactory](http://msdn.microsoft.com/en-us/library/system.data.entity.infrastructure.idbconnectionfactory%28v=vs.103%29.aspx) interface. Admittedly, this approach does not provide much flexibility and is better suited to embedded databases such as SQLite. Below is an example of the SQLiteConnectionFactory class, which can be used to avoid specifying a connection string for each context.
```csharp
public class SQLiteConnectionFactory : IDbConnectionFactory {

    private const string invariantName = "Devart.Data.SQLite";

    #region Constructors

    public SQLiteConnectionFactory()
        : this(String.Empty, String.Empty) {
    }

    public SQLiteConnectionFactory(string basePath, string defaultConnectionString) {

        this.BasePath = basePath;
        this.DefaultConnectionString = defaultConnectionString;
    }

    #endregion

    #region IDbConnectionFactory Members

    public DbConnection CreateConnection(string nameOrConnectionString) {

        if (String.IsNullOrEmpty(nameOrConnectionString))
            throw new ArgumentNullException("nameOrConnectionString");

        DbProviderFactory sqliteProviderFactory =
            DbProviderFactories.GetFactory(invariantName);
        if (sqliteProviderFactory == null)
            throw new InvalidOperationException(String.Format(
                "The '{0}' provider is not registered on the local machine.",
                invariantName));

        DbConnection connection = sqliteProviderFactory.CreateConnection();

        // If an actual connection string was passed, use it as is;
        // otherwise treat the argument as a context name and build a
        // connection string pointing at "<BasePath><ClassName>.db".
        if (nameOrConnectionString.Contains("="))
            connection.ConnectionString = nameOrConnectionString;
        else {
            StringBuilder builder = new StringBuilder(128);
            builder.Append(DefaultConnectionString);

            if (builder.Length > 0 && builder[builder.Length - 1] != ';')
                builder.Append(';');

            builder.Append("Data Source=");
            builder.Append(BasePath);
            string dbFileName = nameOrConnectionString;
            if (dbFileName.Contains(".")) {
                int classNameFrom = nameOrConnectionString.LastIndexOf('.') + 1;
                int classNameLength = nameOrConnectionString.Length - classNameFrom;
                dbFileName = nameOrConnectionString.Substring(classNameFrom, classNameLength);
            }
            builder.Append(dbFileName);
            builder.Append(".db");

            connection.ConnectionString = builder.ToString();
        }

        return connection;
    }

    #endregion

    #region Properties

    public string BasePath { get; private set; }

    public string DefaultConnectionString { get; private set; }

    #endregion
}
```
DefaultConnectionFactory is specified once, at application startup. In this example, we also specify the relative path to be used when searching for/creating the SQLite database file and define an additional connection string option so that the database file is created if it does not exist:

```csharp
Database.DefaultConnectionFactory =
    new SQLiteConnectionFactory(".\\..\\..\\", "FailIfMissing=False");
```

Then, for each context created in the application,

```csharp
MySQLiteContext context = new MySQLiteContext();
```

a provider-specific connection is created through the previously defined SQLiteConnectionFactory. On first access to the database, the connection is created and, on its opening, provided that the database file does not exist, the “….. MySQLiteContext.db” file is created for the MySQLiteContext class.

Creating Entity Framework Model with Fluent Mapping in Designer
Initially, fluent mapping was intended for Code-First development, where the DbContext class, all entity classes, and complex types are written manually, and mapping is defined through Data Annotation attributes and/or fluent mapping (it being impossible to define all possible mappings using only attributes). For large models, all this required a significant amount of manual work; besides, it was impossible to use the [Database-First](http://www.devart.com/entitydeveloper/database-first.html) and [Model-First](http://www.devart.com/entitydeveloper/model-first.html) approaches to development. So users requested a capability for at least partial development of the Code-First EF model in a designer. For the standard EDM Designer in Microsoft Visual Studio 2010, there is the [ADO.NET DbContext Generator](http://blogs.msdn.com/b/adonet/archive/2011/03/15/ef-4-1-model-amp-database-first-walkthrough.aspx) template, included in the full installation of Entity Framework 4.1.
It generates DbContext and POCO entity classes but does not generate fluent mapping. The latest versions of [Devart Entity Developer](http://www.devart.com/entitydeveloper/) include a DbContext template that can generate DbContext both with and without fluent mapping. Admittedly, using the designer gives lower flexibility and does not cover every mapping case; on the upside, development becomes faster and more comfortable. Users can enhance the initial DbContext template or combine techniques, for example, creating and mapping some entities in the designer while writing others manually. For more information on the capabilities of the DbContext template, see Entity Developer – EF Code First DbContext Template.

Code-First for All Databases
In the [attached archive](https://blog.devart.com/wp-content/uploads/2017/08/CrmDemo.EFCodeFirst.zip), you can find the full version of the above examples using EF Code-First with each ADO.NET provider – Devart dotConnect for Oracle, Devart dotConnect for MySQL, Devart dotConnect for PostgreSQL, and Devart dotConnect for SQLite – as well as with the standard Microsoft .NET Framework Data Provider for SQL Server (SqlClient). These C# samples were created for Visual Studio 2010 and require the Microsoft ADO.NET Entity Framework 4.1 [installation](http://www.microsoft.com/downloads/en/details.aspx?FamilyID=b41c728e-9b4f-4331-a1a8-537d16c6acdf&displaylang=en). [Devart dbMonitor](http://www.devart.com/dbmonitor/), a free tool, is used for monitoring the DML and DDL statements of the Devart ADO.NET providers. You can find the updated version of the samples for Entity Framework 6 in this blog article: [Entity Framework 6 Support for Oracle, MySQL, PostgreSQL, SQLite and Salesforce](https://blog.devart.com/entity-framework-6-support-for-oracle-mysql-postgresql-sqlite-and-salesforce.html).
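The Database Call section above mentions that a connection string can be registered in app.config (web.config) under a name identical to the DbContext descendant; the XML snippet itself did not survive the page markup. A minimal sketch of its expected shape follows — the connection string value and data source here are illustrative assumptions, not values from the original article:

```xml
<configuration>
  <connectionStrings>
    <!-- The name must match the DbContext descendant class name (MyDbContext). -->
    <!-- The connectionString value is illustrative; providerName is the
         invariant name of the ADO.NET provider, e.g. Devart.Data.Oracle. -->
    <add name="MyDbContext"
         connectionString="Data Source=ora1020;User Id=scott;Password=tiger;"
         providerName="Devart.Data.Oracle" />
  </connectionStrings>
</configuration>
```

With such an entry in place, `new MyDbContext()` picks up the connection string by the context class name.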
Tags [code first](https://blog.devart.com/tag/code-first) [entity framework](https://blog.devart.com/tag/entity-framework) [Oracle](https://blog.devart.com/tag/oracle) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

20 COMMENTS

Challenger, March 13, 2011 at 10:54 am: Hey dude, i’m recently working on the so called “code first” stuff in EF4. it’s really made confused a lot, until i see your article here. it’s cool! THANKS A LOT!

Microsoft’s code-first Entity Framework 4.1 nearly done « Tim Anderson’s ITWriting, March 16, 2011 at 3:38 am: […] It looks fantastic; though there are a few caveats. One is that Microsoft tends to assume use of its own database managers, SQL Server or for simple cases, SQL Server CE. That said, there are drivers for other databases; for example devart has code-first drivers for Oracle, MySQL, PostgreSQL and SQLite. […]

Ilya, April 27, 2011 at 1:44 pm: Can you publish an example of the same code working with the released version of Code First (EF 4.1)? The code you have does not work with the release. Thank you!

Shalex, May 6, 2011 at 1:30 am: Dear users, we have updated this article (and samples) to reflect changes between EF CTP5 and EF 4.1 RC. The article now includes usage of the IgnoreSchemaName workaround, plus the DefaultConnectionFactory and Creating Entity Framework Model with Fluent Mapping in Designer sections. Devart Team

dotConnect Team’s Blog » Entity Developer – EF Code First DbContext Template, May 17, 2011 at 6:09 am: […] In the latter case, the EF-provider is responsible for the selection of a particular database-specific data type. The provider can be defined in several ways.
For more information on how to define a provider-specific connection, see Entity Framework Code-First support for Oracle, MySQL, PostgreSQL and SQLite. […]

Fred, July 7, 2011 at 3:23 pm: Hi, great stuff. But how does EF Code Only know which Entity Framework provider to use when you only define the underlying ADO.NET provider? Regards, Fred

Devart, July 8, 2011 at 4:39 am: Fred, each Entity Framework provider has one and only one underlying ADO.NET provider. Thus, the Code Only provider gets the corresponding ProviderFactory from the provider-specific connection and identifies the ADO.NET provider. The Entity Framework provider is then determined with the help of the ADO.NET provider.

Andrew, August 23, 2011 at 3:14 pm: When attempting to follow your example, I am getting an “ORA-01005: null password given; logon denied” error. This only occurs when the project tries to create the seed data. If I comment out the line System.Data.Entity.Database.SetInitializer(new MyDbContextDropCreateDatabaseAlways()); I can perform regular CRUD actions against the database. When I look at the context passed to the Seed method, it has the password in the DbConnectionBase class.

Shalex, August 29, 2011 at 7:12 am: Andrew, please try setting the “Persist Security Info=true;” connection string parameter.

mike, January 10, 2012 at 4:19 pm: When I try the SQLite DB first code I get errors due to no ProductId. This is due, I believe, to no identity option set on ProductId.

mike, January 10, 2012 at 4:23 pm: I see this but it isn’t creating the identity in SQLite: [Key] [DatabaseGenerated(DatabaseGeneratedOption.Identity)] public long CompanyID { get; set; }

mike, January 10, 2012 at 5:48 pm: Does your connection string support BinaryGUID=False?
mike, January 11, 2012 at 10:42 am: Hey, I put “BinaryGUID=False” in my connection string but GUIDs still are not non-binary. I defined my key as [Key] public Guid StuffId { get; set; } and ran context.Stuff.Add(new Stuff() { StuffId = Guid.NewGuid() }); context.SaveChanges(); — the row is inserted into the table but StuffId is binary gibberish.

[Shalex](https://www.devart.com), January 13, 2012 at 9:46 am: 1. > I see this but it isn’t creating the identity in SQLite. Please refer to [https://www.devart.com/dotconnect/sqlite/revision_history.html](https://www.devart.com/dotconnect/sqlite/revision_history.html): 3.60.258, 08-Dec-11 – the bug with generating AUTOINCREMENT in DDL for key columns when using Identity in Code First is fixed. We recommend upgrading to the latest (3.60.283) version of dotConnect for SQLite. 2. > Does your connection string support BinaryGUID=False? Yes, the connection string of dotConnect for SQLite supports BinaryGUID. 3. > I put “BinaryGUID=False” in my connection string but GUIDs still are not non-binary. We will investigate the issue and post the results here.

mike, January 13, 2012 at 12:12 pm: Ummm, I downloaded the Professional trial version a couple days ago, which gave me 3.60.268. How about the trial version also be the current version?
mike, January 13, 2012 at 12:18 pm: OK, I see 3.60.283 was put up on 12 Jan 2012, yesterday. What’s the best way to update my trial download to the new version?

mike, January 13, 2012 at 12:27 pm: For the BinaryGUID issue, I tried this in SQLiteConnectionFactory.cs: public DbConnection CreateConnection(…) { … builder.Append(dbFileName); builder.Append(".db;"); builder.Append("Version=3;BinaryGUID=False"); connection.ConnectionString = builder.ToString(); } along with [Key] public Guid StuffId { get; set; } — the row is inserted into the table but StuffId is binary gibberish.

[Shalex](http://www.devart.com), January 16, 2012 at 10:10 am: > What’s the best way to update my trial download to the new version? Please download the latest (3.60.283) trial version and run it (the previous version will be uninstalled automatically). After this, clear the bin and obj folders of your project and rebuild the project.

[Shalex](http://www.devart.com), January 24, 2012 at 10:57 am: The bug with Guid parameters representation depending on the BinaryGUID connection string parameter’s value is fixed. We will post here when the corresponding build of dotConnect for SQLite is available for download.

[Shalex](https://www.devart.com), January 27, 2012 at 9:12 am: A new version of dotConnect for SQLite, 3.70, is released! It can be downloaded from [https://www.devart.com/dotconnect/sqlite/download.html](https://www.devart.com/dotconnect/sqlite/download.html) (trial version) or from the Registered Users’ Area (for users with an active subscription only). For more information, please refer to [https://www.devart.com/forums/viewtopic.php?t=23257](https://www.devart.com/forums/viewtopic.php?t=23257).
Comments are closed.

Entity Framework Core 1 (Entity Framework 7) Support

By [dotConnect Team](https://blog.devart.com/author/dotconnect), April 25, 2016

Entity Framework Core 1, formerly known as Entity Framework 7 (hereafter also called EF Core or EF7), is supported in the Devart ADO.NET provider product line. It is supported in the providers for relational databases (Oracle, MySQL, PostgreSQL, SQLite, and DB2) and in the providers for cloud data sources, such as Salesforce, Dynamics CRM, SugarCRM, Zoho CRM, QuickBooks, FreshBooks, MailChimp, ExactTarget, Bigcommerce, and Magento. Currently our providers support Entity Framework Core Release Candidate 1. Entity Framework Core support is currently in the public beta stage. It is implemented for the full .NET Framework platform: .NET Framework 4.5.1 and higher.

We need to mention first that Entity Framework Core 1 (Entity Framework 7) is a completely new ORM, which inherited only the name, LINQ support, and some classes with the same or similar names from Entity Framework 6, and even these classes often have incomplete functionality in comparison to their Entity Framework 6 counterparts. You can find more information about EF Core compatibility with EF6 and migration issues in our other article, "Migrating Entity Framework 6 projects to Entity Framework Core 1 (Entity Framework 7)". Additionally, you can study Entity Framework Core features in [its official documentation](http://ef.readthedocs.org/en/latest/).
Contents:
- New Assemblies and Provider Registration
- Mapping
- Current Limitations
- Behavior Changes
- Database-First and Model-First in Entity Framework Core
- Creating Simple Code-First Entity Framework Core Application
- Conclusion

New Assemblies and Provider Registration

For Entity Framework Core, a new assembly is added to our providers, and it should be deployed with applications using Entity Framework Core. Additionally, we have included new extension methods for provider registration and for specifying a connection string in our Entity Framework providers for EF Core (EF7). The new assemblies and extension methods are listed in the following table:

| Provider | Assembly | Registration method name |
| --- | --- | --- |
| [Devart dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) | Devart.Data.Oracle.Entity.EF7.dll | UseOracle() |
| [Devart dotConnect for MySQL](https://www.devart.com/dotconnect/mysql/) | Devart.Data.MySql.Entity.EF7.dll | UseMySql() |
| [Devart dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/) | Devart.Data.PostgreSql.Entity.EF7.dll | UsePostgreSql() |
| [Devart dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/) | Devart.Data.SQLite.Entity.EF7.dll | UseSQLite() |
| [Devart dotConnect for DB2](https://www.devart.com/dotconnect/db2/) | Devart.Data.DB2.Entity.EF7.dll | UseDB2() |
| [Devart dotConnect for Salesforce](https://www.devart.com/dotconnect/salesforce/) | Devart.Data.Salesforce.Entity.EF7.dll | UseSalesforce() |
| [Devart dotConnect for Dynamics CRM](https://www.devart.com/dotconnect/dynamicscrm/) | Devart.Data.Dynamics.Entity.EF7.dll | UseDynamics() |
| [Devart dotConnect for SugarCRM](https://www.devart.com/dotconnect/sugarcrm/) | Devart.Data.Sugar.Entity.EF7.dll | UseSugar() |
| [Devart dotConnect for Zoho CRM](https://www.devart.com/dotconnect/zohocrm/) | Devart.Data.Zoho.Entity.EF7.dll | UseZoho() |
| [Devart dotConnect for QuickBooks](https://www.devart.com/dotconnect/quickbooks/) | Devart.Data.QuickBooks.Entity.EF7.dll | UseQuickBooks() |
| [Devart dotConnect for FreshBooks](https://www.devart.com/dotconnect/freshbooks/) | Devart.Data.FreshBooks.Entity.EF7.dll | UseFreshBooks() |
| [Devart dotConnect for MailChimp](https://www.devart.com/dotconnect/mailchimp/) | Devart.Data.MailChimp.Entity.EF7.dll | UseMailChimp() |
| [Devart dotConnect for ExactTarget](https://www.devart.com/dotconnect/exacttarget/) | Devart.Data.ExactTarget.Entity.EF7.dll | UseExactTarget() |
| [Devart dotConnect for Bigcommerce](https://www.devart.com/dotconnect/bigcommerce/) | Devart.Data.Bigcommerce.Entity.EF7.dll | UseBigcommerce() |
| [Devart dotConnect for Magento](https://www.devart.com/dotconnect/magento/) | Devart.Data.Magento.Entity.EF7.dll | UseMagento() |

In order to register an Entity Framework provider and set a connection string, you need to override the OnConfiguring method of your DbContext class descendant. Example for Devart dotConnect for Oracle:

```csharp
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder) {

    optionsBuilder.UseOracle(@"user id=user; password=password; server=ORCL1210;");
}
```

Mapping

The mapping supported by Entity Framework Core is described in detail in the [corresponding section](http://ef.readthedocs.org/en/latest/modeling/index.html) of the Entity Framework documentation.
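As a concrete illustration, the provider-neutral fluent mapping methods are applied in an OnModelCreating override. A minimal sketch; the Account class and the table/column names here are illustrative assumptions, not taken from the article:

```csharp
using Microsoft.Data.Entity;   // EF Core RC1 namespace, as used in this article

public class Account {
    public int AccountID { get; set; }
    public string Name { get; set; }
}

public class MyContext : DbContext {

    protected override void OnModelCreating(ModelBuilder modelBuilder) {

        // Provider-neutral mapping: applies to whichever provider is registered.
        modelBuilder.Entity<Account>()
            .ToTable("ACCOUNTS");

        modelBuilder.Entity<Account>()
            .Property(a => a.Name)
            .HasColumnName("ACCOUNT_NAME")
            .HasColumnType("varchar(100)");
    }
}
```

The provider-specific variants described in the next section follow the same pattern, with a For{DataSourceName} prefix on each method.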
If you plan to support several data sources for one base Entity Framework model that is mapped to differently named tables and columns in different data sources, you can use special provider-specific extension methods in fluent mapping in order to support a specific mapping for each data source simultaneously:

| Provider | Table name | Column name | Column type |
| --- | --- | --- | --- |
| Extension method not related to a specific provider | ToTable() | HasColumnName() | HasColumnType() |
| Extension method for SQL Server | ForSqlServerToTable() | ForSqlServerHasColumnName() | ForSqlServerHasColumnType() |
| [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) | ForOracleToTable() | ForOracleHasColumnName() | ForOracleHasColumnType() |
| [dotConnect for MySQL](https://www.devart.com/dotconnect/mysql/) | ForMySqlToTable() | ForMySqlHasColumnName() | ForMySqlHasColumnType() |
| [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/) | ForPostgreSqlToTable() | ForPostgreSqlHasColumnName() | ForPostgreSqlHasColumnType() |
| [dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/) | ForSQLiteToTable() | ForSQLiteHasColumnName() | ForSQLiteHasColumnType() |
| [dotConnect for DB2](https://www.devart.com/dotconnect/db2/) | ForDB2ToTable() | ForDB2HasColumnName() | ForDB2HasColumnType() |
| [dotConnect for Salesforce](https://www.devart.com/dotconnect/salesforce/) | ForSalesforceToTable() | ForSalesforceHasColumnName() | ForSalesforceHasColumnType() |
| [dotConnect for Dynamics CRM](https://www.devart.com/dotconnect/dynamicscrm/) | ForDynamicsToTable() | ForDynamicsHasColumnName() | ForDynamicsHasColumnType() |
| [dotConnect for SugarCRM](https://www.devart.com/dotconnect/sugarcrm/) | ForSugarToTable() | ForSugarHasColumnName() | ForSugarHasColumnType() |
| [dotConnect for Zoho CRM](https://www.devart.com/dotconnect/zohocrm/) | ForZohoToTable() | ForZohoHasColumnName() | ForZohoHasColumnType() |
| [dotConnect for QuickBooks](https://www.devart.com/dotconnect/quickbooks/) | ForQuickBooksToTable() | ForQuickBooksHasColumnName() | ForQuickBooksHasColumnType() |
| [dotConnect for FreshBooks](https://www.devart.com/dotconnect/freshbooks/) | ForFreshBooksToTable() | ForFreshBooksHasColumnName() | ForFreshBooksHasColumnType() |
| [dotConnect for MailChimp](https://www.devart.com/dotconnect/mailchimp/) | ForMailChimpToTable() | ForMailChimpHasColumnName() | ForMailChimpHasColumnType() |
| [dotConnect for ExactTarget](https://www.devart.com/dotconnect/exacttarget/) | ForExactTargetToTable() | ForExactTargetHasColumnName() | ForExactTargetHasColumnType() |
| [dotConnect for Bigcommerce](https://www.devart.com/dotconnect/bigcommerce/) | ForBigcommerceToTable() | ForBigcommerceHasColumnName() | ForBigcommerceHasColumnType() |
| [dotConnect for Magento](https://www.devart.com/dotconnect/magento/) | ForMagentoToTable() | ForMagentoHasColumnName() | ForMagentoHasColumnType() |

There are also similar extension methods for other provider-specific mapping settings:

- For{DataSourceName}HasName() specifies the name of an index or primary key
- For{DataSourceName}HasConstraintName() specifies a foreign key name
- For{DataSourceName}HasDefaultValueSql() specifies the default column value

For example, for dotConnect for Oracle, these methods are ForOracleHasName(), ForOracleHasConstraintName(), and ForOracleHasDefaultValueSql(); the other providers have correspondingly named methods.

The following example demonstrates mapping an Account class to various tables depending on the data source:

```csharp
protected override void OnModelCreating(ModelBuilder modelBuilder) {

    modelBuilder.Entity<Account>()
        .ForSqlServerToTable("Account")
        .ForOracleToTable("ACCOUNTS")
        .ForMySqlToTable("active_accounts")
        .ForPostgreSqlToTable("account")
        .ForDB2ToTable("ACCOUNT")
        .ForSQLiteToTable("Account");
}
```

Current Limitations

When applying this solution to production, you should consider that our Entity Framework Core support is still at the beta stage, and Entity Framework Core itself has not yet reached the release stage either.
Our support is implemented only for the full .NET Framework platform, version 4.5.1 and higher. We are also considering an implementation for .NET Core and the Universal Windows Platform, but no release dates are scheduled. Additionally, our providers support only a part of the provider configuration settings. For example, for the config.DatabaseScript.Schema.DeleteDatabaseBehaviour setting, only the DeleteDatabaseBehaviour.ModelObjectsOnly value (the default) is supported. Some of the features present in our EF6 providers are also not supported in the EF Core providers, because Entity Framework Core itself does not support many Entity Framework 6 features; examples include spatial data and stored procedure support. You can find more information about unsupported features in our other blog article, "Migration of Entity Framework 6 projects to Entity Framework Core 1 (Entity Framework 7)".

Behavior Changes

Some provider configuration settings changed their default values. For example, config.CodeFirstOptions.TruncateLongDefaultNames is now true by default, though it was false before. This change was made in order to eliminate or reduce the need to configure the provider in simple cases. Entity Framework often generates names that are too long for foreign keys, and users often had to set this option to true in order to perform dynamic database object generation, especially for Oracle, which has a 30-character limit on names. MySQL and PostgreSQL allow longer names, but users sometimes encounter this situation with them too.

There are also some other behavior changes in comparison to Entity Framework 6; they arise because we are implementing as many Entity Framework 6 features as we can on top of the completely new EF Core engine. Some aspects are not yet implemented, but we are working hard to make the behavior as similar as possible.
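For reference, the previous behavior can be restored by setting the option back explicitly. A sketch, assuming the EF Core provider exposes the same configuration entry point as the EF6-era provider (for dotConnect for Oracle, OracleEntityProviderConfig.Instance); verify this against the provider documentation:

```csharp
// Sketch: reverting TruncateLongDefaultNames to its earlier default.
// OracleEntityProviderConfig.Instance is the configuration entry point of the
// EF6-era dotConnect for Oracle provider; its availability in the EF7 assembly
// is an assumption to verify.
var config = Devart.Data.Oracle.Entity.Configuration.OracleEntityProviderConfig.Instance;
config.CodeFirstOptions.TruncateLongDefaultNames = false;
```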
Database-First and Model-First in Entity Framework Core

The Model-First approach is not supported in Entity Framework Core, and there is no visual designer for Entity Framework Core models. Database-First is supported via the Package Manager Console; it is console-mode and very limited in features, and we have not yet implemented support for this functionality. We plan to release a new version of our visual ORM model designer for Entity Framework, Devart [Entity Developer](https://www.devart.com/entitydeveloper/). The new version will provide a full implementation of the Database-First and Model-First approaches for Entity Framework Core. Devart Entity Developer has a wide feature set and advanced "Update From Database" support, generates the code of entities, the DbContext, and their fluent mapping, and implements lots of other features.

Creating Simple Code-First Entity Framework Core Application

Let's create a simple Code-First application that will create tables in the database based on the model at run time, fill them with sample data, and execute queries. Note that dynamic database object creation (tables/FK/PK/indexes/triggers/sequences) based on an Entity Framework model is implemented in Devart Entity Framework providers only for relational databases: Oracle, MySQL, PostgreSQL, SQLite, and DB2.

Prerequisites:
- Visual Studio 2013 or Visual Studio 2015 installed.
- At least one of the Devart ADO.NET providers that support EF Core (EF7) and dynamic database object creation installed.

To create the sample application, let's perform the following steps.

Create a new console application. Ensure you are targeting .NET Framework 4.5.1 or later.
Install the Entity Framework Core NuGet package by executing the following command in the Package Manager Console:

```
Install-Package EntityFramework.Relational –Pre
```

Add our provider-specific assemblies to the project references:

- dotConnect common assembly: Devart.Data.dll
- Provider-specific assembly Devart.Data.{DataSourceName}.dll: for Oracle it would be Devart.Data.Oracle.dll, for MySQL Devart.Data.MySql.dll, and so on.
- EF7 provider assembly Devart.Data.{DataSourceName}.Entity.EF7.dll: for Oracle it would be Devart.Data.Oracle.Entity.EF7.dll, for MySQL Devart.Data.MySql.Entity.EF7.dll, and so on.

Note that when using one of our providers for cloud applications, you would also need to add a reference to Devart.Data.SqlShim.dll, the common assembly of Devart ADO.NET providers for cloud applications. However, this example uses only dotConnect providers for relational databases.

Create a DbContext descendant:

```csharp
using Microsoft.Data.Entity;
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

public class MyDbContext : DbContext {

}
```

Register the Entity Framework Core provider for use with our DbContext and specify the connection string. For this, override the OnConfiguring method. Example for Devart dotConnect for Oracle:

```csharp
public class MyDbContext : DbContext {

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder) {

        optionsBuilder.UseOracle(@"user id=user; password=password; server=ORCL1210;");
    }
}
```

Create the entity classes used in the model. If necessary, set DataAnnotation attributes for the classes and properties.
```csharp
[Table("Product")]
public class Product {

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long ProductID { get; set; }

    [Required]
    [MaxLength(50)]
    public string ProductName { get; set; }

    public string UnitName { get; set; }
    public int UnitScale { get; set; }
    public long InStock { get; set; }
    public double Price { get; set; }
    public double DiscontinuedPrice { get; set; }

    public virtual ProductCategory Category { get; set; }
    public virtual ICollection<OrderDetail> OrderDetails { get; set; }
}

[Table("ProductCategory")]
public class ProductCategory {

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long CategoryID { get; set; }

    [Required]
    [MaxLength(20)]
    public string CategoryName { get; set; }

    public virtual ProductCategory ParentCategory { get; set; }
    public virtual ICollection<ProductCategory> ChildCategories { get; set; }
    public virtual ICollection<Product> Products { get; set; }
}

[Table("Order Details")]
public class OrderDetail {

    [Column(Order = 1)]
    public long OrderID { get; set; }

    [Column(Order = 2)]
    public long ProductID { get; set; }

    public double Price { get; set; }
    public double Quantity { get; set; }

    public virtual Product Product { get; set; }
    public virtual Order Order { get; set; }
}

[Table("Orders")]
public class Order {

    public Order() {

        OrderDate = DateTime.Now;
    }

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long OrderID { get; set; }

    [Required]
    public DateTime OrderDate { get; set; }

    public double Freight { get; set; }
    public DateTime? ShipDate { get; set; }
    public double Discount { get; set; }

    public virtual ICollection<OrderDetail> OrderDetails { get; set; }

    [InverseProperty("Orders")]
    public virtual PersonContact PersonContact { get; set; }

    public virtual Company Company { get; set; }

    public virtual Company ShipCompany { get; set; }
}

[Table("Company")]
public class Company {

    public Company() {
    }

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long CompanyID { get; set; }

    [Required]
    [MaxLength(40)]
    public string CompanyName { get; set; }

    [MaxLength(100)]
    public string Web { get; set; }

    [MaxLength(50)]
    public string Email { get; set; }

    public virtual AddressType Address { get; set; }

    [InverseProperty(nameof(Order.Company))]
    public virtual ICollection<Order> Orders { get; set; }

    [InverseProperty(nameof(Order.ShipCompany))]
    public virtual ICollection<Order> ShippedOrders { get; set; }

    [InverseProperty(nameof(PersonContact.Companies))]
    public virtual PersonContact PrimaryContact { get; set; }

    [InverseProperty(nameof(PersonContact.Company))]
    public virtual ICollection<PersonContact> Contacts { get; set; }
}

[Table("Person Contact")]
public class PersonContact {

    public PersonContact() {
    }

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long ContactID { get; set; }

    [MaxLength(8)]
    public string Title { get; set; }

    [MaxLength(50)]
    public string FirstName { get; set; }

    [MaxLength(50)]
    public string MiddleName { get; set; }

    [MaxLength(50)]
    public string LastName { get; set; }

    [MaxLength(25)]
    public string HomePhone { get; set; }

    [MaxLength(25)]
    public string MobilePhone { get; set; }

    public virtual AddressType Address { get; set; }

    public virtual ICollection<Order> Orders { get; set; }
    public virtual Company Company { get; set; }
    public virtual ICollection<Company> Companies { get; set; }
}

[Table("AddressType")]
public class AddressType {

    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long Id { get; set; }

    [MaxLength(120)]
    public string AddressTitle { get; set; }

    [MaxLength(60)]
    public string Address { get; set; }

    [MaxLength(30)]
    public string City { get; set; }

    [MaxLength(20)]
    public string Region { get; set; }

    [MaxLength(15)]
    public string PostalCode { get; set; }

    [MaxLength(20)]
    public string Country { get; set; }

    [MaxLength(25)]
    public string Phone { get; set; }

    [MaxLength(25)]
    public string Fax { get; set; }
}
```

Add these classes to the DbContext descendant as DbSet properties. If necessary, you can also write fluent mapping by overriding the OnModelCreating method.

```csharp
public class MyDbContext : DbContext {

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder) {

        optionsBuilder.UseOracle(@"user id=user; password=password; server=ORCL1210;");
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder) {

        modelBuilder.Entity<OrderDetail>()
            .HasKey(p => new { p.OrderID, p.ProductID });
    }

    public DbSet<Product> Products { get; set; }
    public DbSet<ProductCategory> ProductCategories { get; set; }
    public DbSet<OrderDetail> OrderDetails { get; set; }
    public DbSet<Order> Orders { get; set; }
    public DbSet<Company> Companies { get; set; }
    public DbSet<PersonContact> PersonContacts { get; set; }
}
```

Now let's choose how to create the database. We can generate Code-First Migrations, or, for test purposes, we can implement an analogue of the Entity Framework 6 initialization strategy [DropCreateDatabaseAlways](https://msdn.microsoft.com/en-us/library/gg679506%28v=vs.113%29.aspx).
```csharp
public static class MyDbContextSeeder {

    public static void Seed(MyDbContext context) {

        context.Database.EnsureDeleted();
        context.Database.EnsureCreated();

        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "prose"
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "novel"
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "poetry",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "novel")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "detective story"
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "fantasy",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "novel")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "pop art",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "fantasy")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "textbook"
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "research book",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "textbook")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "poem",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "novel")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "collection",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "textbook")
        });
        context.ProductCategories.Add(new ProductCategory() {
            CategoryName = "dictionary",
            ParentCategory = context.ProductCategories.Local().Single(p => p.CategoryName == "collection")
        });

        context.Products.Add(new Product() {
            ProductName = "Shakespeare W. Shakespeare's dramatische Werke",
            Price = 78,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "prose")
        });
        context.Products.Add(new Product() {
            ProductName = "King Stephen. 'Salem's Lot",
            Price = 67,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "poetry")
        });
        context.Products.Add(new Product() {
            ProductName = "Plutarchus. Plutarch's moralia",
            Price = 89,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "prose")
        });
        context.Products.Add(new Product() {
            ProductName = "Twain Mark. Ventures of Huckleberry Finn",
            Price = 34,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "novel")
        });
        context.Products.Add(new Product() {
            ProductName = "Harrison G. B. England in Shakespeare's day",
            Price = 540,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "novel")
        });
        context.Products.Add(new Product() {
            ProductName = "Corkett Anne. The salamander's laughter",
            Price = 5,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "poem")
        });
        context.Products.Add(new Product() {
            ProductName = "Lightman Alan. Einstein's dreams",
            Price = 5,
            Category = context.ProductCategories.Local().Single(p => p.CategoryName == "poem")
        });

        context.Companies.Add(new Company() {
            CompanyName = "Borland UK CodeGear Division",
            Web = "support.codegear.com/"
        });
        context.Companies.Add(new Company() {
            CompanyName = "Alfa-Bank",
            Web = "www.alfabank.com"
        });
        context.Companies.Add(new Company() {
            CompanyName = "Pioneer Pole Buildings, Inc.",
            Web = "www.pioneerpolebuildings.com"
        });
        context.Companies.Add(new Company() {
            CompanyName = "Orion Telecoms (Pty) Ltd.",
            Web = "www.oriontele.com"
        });
        context.Companies.Add(new Company() {
            CompanyName = "Orderbase Consulting GmbH",
            Web = "orderbase.de"
        });

        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2007, 4, 11),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Borland UK CodeGear Division")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2006, 3, 11),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Borland UK CodeGear Division")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2006, 8, 6),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Alfa-Bank")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2004, 7, 6),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Alfa-Bank")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2006, 8, 8),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Alfa-Bank")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2003, 3, 1),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Pioneer Pole Buildings, Inc.")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2005, 8, 6),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Orion Telecoms (Pty) Ltd.")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2006, 8, 1),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Orion Telecoms (Pty) Ltd.")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2007, 7, 1),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Orion Telecoms (Pty) Ltd.")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2007, 2, 6),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Orderbase Consulting GmbH")
        });
        context.Orders.Add(new Order() {
            OrderDate = new DateTime(2007, 8, 1),
            Company = context.Companies.Local().Single(c => c.CompanyName == "Orderbase Consulting GmbH")
        });

        context.SaveChanges();
    }
}

namespace Microsoft.Data.Entity {

    public static class DbSetExtensions {

        public static IEnumerable<T> Local<T>(this DbSet<T> set)
            where T : class {

            var infrastructure = (Microsoft.Data.Entity.Infrastructure.IInfrastructure<IServiceProvider>)set;

            var context = (DbContext)infrastructure.Instance.GetService(typeof(DbContext));

            return context.ChangeTracker.Entries<T>()
                .Where(e => e.State == EntityState.Added || e.State == EntityState.Unchanged || e.State == EntityState.Modified)
                .Select(e => e.Entity);
        }
    }
}
```

Now let's add code that creates the context, re-creates the database, fills it with the test data, and executes LINQ to Entities queries.
```csharp
class Program {

    static void Main(string[] args) {

        var context = new MyDbContext();

        Console.WriteLine("Entity Framework Core (EF7) Code-First sample");
        Console.WriteLine();

        MyDbContextSeeder.Seed(context);

        Console.WriteLine("Products with categories");
        Console.WriteLine();

        var query = context.Products.Include(p => p.Category)
            .Where(p => p.Price > 20.0)
            .ToList();

        Console.WriteLine("{0,-10} | {1,-50} | {2}", "ProductID", "ProductName", "CategoryName");
        Console.WriteLine();
        foreach (var product in query)
            Console.WriteLine("{0,-10} | {1,-50} | {2}", product.ProductID, product.ProductName, product.Category.CategoryName);

        Console.ReadKey();
    }
}
```

Now we can run the application. It will create tables in the database, fill them with data, execute a query, and output its results to the console:

```
Entity Framework Core (EF7) Code-First sample

Products with categories

ProductID  | ProductName                                        | CategoryName

1          | Harrison G. B. England in Shakespeare's day        | novel
2          | Twain Mark. Ventures of Huckleberry Finn           | novel
3          | Plutarchus. Plutarch's moralia                     | prose
4          | Shakespeare W. Shakespeare's dramatische Werke     | prose
5          | King Stephen. 'Salem's Lot                         | poetry
```

The attached archive contains a more complete version of the application, with one Entity Framework model and several implementations for SQL Server, Oracle, MySQL, PostgreSQL, SQLite, and DB2. The project also enables monitoring of the database calls via the free [Devart DbMonitor](https://www.devart.com/dbmonitor/) tool. Open the solution in Visual Studio, select a project for the installed provider, and set it as the Startup Project. Modify the connection string in the project's app.config file, and after this, you can start it. You can also start Devart DbMonitor, and it will display a log of all the database interactions: connection opening/closing, transactions, DDL and DML statements, etc.
Conclusion

We are glad to provide the new Entity Framework provider functionality, support for Entity Framework Core (Entity Framework 7), to our users. We hope that application development will be smooth and trouble-free for users who decide to try it. In any case, we are always ready to improve our Entity Framework providers and to help our users if they run into any trouble that can be solved within the scope of Entity Framework Core functionality. As for our plans for the future of the Entity Framework Core providers, they will be determined and adjusted according to your feedback, which you can send via the feedback pages of the corresponding providers, our [forum](http://forums.devart.com/viewforum.php?f=30), and [UserVoice](http://devart.uservoice.com/forums/105163-ado-net-entity-framework-support).

Tags: [dotconnect](https://blog.devart.com/tag/dotconnect) [entity framework](https://blog.devart.com/tag/entity-framework)

[dotConnect Team](https://blog.devart.com/author/dotconnect), [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/). The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
6 COMMENTS

Tathagata, April 26, 2017, at 6:44 am

What if I want to return raw SQL or just call a procedure using the new FromSql() of EF Core without any migration? I found that for some unknown reason that is not working.
Michael, July 5, 2017, at 9:03 pm

The third paragraph from the bottom mentions an "attached archive" which contains a "complete application." I can't find the link on this page. Thanks.

Byung-Huyn Nah, January 15, 2018, at 10:21 am

Is it supported in ASP.NET Core 2.0?

Gustavo Cruz, February 19, 2018, at 10:00 pm

Any plans to release support for EF Core 2?

Shalex, December 25, 2018, at 3:17 pm

Current builds of dotConnect providers support .NET Core 2.1 / EF Core 2.1.

Comments are closed.

Entity Framework Core 2.1.1 Support in Entity Developer and dotConnect Providers!

By [dotConnect Team](https://blog.devart.com/author/dotconnect), June 22, 2018

Devart is glad to announce the release of the new versions of dotConnect [ADO.NET data providers](https://www.devart.com/dotconnect/) and [Entity Developer](https://www.devart.com/entitydeveloper/), our visual designer for ORM models. The new versions of Devart products offer support for Entity Framework Core 2.1.1. Additionally, [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/) now supports connections via the SSH protocol.
Entity Framework Core 2.1.1 Support

New versions of dotConnect providers now support the most recent Entity Framework Core version – 2.1.1. You can use new Entity Framework Core features such as lazy loading, query types, parameterized constructors, System.Transactions support, value conversions, and more. Entity Developer allows you to design models for Entity Framework Core 2.1.1 and supports all the new features. You can define classes as query types, add parameters for class properties to a parameterized constructor, and visually configure lazy loading for the entire model or for separate navigation properties.

SSH Support in dotConnect for PostgreSQL

The new dotConnect for PostgreSQL version allows you to create connections via the secure SSH protocol. If you need another feature in our providers, LinqConnect, or Entity Developer, visit our forum at [UserVoice](https://devart.uservoice.com/) and vote for suggested features or suggest your own.

List of Products with Entity Framework Core 2.1.1 Support [Entity Developer 6.3](https://www.devart.com/entitydeveloper/) [ [Download](https://www.devart.com/entitydeveloper/download.html) ] [ [New Features](https://www.devart.com/entitydeveloper/history.html) ] ADO.NET Providers for Databases [dotConnect for Oracle 9.6](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ] [dotConnect for MySQL 8.11](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ] [dotConnect for PostgreSQL 7.11](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ] [dotConnect for SQLite 5.11](https://www.devart.com/dotconnect/sqlite/) [
[Download](https://www.devart.com/dotconnect/sqlite/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ] [dotConnect for DB2 2.3](https://www.devart.com/dotconnect/db2/) [ [Download](https://www.devart.com/dotconnect/db2/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/db2/history.html) ] ADO.NET Providers for Cloud Applications [dotConnect for Salesforce 3.3](https://www.devart.com/dotconnect/salesforce/) [ [Download](https://www.devart.com/dotconnect/salesforce/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/salesforce/history.html) ] [dotConnect for Dynamics CRM 1.7](https://www.devart.com/dotconnect/dynamicscrm/) [ [Download](https://www.devart.com/dotconnect/dynamicscrm/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/dynamicscrm/history.html) ] [dotConnect for SugarCRM 1.7](https://www.devart.com/dotconnect/sugarcrm/) [ [Download](https://www.devart.com/dotconnect/sugarcrm/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/sugarcrm/history.html) ] [dotConnect for Zoho CRM 1.7](https://www.devart.com/dotconnect/zohocrm/) [ [Download](https://www.devart.com/dotconnect/zohocrm/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/zohocrm/history.html) ] [dotConnect for ExactTarget 1.7](https://www.devart.com/dotconnect/exacttarget/) [ [Download](https://www.devart.com/dotconnect/exacttarget/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/exacttarget/history.html) ] [dotConnect for MailChimp 1.7](https://www.devart.com/dotconnect/mailchimp/) [ [Download](https://www.devart.com/dotconnect/mailchimp/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/mailchimp/history.html) ] [dotConnect for Magento 1.7](https://www.devart.com/dotconnect/magento/) [ [Download](https://www.devart.com/dotconnect/magento/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/magento/history.html) ] 
[dotConnect for QuickBooks 1.7](https://www.devart.com/dotconnect/quickbooks/) [ [Download](https://www.devart.com/dotconnect/quickbooks/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/quickbooks/history.html) ] [dotConnect for Bigcommerce 1.7](https://www.devart.com/dotconnect/bigcommerce/) [ [Download](https://www.devart.com/dotconnect/bigcommerce/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/bigcommerce/history.html) ] [dotConnect for FreshBooks 1.7](https://www.devart.com/dotconnect/freshbooks/) [ [Download](https://www.devart.com/dotconnect/freshbooks/download.html) ]  [ [New Features](https://www.devart.com/dotconnect/freshbooks/history.html) ] Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [entity developer](https://blog.devart.com/tag/entity-developer) [entity framework](https://blog.devart.com/tag/entity-framework) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [what's new entity developer](https://blog.devart.com/tag/whats-new-entity-developer) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. 
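The value-conversion feature mentioned in the announcement above can be sketched with the standard Entity Framework Core 2.1 API; nothing here is dotConnect-specific, and the `Order` entity and `OrderStatus` enum are illustrative names, not from the article.

```csharp
using System;
using Microsoft.EntityFrameworkCore;

// Illustrative model – names are assumptions for the example.
public enum OrderStatus { Pending, Shipped, Cancelled }

public class Order
{
    public int Id { get; set; }
    public OrderStatus Status { get; set; }
}

public class OrderContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // EF Core 2.1 value conversion: persist the enum as a string column
        // instead of its underlying integer value.
        modelBuilder.Entity<Order>()
            .Property(o => o.Status)
            .HasConversion(
                v => v.ToString(),
                v => (OrderStatus)Enum.Parse(typeof(OrderStatus), v));
    }
}
```

The same mapping can also be expressed more tersely as `.HasConversion<string>()`, which uses the built-in enum-to-string converter.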
---

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Core 3.0 Support in Entity Developer and dotConnect Providers By [dotConnect Team](https://blog.devart.com/author/dotconnect) October 24, 2019

Devart is glad to announce the release of new versions of our ADO.NET providers – dotConnect for [Oracle](https://www.devart.com/dotconnect/oracle/), [MySQL](https://www.devart.com/dotconnect/mysql/), [PostgreSQL](https://www.devart.com/dotconnect/postgresql/), and [SQLite](https://www.devart.com/dotconnect/sqlite/) – and of our visual ORM designer, [Entity Developer](https://www.devart.com/entitydeveloper/). The new versions feature support for Entity Framework Core 3.0. Besides, dotConnect for PostgreSQL now supports PostgreSQL 12.

Entity Framework Core 3.0 Support

Now you can use our products in projects powered by the latest Entity Framework Core version and enjoy the improved performance, better SQL generation, and other Entity Framework Core 3.0 features.
Build your Entity Framework Core models from databases using the Scaffold-DbContext command with our providers, write the mapping code yourself, or use Entity Developer to design models visually – every approach is supported. In Entity Developer, you just need to make sure that the corresponding target Entity Framework Core version is selected in Model Properties when creating an Entity Framework Core model, and Entity Developer will download all the necessary NuGet packages automatically. It will generate code for Entity Framework Core 3 – with keyless entity types instead of query types, and so on. Note that version 3 can only be selected if the target framework of your project supports Entity Framework Core 3 – that is, only for projects targeting .NET Core 3 or .NET Standard 2.1. In such projects it is selected by default. Other target frameworks (such as the full .NET Framework) don’t support Entity Framework Core 3. No new NuGet packages are added to dotConnect providers for Entity Framework Core 3; you need to install the same NuGet packages as you did for Entity Framework Core 2.

Entity Framework 6.3 Support Improvements

Our [previous release](http://entity-framework-6-3-and-net-core-3-support.html) was focused on supporting .NET Core 3.0 Preview and Entity Framework 6.3 Preview on .NET Core. This release improves that support and adds two new properties to the DbContext template, providing better configurability for Entity Framework 6 code generation. The Configuration Type Name property specifies the descendant of the DbConfiguration class that should be used for code-based configuration, and the Use DbConfigurationType Attribute property determines whether to register this configuration type via the DbConfigurationType attribute.
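The Scaffold-DbContext workflow mentioned above can be sketched from the Package Manager Console; the connection string and the provider name `Devart.Data.Oracle.Entity.EFCore` are assumptions for illustration – check the dotConnect documentation of your provider for the exact provider argument.

```shell
# Package Manager Console (Visual Studio): reverse-engineer an EF Core
# model from an existing database. The connection string and provider
# name below are illustrative, not taken from the article.
Scaffold-DbContext "User Id=scott;Password=tiger;Server=orcl;" Devart.Data.Oracle.Entity.EFCore -OutputDir Models
```

The generated entity classes and DbContext land in the `Models` folder of the project; the same scaffolding is also available via `dotnet ef dbcontext scaffold` from the .NET CLI.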
PostgreSQL 12 Support

Additionally, we have added PostgreSQL 12 support to dotConnect for PostgreSQL and Entity Developer.

List of Products with Entity Framework Core support [Entity Developer 6.6](https://www.devart.com/entitydeveloper/) [ [Download](https://www.devart.com/entitydeveloper/download.html) ] [ [New Features](https://www.devart.com/entitydeveloper/history.html) ] ADO.NET Providers for Databases [dotConnect for Oracle 9.9](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ] [dotConnect for MySQL 8.15](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ] [dotConnect for PostgreSQL 7.15](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ] [dotConnect for SQLite 5.14](https://www.devart.com/dotconnect/sqlite/) [ [Download](https://www.devart.com/dotconnect/sqlite/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ] Tags [entity developer](https://blog.devart.com/tag/entity-developer) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect)
---

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Core 3.1 Support in dotConnect Providers By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 23, 2019

Devart is glad to announce the release of new versions of dotConnect [ADO.NET data providers](https://www.devart.com/dotconnect/). The new versions offer support for Entity Framework Core 3.1. When Microsoft released Entity Framework Core 3 only for .NET Core 3 (.NET Standard 2.1), it drew criticism for not supporting the full .NET Framework. Microsoft responded by making Entity Framework Core 3.1 compatible with .NET Standard 2.0, thus allowing developers to use it in projects targeting the full .NET Framework as well as .NET Core. So we, too, can finally support Entity Framework Core 3 on the full .NET Framework, and we have added this support to all of our ADO.NET providers for cloud apps and for major databases. All dotConnect providers with ORM support now include a new Devart.Data.***.Entity.EFCore.dll assembly for Entity Framework Core 3.1 support on the full .NET Framework. You can find it in the \Entity\EFCore3 subfolder of the provider installation folder.
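On the full .NET Framework, picking up the new assembly can be sketched as a project-file edit; the provider (Oracle) and the installation path below are assumptions for illustration – the article only states that the assembly lives in the \Entity\EFCore3 subfolder of the provider installation folder.

```xml
<!-- Illustrative .csproj fragment for a classic .NET Framework project
     referencing the EF Core 3.1 assembly of dotConnect for Oracle.
     Adjust the provider name and HintPath to your installation. -->
<ItemGroup>
  <Reference Include="Devart.Data.Oracle.Entity.EFCore">
    <HintPath>C:\Program Files (x86)\Devart\dotConnect\Oracle\Entity\EFCore3\Devart.Data.Oracle.Entity.EFCore.dll</HintPath>
  </Reference>
</ItemGroup>
```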
Our products also keep supporting Entity Framework 6, and we have updated them to support Entity Framework 6.4. [Entity Developer](https://www.devart.com/entitydeveloper/) , our visual ORM model designer and code generation tool, also supports Entity Framework 6.4 and Entity Framework Core 3.1. List of Providers with Entity Framework Core 3.1 Support ADO.NET Providers for Databases [dotConnect for Oracle 9.10](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ] [dotConnect for MySQL 8.16](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ] [dotConnect for PostgreSQL 7.16](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ] [dotConnect for SQLite 5.15](https://www.devart.com/dotconnect/sqlite/) [ [Download](https://www.devart.com/dotconnect/sqlite/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ] [dotConnect for DB2 2.5](https://www.devart.com/dotconnect/db2/) [ [Download](https://www.devart.com/dotconnect/db2/download.html) ] [ [New Features](https://www.devart.com/dotconnect/db2/history.html) ] ADO.NET Providers for Cloud Applications [dotConnect for Salesforce 3.5](https://www.devart.com/dotconnect/salesforce/) [ [Download](https://www.devart.com/dotconnect/salesforce/download.html) ] [ [New Features](https://www.devart.com/dotconnect/salesforce/history.html) ] [dotConnect for Dynamics CRM 1.9](https://www.devart.com/dotconnect/dynamicscrm/) [ [Download](https://www.devart.com/dotconnect/dynamicscrm/download.html) ] [ [New Features](https://www.devart.com/dotconnect/dynamicscrm/history.html) ] [dotConnect for 
SugarCRM 1.9](https://www.devart.com/dotconnect/sugarcrm/) [ [Download](https://www.devart.com/dotconnect/sugarcrm/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sugarcrm/history.html) ] [dotConnect for Zoho CRM 1.10](https://www.devart.com/dotconnect/zohocrm/) [ [Download](https://www.devart.com/dotconnect/zohocrm/download.html) ] [ [New Features](https://www.devart.com/dotconnect/zohocrm/history.html) ] [dotConnect for Salesforce Marketing Cloud 1.9](https://www.devart.com/dotconnect/exacttarget/) [ [Download](https://www.devart.com/dotconnect/exacttarget/download.html) ] [ [New Features](https://www.devart.com/dotconnect/exacttarget/history.html) ] [dotConnect for MailChimp 1.9](https://www.devart.com/dotconnect/mailchimp/) [ [Download](https://www.devart.com/dotconnect/mailchimp/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mailchimp/history.html) ] [dotConnect for Magento 1.9](https://www.devart.com/dotconnect/magento/) [ [Download](https://www.devart.com/dotconnect/magento/download.html) ] [ [New Features](https://www.devart.com/dotconnect/magento/history.html) ] [dotConnect for QuickBooks 1.9](https://www.devart.com/dotconnect/quickbooks/) [ [Download](https://www.devart.com/dotconnect/quickbooks/download.html) ] [ [New Features](https://www.devart.com/dotconnect/quickbooks/history.html) ] [dotConnect for BigCommerce 1.10](https://www.devart.com/dotconnect/bigcommerce/) [ [Download](https://www.devart.com/dotconnect/bigcommerce/download.html) ] [ [New Features](https://www.devart.com/dotconnect/bigcommerce/history.html) ] [dotConnect for FreshBooks 1.10](https://www.devart.com/dotconnect/freshbooks/) [ [Download](https://www.devart.com/dotconnect/freshbooks/download.html) ] [ [New Features](https://www.devart.com/dotconnect/freshbooks/history.html) ] Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [entity developer](https://blog.devart.com/tag/entity-developer) [entity 
framework](https://blog.devart.com/tag/entity-framework) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [what's new entity developer](https://blog.devart.com/tag/whats-new-entity-developer)

---

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [ORM Solutions](https://blog.devart.com/category/products/orm-solutions) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Core 5 Support in Entity Developer and dotConnect Providers By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 3, 2020

Devart is glad to announce the release of [Entity Developer 6.10](https://www.devart.com/entitydeveloper/) and new versions of dotConnect [ADO.NET data providers](https://www.devart.com/dotconnect/) for Oracle, MySQL, PostgreSQL, and SQLite. The new versions support Entity Framework Core 5.0 and its new features. Besides, there are certain breaking changes regarding the dotConnect assemblies with Entity Framework Core functionality. These changes must be taken into account when you update projects that work with Entity Framework Core 2.

Entity Framework Core 5.0 Support

New versions of our ADO.NET providers and Entity Developer fully support Entity Framework Core 5.0 and its new, long-awaited features: table-per-type inheritance and many-to-many associations. Entity Developer can now detect many-to-many associations and table-per-type inheritance automatically when generating an Entity Framework Core 5.0 model from a database.
You can also create these inheritances and associations manually via the corresponding editors or the Model Refactoring Wizard. The standalone Entity Developer application now targets Entity Framework Core 5.0 by default when creating new Entity Framework Core models. You can select another version in the Create Model Wizard when creating the model.

Assembly Changes

New versions of dotConnect for Oracle, MySQL, PostgreSQL, and SQLite introduce significant changes to their Devart.Data.***.EFCore NuGet packages. These packages previously contained the corresponding Devart.Data.***.Entity.EFCore.dll assemblies for different Entity Framework Core versions:

- EF Core 1 assembly (.NET Standard 1.3)
- EF Core 2 assembly (.NET Standard 2.0)
- EF Core 3 assembly (.NET Standard 2.1)

The assemblies and their .NET Standard compatibility have now changed. The packages now contain assemblies for the following Entity Framework Core versions:

- EF Core 1 assembly (.NET Standard 1.3)
- EF Core 3 assembly (.NET Standard 2.0)
- EF Core 5 assembly (.NET Standard 2.1)

Note that these packages no longer contain an assembly for Entity Framework Core 2. This assembly is now available only via the installer of the corresponding dotConnect provider, which installs it into the \Entity\EFCore2\netstandard2.0\ subfolder of the provider installation folder. This means that if your project used this NuGet package with Entity Framework Core 2, the project will be automatically upgraded to Entity Framework Core 3 when you update our NuGet package. Note that there are [breaking changes](https://docs.microsoft.com/en-us/ef/core/what-is-new/ef-core-3.x/breaking-changes) between Entity Framework Core 2 and 3. And if your project used Entity Framework Core 3, it will be upgraded to Entity Framework Core 5. If you want to stay on Entity Framework Core 2, you need to do the following:

- Delete the Devart.Data.***.EFCore package from your project.
- Add the Devart.Data.*** package instead.
- Add the Microsoft.EntityFrameworkCore.Relational package of version 2.2.6 to your project.
- Add the \Entity\EFCore2\netstandard2.0\Devart.Data.***.Entity.EFCore.dll assembly from the provider installation folder to your project.

If you want to stay on Entity Framework Core 3, you need to switch the target framework of your project to one of the following:

- .NET Core 2.x
- .NET Standard 2.0
- .NET Framework 4.6.1 or higher

List of Providers with Entity Framework Core 5 Support [dotConnect for Oracle 9.14](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ] [dotConnect for MySQL 8.19](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ] [dotConnect for PostgreSQL 7.20](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ] [dotConnect for SQLite 5.17](https://www.devart.com/dotconnect/sqlite/) [ [Download](https://www.devart.com/dotconnect/sqlite/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ] [Entity Developer 6.10](https://www.devart.com/entitydeveloper/) [ [Download](https://www.devart.com/entitydeveloper/download.html) ] [ [New Features](https://www.devart.com/entitydeveloper/history.html) ] Tags [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect) [what's new entity developer](https://blog.devart.com/tag/whats-new-entity-developer)

---

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Core 6 Support in dotConnect Providers By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 21, 2021

Devart is glad to announce the release of new versions of dotConnect [ADO.NET data providers](https://www.devart.com/dotconnect/) for Oracle, MySQL, PostgreSQL, and SQLite. The new versions support Entity Framework Core 6.0 and its new features. New assembly versions for Entity Framework Core 6 are added to the Devart.Data.***.EFCore and Devart.Data.***.EFCore.NetTopologySuite NuGet packages; they are compiled for the .NET 6 target framework. We have also added support for the latest NetTopologySuite version, 2.4.0. Now you can use our providers with the most recent Entity Framework Core version and benefit from new features such as pre-convention model configuration and the new mapping attributes (UnicodeAttribute, PrecisionAttribute, EntityTypeConfigurationAttribute), and enjoy improvements to model building and the DbContext factory. Create faster applications with compiled models and other new Entity Framework Core 6 updates and fixes.
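The new Entity Framework Core 6 mapping attributes named above can be sketched on an entity class; the `Product` entity and its configuration class are illustrative names, not from the article, and nothing here is dotConnect-specific.

```csharp
using System.ComponentModel.DataAnnotations;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

// Illustrative entity demonstrating the EF Core 6 mapping attributes.
[EntityTypeConfiguration(typeof(ProductConfiguration))]
public class Product
{
    public int Id { get; set; }

    [Unicode(false)]    // map to a non-Unicode (e.g. varchar) column
    [MaxLength(64)]
    public string Sku { get; set; }

    [Precision(18, 2)]  // decimal precision and scale for the column
    public decimal Price { get; set; }
}

// EntityTypeConfigurationAttribute points EF Core at this class,
// so the fluent configuration is picked up without an explicit
// ApplyConfiguration call in OnModelCreating.
public class ProductConfiguration : IEntityTypeConfiguration<Product>
{
    public void Configure(EntityTypeBuilder<Product> builder)
        => builder.ToTable("Products");
}
```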
List of Providers with Entity Framework Core 6 Support [dotConnect for Oracle 9.15](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ] [dotConnect for MySQL 8.20](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ] [dotConnect for PostgreSQL 7.23](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ] [dotConnect for SQLite 5.19](https://www.devart.com/dotconnect/sqlite/) [ [Download](https://www.devart.com/dotconnect/sqlite/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ] Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [spatial](https://blog.devart.com/tag/spatial) [what's new dotconnect](https://blog.devart.com/tag/whats-new-dotconnect)
"} {"url": "https://blog.devart.com/entity-framework-core-compatibility-in-devart-net-connectivity-solutions.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [What’s New](https://blog.devart.com/category/whats-new) Entity Framework Core 7 Support in Devart .NET Connectivity Solutions By [Max Remskyi](https://blog.devart.com/author/max-remskyi) March 6, 2023 [0](https://blog.devart.com/entity-framework-core-compatibility-in-devart-net-connectivity-solutions.html#respond) 2482 Entity Developer now features full Entity Framework Core 7 compatibility for dotConnect for Oracle, MySQL, PostgreSQL, and SQLite, along with third-party providers for MS SQL Server (Microsoft.Data.SqlClient) and other databases. EF Core 7 support currently covers Table per Concrete type (TPC) inheritance; other EF Core 7 features will be implemented soon, prioritized by what users have requested the most. [dotConnect for Oracle](https://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](https://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/) , [dotConnect for SQLite](https://www.devart.com/dotconnect/sqlite/) : EF Core 7 is supported. Like EF Core 6, it runs on both .NET 6 and .NET 7, so it is currently impossible to include support for both EF Core versions within a single NuGet package.
From now on, NuGet packages with EF Core support come in two variants, distinguished by the presence or absence of a .7 package revision number. Packages with .7 in their revision number contain only the EF Core 7 assemblies, whereas packages without it contain the assemblies for EF Core 1, EF Core 3, EF Core 5, and EF Core 6, for projects that, for whatever reason, still run previous EF Core versions. Thus, we provide EF Core 7 support for users who prefer the most recent ORM version, while continuing to support the previous versions so that all our users have access to the newest bug fixes and functionality improvements. How it works:

Devart.Data.Oracle.EFCore for dotConnect for Oracle: 10.1.134.7 (NuGet for EF Core 7), 10.1.134 (NuGet for EF Core 1, EF Core 3, EF Core 5, EF Core 6)
Devart.Data.MySql.EFCore for dotConnect for MySQL: 9.1.134.7 (NuGet for EF Core 7), 9.1.134 (NuGet for EF Core 1, EF Core 3, EF Core 5, EF Core 6)
Devart.Data.PostgreSql.EFCore for dotConnect for PostgreSQL: 8.1.134.7 (NuGet for EF Core 7), 8.1.134 (NuGet for EF Core 1, EF Core 3, EF Core 5, EF Core 6)
Devart.Data.SQLite.EFCore for dotConnect for SQLite: 6.1.134.7 (NuGet for EF Core 7), 6.1.134 (NuGet for EF Core 1, EF Core 3, EF Core 5, EF Core 6)

dotConnect for SQLite : built-in encryption support has been implemented in the Professional Edition . Users of dotConnect for SQLite – Professional Edition no longer require any third-party libraries for encryption. The following encryption algorithms are supported: AES-128, AES-192, AES-256, Blowfish, CAST-128, RC4, and Triple DES. Minor changes: Visual Studio 17.5 Preview support has been added to all products; Salesforce Web Services API version 56.0 is now supported in dotConnect for Salesforce; multiple bug fixes have been implemented.
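The versioning scheme above can be illustrated with the .NET CLI: pinning a specific revision number selects the EF Core build you need (the package name and version numbers are the ones listed in this post; the project context is assumed):

```shell
# EF Core 7 project (.NET 7): the ".7" revision carries only EF Core 7 assemblies.
dotnet add package Devart.Data.Oracle.EFCore --version 10.1.134.7

# Project still on EF Core 1/3/5/6: the plain version carries those assemblies.
dotnet add package Devart.Data.Oracle.EFCore --version 10.1.134
```

Without an explicit `--version`, NuGet resolves the highest available version, so projects staying on an older EF Core should pin the revision without the `.7` suffix.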
List of Updated Products

[Entity Developer 7.1.134](https://www.devart.com/entitydeveloper/) [ [Download](https://www.devart.com/entitydeveloper/download.html) ] [ [New Features](https://www.devart.com/entitydeveloper/history.html) ]

ADO.NET Providers for Databases

[dotConnect for Oracle 10.1.134](https://www.devart.com/dotconnect/oracle/) [ [Download](https://www.devart.com/dotconnect/oracle/download.html) ] [ [New Features](https://www.devart.com/dotconnect/oracle/history.html) ]
[dotConnect for MySQL 9.1.134](https://www.devart.com/dotconnect/mysql/) [ [Download](https://www.devart.com/dotconnect/mysql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/mysql/history.html) ]
[dotConnect for PostgreSQL 8.1.134](https://www.devart.com/dotconnect/postgresql/) [ [Download](https://www.devart.com/dotconnect/postgresql/download.html) ] [ [New Features](https://www.devart.com/dotconnect/postgresql/history.html) ]
[dotConnect for SQLite 6.1.134](https://www.devart.com/dotconnect/sqlite/) [ [Download](https://www.devart.com/dotconnect/sqlite/download.html) ] [ [New Features](https://www.devart.com/dotconnect/sqlite/history.html) ]

Tags [.NET](https://blog.devart.com/tag/net) [ADO.NET](https://blog.devart.com/tag/ado-net) [dotconnect](https://blog.devart.com/tag/dotconnect) [entity framewok](https://blog.devart.com/tag/entity-framewok) [salesforce](https://blog.devart.com/tag/salesforce) [Max Remskyi](https://blog.devart.com/author/max-remskyi)
"} {"url": "https://blog.devart.com/entity-framework-core-spatials-support-in-dotconnect-ado-net-providers.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) Entity Framework Core Spatials Support in dotConnect ADO.NET Providers By [dotConnect Team](https://blog.devart.com/author/dotconnect) September 17, 2020 [0](https://blog.devart.com/entity-framework-core-spatials-support-in-dotconnect-ado-net-providers.html#respond) 2599 The new releases of dotConnect ADO.NET providers for Oracle, MySQL, PostgreSQL, and SQLite support spatial data in Entity Framework Core 3.
Entity Framework Core 3 does not limit entities to properties of primitive System types such as String, Int32, and DateTime. It allows using a wide range of classes, so you can now work with spatial data in Entity Framework Core via the NetTopologySuite GIS library.

Note for Users Who Upgrade to EF Core from EF5 or EF6

Entity Framework v5 and v6 support spatial data types. These types are represented by the two data types DbGeometry and DbGeography from System.Data.Entity.dll in .NET Framework 4.5 (Entity Framework 5) or from EntityFramework.dll (Entity Framework 6). However, these classes were not independent; they were wrappers over full-featured classes from a third-party GIS library. dotConnect providers for Oracle, MySQL, and PostgreSQL supported spatial data for Entity Framework v5 and v6 via the NetTopologySuite 1.x library. The upgrade process can be complicated, not just because you need to use the corresponding NetTopologySuite classes (Geometry, Point, LineString, Polygon, etc.), but also because of [partial incompatibility between EF Core and EF v5 and v6](https://docs.microsoft.com/en-us/ef/efcore-and-ef6/) .

Note for Users Who Upgrade dotConnect Providers for MySQL or SQLite from Earlier Versions

If you had dotConnect for MySQL v8.17 / dotConnect for SQLite v5.15 or earlier, you need to reset the type mapping rules for the provider in Entity Developer when updating to the new provider versions, in order to have the new mapping rules for spatial types added. To do it, on the Visual Studio Tools menu, point to Entity Developer , and click Options . In the Options dialog box, expand the Entity Developer -> Servers Options node and select the MySQL or SQLite options page, depending on the provider you have installed. On the selected page, click Reset .
The following type mapping rules will be added:

| Server Type | .NET Type |
| --- | --- |
| geometry | NetTopologySuite.Geometries.Geometry |
| geometrycollection | NetTopologySuite.Geometries.GeometryCollection |
| linestring | NetTopologySuite.Geometries.LineString |
| point | NetTopologySuite.Geometries.Point |
| polygon | NetTopologySuite.Geometries.Polygon |
| multilinestring | NetTopologySuite.Geometries.MultiLineString |
| multipoint | NetTopologySuite.Geometries.MultiPoint |
| multipolygon | NetTopologySuite.Geometries.MultiPolygon |

NetTopologySuite (NTS) Spatial Library

Linking NetTopologySuite to Provider

If you create an Entity Framework Core model in Entity Developer via the database-first approach, it downloads all the necessary NuGet packages and links the assemblies automatically. If you use the code-first approach and write the classes and mapping code yourself, you will need to perform additional actions.

.NET Core 3 or Higher

If you target .NET Core 3 or .NET 5, and you use dotConnect NuGet packages, all the necessary assemblies are loaded automatically. Just install the NuGet package of the corresponding provider: Devart.Data.Oracle.EFCore.NetTopologySuite, Devart.Data.MySql.EFCore.NetTopologySuite, Devart.Data.PostgreSql.EFCore.NetTopologySuite, or Devart.Data.SQLite.EFCore.NetTopologySuite.

Full .NET Framework

If you target the full .NET Framework, and you use the assemblies installed by the dotConnect installer, you need to add the Devart.Data..Entity.EFCore.NetTopologySuite.dll assembly from the Entity/EFCore3 subfolder of the provider installation folder to the project references. Additionally, you need to install version 2.1.0 of the NetTopologySuite NuGet package. dotConnect for SQLite uses the SpatiaLite SQLite extension for working with spatial data. To get SpatiaLite, you need to install the following NuGet packages: mod_spatialite 4.3.0.1 and NetTopologySuite.IO.SpatiaLite 2.0.0. Choose between [Entity Framework or ADO.NET](https://blog.devart.com/key-difference-between-ado-net-and-asp-net.html) wisely!
Check the comparison and get to know the benchmarks.

NetTopologySuite Configuration

Call the UseNetTopologySuite() method on the DbContext options builder of the corresponding provider to link NetTopologySuite to the provider and enable mapping properties to spatial data types. Here are examples for Oracle, MySQL, PostgreSQL, and SQLite: optionsBuilder.UseOracle(\n @\"Host=orcl;User Id=scott;Password=tiger;\",\n x => x.UseNetTopologySuite());\n\noptionsBuilder.UseMySql(\n @\"Host=db;Port=3306;User Id=root;Password=root;Database=test;\",\n x => x.UseNetTopologySuite());\n\noptionsBuilder.UsePostgreSql(\n @\"Host=db;Port=5432;User Id=postgres;Password=postgres;Database=test;\",\n x => x.UseNetTopologySuite());\n\noptionsBuilder.UseSQLite(@\"Data Source=database.db;FailIfMissing=False;\",\n x => x.UseNetTopologySuite());

Supported NetTopologySuite Data Types

Our Entity Framework Core providers support a number of NetTopologySuite data types. Geometry is their base type, and you can use it in your application. However, you may use specific data types for properties if the corresponding database column stores only that kind of spatial figure:

| Class | Brief Description |
| --- | --- |
| [Geometry](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.Geometry.html) | Abstract base class for all spatial data types. |
| [GeometryCollection](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.GeometryCollection.html) | A collection of geometry objects. |
| [LineString](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.LineString.html) | A sequence of two or more vertices with all points along the linearly-interpolated curves (line segments) between each pair of consecutive vertices. |
| [Point](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.Point.html) | A single point. |
| [Polygon](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.Polygon.html) | A polygon with linear edges. |
| [MultiLineString](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.MultiLineString.html) | A collection of LineStrings. |
| [MultiPoint](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.MultiPoint.html) | A collection of Points. |
| [MultiPolygon](http://nettopologysuite.github.io/NetTopologySuite/api/NetTopologySuite.Geometries.MultiPolygon.html) | A collection of Polygons. |

Mapping NetTopologySuite Types to Database Data Types

Suppose we have the following class: public class City {\n public int Id { get; set; }\n public Point Geometry { get; set; }\n [MaxLength(200)]\n public string Name { get; set; }\n }

Oracle Mapping It’s not necessary to specify the column type for spatial properties, because Oracle uses only one spatial type – SDO_GEOMETRY – but you may specify it anyway: modelBuilder.Entity<City>()\n .Property(p => p.Geometry)\n .HasColumnType(\"sdo_geometry\");

MySQL Mapping You can specify any geometry type supported by MySQL: geometry, geometrycollection, linestring, point, polygon, multilinestring, multipoint, multipolygon. modelBuilder.Entity<City>()\n .Property(p => p.Geometry)\n .HasColumnType(\"point\");

PostgreSQL Mapping You can specify the geometry or geography type: modelBuilder.Entity<City>()\n .Property(p => p.Geometry)\n .HasColumnType(\"geography\");

SQLite Mapping You can specify any supported geometry type: geometry, geometrycollection, linestring, point, polygon, multilinestring, multipoint, multipolygon.
Additionally, it is better to specify the [SRID](https://en.wikipedia.org/wiki/Spatial_reference_system#Identifier) (in this example, 4326 is the WGS 84 identifier, which is used for geographic calculations on the Earth's surface): modelBuilder.Entity<City>()\n .Property(p => p.Geometry)\n .HasColumnType(\"point\")\n .HasSrid(4326);

Conclusion

We are glad to provide these Entity Framework Core support improvements in [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) to our users. As for our future plans, further development of the spatial functionality of our Entity Framework Core providers will depend on user feedback. Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [entity framework](https://blog.devart.com/tag/entity-framework) [Entity Framework Core](https://blog.devart.com/tag/entity-framework-core) [dotConnect Team](https://blog.devart.com/author/dotconnect)
"} {"url": "https://blog.devart.com/entity-framework-sql-generation-enhancements-for-in-clause.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework: SQL Generation Enhancements for IN Clause By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 9, 2010 [0](https://blog.devart.com/entity-framework-sql-generation-enhancements-for-in-clause.html#respond) 5504

Introduction
Collection with unique elements
Two and more collections with the repeated elements
Optimization of the query without obvious collection
Oracle-specific SQL
The Afterword

Introduction

Devart was the first company to ship Entity Framework providers for Oracle, MySQL, PostgreSQL, and SQLite, and we remain the leader in supporting new versions and features of Entity Framework. Continuing to improve support for Entity Framework v4 features, as described here, we have optimized the generated SQL in our ADO.NET data providers – [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/) , [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/) , [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/) , and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/) . SQL generation was significantly improved and simplified in Entity Framework 4. Most of these improvements can be used in third-party providers, but one aspect was improved only for SqlClient: the conversion of the [IN](http://msdn.microsoft.com/en-us/library/bb386895.aspx) expression in Entity SQL and of the Contains method for collections in LINQ to Entities. New versions of our dotConnect providers implement this enhancement for both Entity Framework v4 and Entity Framework v1.
Our EF v1 users can get optimized SQL for IN Entity SQL expressions, but LINQ to Entities doesn’t support the Contains method for collections in EF v1. Let’s consider examples of the improved SQL generation by Devart dotConnect for Oracle and compare them with the code generated by SqlClient. An EF model was created for the “Categories” table: Oracle DDL script: CREATE TABLE NORTHWINDEF.\"Categories\" (\n \"CategoryID\" NUMBER(9),\n \"CategoryName\" VARCHAR2(15) NOT NULL,\n \"Description\" VARCHAR2(4000),\n \"Picture\" BLOB,\n CONSTRAINT \"PK_Categories\" PRIMARY KEY (\"CategoryID\")\n) MS SQL Server DDL script: CREATE TABLE [Categories] (\n [CategoryID] int IDENTITY (1, 1) NOT NULL,\n [CategoryName] nvarchar (15) NOT NULL,\n [Description] nvarchar(4000),\n [Picture] varbinary(8000),\n CONSTRAINT \"PK_Categories\" PRIMARY KEY ([CategoryID])\n) All examples were made in C# for EF v4 (.NET 4.0) using LINQ to Entities. All collections in all examples are arrays with predefined data, but the principle is the same for any collection that implements IEnumerable.
Collection with unique elements Let’s look at the Contains method for a single collection that consists of unique elements: Entities ctx = new Entities();\nint[] collection1 = new int[] { 1, 2, 3, 4, 5, 6, 7, 8 };\n\nvar query = from c in ctx.Categories\n where collection1.Contains(c.CategoryID)\n select c; The equivalent Entity SQL statement is: SELECT VALUE Categories \nFROM Entities.Categories\nWHERE Categories.CategoryID IN { 1, 2, 3, 4, 5, 6, 7, 8 } SQL generated by dotConnect for Oracle 5.XX: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\", \n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n (((1 = \"Extent1\".\"CategoryID\") OR (2 = \"Extent1\".\"CategoryID\")) \n OR ((3 = \"Extent1\".\"CategoryID\") OR (4 = \"Extent1\".\"CategoryID\"))) \n OR (((5 = \"Extent1\".\"CategoryID\") OR (6 = \"Extent1\".\"CategoryID\")) \n OR ((7 = \"Extent1\".\"CategoryID\") OR (8 = \"Extent1\".\"CategoryID\"))) SQL generated by dotConnect for Oracle 6.00: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\", \n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n \"Extent1\".\"CategoryID\" IN (1, 2, 3, 4, 5, 6, 7, 8 ) SQL generated by SqlClient in EF v4: SELECT\n [Extent1].[CategoryID] AS [CategoryID], [Extent1].[CategoryName] AS [CategoryName], \n [Extent1].[Description] AS [Description], [Extent1].[Picture] AS [Picture]\nFROM \n [dbo].[Categories] AS [Extent1]\nWHERE \n [Extent1].[CategoryID] IN (1, 2, 3, 4, 5, 6, 7, 8 ) As we can see, dotConnect for Oracle 6.00 generates a single IN clause instead of multiple OR clauses. The Microsoft Entity Framework provider for SQL Server produces a similar result.
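As noted in the introduction, EF v1 users get the same optimization through Entity SQL rather than LINQ, since Contains for collections is not available there. A hypothetical sketch, reusing this article's generated Entities context and Categories entity, of issuing the equivalent query via CreateQuery:

```csharp
using System;

// Assumes the article's generated "Entities" ObjectContext and "Categories" entity.
class Program
{
    static void Main()
    {
        using (Entities ctx = new Entities())
        {
            // Entity SQL IN expression; dotConnect for Oracle 6.00 translates it
            // into a single SQL IN clause, in EF v1 as well as EF v4.
            var query = ctx.CreateQuery<Categories>(
                "SELECT VALUE c FROM Entities.Categories AS c " +
                "WHERE c.CategoryID IN { 1, 2, 3, 4, 5, 6, 7, 8 }");

            foreach (Categories category in query)
                Console.WriteLine(category.CategoryName);
        }
    }
}
```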
Two and more collections with the repeated elements Let’s consider two collections containing unique elements; the union of these collections is a new collection with repeated elements. LINQ to Entities: Entities ctx = new Entities();\nint[] collection1 = new int[] { 1, 2, 3, 4, 5, 6, 7, 8 };\nint[] collection2 = new int[] { 1, 2, 7, 8, 9, 10 };\n\nvar query = from c in ctx.Categories\n where collection1.Contains(c.CategoryID) \n || collection2.Contains(c.CategoryID)\n select c; Entity SQL: SELECT VALUE Categories \nFROM Entities.Categories\nWHERE Categories.CategoryID IN { 1, 2, 3, 4, 5, 6, 7, 8 }\n OR Categories.CategoryID IN { 1, 2, 7, 8, 9, 10 } SQL generated by dotConnect for Oracle 5.XX: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n ((((1 = \"Extent1\".\"CategoryID\") OR (2 = \"Extent1\".\"CategoryID\")) \n OR ((3 = \"Extent1\".\"CategoryID\") OR (4 = \"Extent1\".\"CategoryID\"))) \n OR (((5 = \"Extent1\".\"CategoryID\") OR (6 = \"Extent1\".\"CategoryID\")) \n OR ((7 = \"Extent1\".\"CategoryID\") OR (8 = \"Extent1\".\"CategoryID\")))) \n OR ((((1 = \"Extent1\".\"CategoryID\") OR (2 = \"Extent1\".\"CategoryID\")) \n OR ((7 = \"Extent1\".\"CategoryID\") OR (8 = \"Extent1\".\"CategoryID\"))) \n OR ((9 = \"Extent1\".\"CategoryID\") OR (10 = \"Extent1\".\"CategoryID\"))) SQL generated by dotConnect for Oracle 6.00: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n \"Extent1\".\"CategoryID\" IN ( 1, 2, 3, 4, 5, 6, 7, 8, 9, 10) SQL generated by SqlClient in EF v4: SELECT\n [Extent1].[CategoryID] AS [CategoryID], [Extent1].[CategoryName] AS [CategoryName],\n
[Extent1].[Description] AS [Description], [Extent1].[Picture] AS [Picture]\nFROM \n [dbo].[Categories] AS [Extent1]\nWHERE \n [Extent1].[CategoryID] IN ( 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 7, 8, 9, 10) Even though the LINQ query contains two collections and two Contains clauses, the provider generates only one IN expression containing the united collection. As we can see, the repeated elements are removed from the IN collection in dotConnect for Oracle. We can get a similar result if we perform Contains on one collection with repeated elements: LINQ to Entities: int[] collection1 = new int[] { 1, 2, 3, 4, 5, 1, 1, 2, 3, 3, 4, 5, 5, 5, 5, 5, 5 };\nvar query = from c in ctx.Categories\n where collection1.Contains(c.CategoryID)\n select c; The equivalent Entity SQL statement is: SELECT VALUE Categories \nFROM Entities.Categories\nWHERE Categories.CategoryID \nIN { 1, 2, 3, 4, 5, 1, 1, 2, 3, 3, 4, 5, 5, 5, 5, 5, 5 } In this case EF will generate the following SQL statements. SQL generated by dotConnect for Oracle 5.XX: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n ((((1 = \"Extent1\".\"CategoryID\") OR (2 = \"Extent1\".\"CategoryID\")) \n OR ((3 = \"Extent1\".\"CategoryID\") OR (4 = \"Extent1\".\"CategoryID\"))) \n OR (((5 = \"Extent1\".\"CategoryID\") OR (1 = \"Extent1\".\"CategoryID\")) \n OR ((1 = \"Extent1\".\"CategoryID\") OR (2 = \"Extent1\".\"CategoryID\")))) \n OR ((((3 = \"Extent1\".\"CategoryID\") OR (3 = \"Extent1\".\"CategoryID\")) \n OR ((4 = \"Extent1\".\"CategoryID\") OR (5 = \"Extent1\".\"CategoryID\"))) \n OR (((5 = \"Extent1\".\"CategoryID\") OR (5 = \"Extent1\".\"CategoryID\")) \n OR (((5 = \"Extent1\".\"CategoryID\") OR (5 = \"Extent1\".\"CategoryID\")) \n OR (5 = \"Extent1\".\"CategoryID\")))) SQL generated by dotConnect for Oracle 6.00: SELECT\n
\"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \"Extent1\".\"CategoryID\" IN ( 1, 2, 3, 4, 5) SQL generated by SqlClient in EF v4: SELECT\n [Extent1].[CategoryID] AS [CategoryID], [Extent1].[CategoryName] AS [CategoryName],\n [Extent1].[Description] AS [Description], [Extent1].[Picture] AS [Picture]\nFROM \n [dbo].[Categories] AS [Extent1]\nWHERE \n [Extent1].[CategoryID] IN ( 1, 2, 3, 4, 5, 1, 1 , 2, 3, 3, 4, 5, 5, 5, 5, 5, 5) Optimization of the query without obvious collection Let’s consider an example where collection isn’t declared explicitly. LINQ to Entities: var query = from c in ctx.Categories\n where c.CategoryID == 1 || c.CategoryID == 3 \n || c.CategoryID == 5 || c.CategoryID > 200\n select c; SELECT VALUE Categories \nFROM Entities.Categories\nWHERE Categories.CategoryID = 1 OR Categories.CategoryID == 3 \nOR Categories.CategoryID == 5 OR Categories.CategoryID > 200 In this case EF will generate the following SQL. 
SQL generated by dotConnect for Oracle 5.XX: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n (((1 = \"Extent1\".\"CategoryID\") OR (3 = \"Extent1\".\"CategoryID\")) \n OR (5 = \"Extent1\".\"CategoryID\")) OR (\"Extent1\".\"CategoryID\" > 200) SQL generated by dotConnect for Oracle 6.00: SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".\"Picture\" AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n (\"Extent1\".\"CategoryID\" IN (1,3,5)) OR (\"Extent1\".\"CategoryID\" > 200) SQL generated by SqlClient in EF v4: SELECT\n [Extent1].[CategoryID] AS [CategoryID], [Extent1].[CategoryName] AS [CategoryName],\n [Extent1].[Description] AS [Description], [Extent1].[Picture] AS [Picture]\nFROM \n [dbo].[Categories] AS [Extent1]\nWHERE \n (1 = [Extent1].[CategoryID]) OR (3 = [Extent1].[CategoryID]) \n OR (5 = [Extent1].[CategoryID]) OR ([Extent1].[CategoryID] > 200) As we can see, dotConnect for Oracle 6.00 is capable of transforming a set of OR clauses into a single IN expression in some cases. Oracle-specific SQL There is a limit of 1000 elements for the IN expression in Oracle. If the element count exceeds this limit, SQL query execution fails with an ORA-01795 error. dotConnect for Oracle deals with this limitation by generating correct SQL (several IN expressions connected with OR). So you don’t need to split your collection into smaller ones – dotConnect for Oracle does it automatically.
SELECT\n \"Extent1\".\"CategoryID\" AS \"CategoryID\", \"Extent1\".\"CategoryName\" AS \"CategoryName\",\n \"Extent1\".\"Description\" AS \"Description\", \"Extent1\".Picture AS \"Picture\"\nFROM \n NORTHWINDEF.\"Categories\" \"Extent1\"\nWHERE \n \"Extent1\".\"CategoryID\" IN ( 0, 1, 2, 3, 4, 5, 6, 7, 8, ... 994, 995, 996, 997, 998, 999) \n OR \"Extent1\".\"CategoryID\" IN (1000, 1001, 1002, 1003, 1004, 1005, 1006, 1007, 1008) Afterword New versions of Devart ADO.NET data providers for Oracle, MySQL, PostgreSQL, and SQLite generate more compact and fast SQL queries. In future we will continue optimization of generated SQL for specific cases of Entity Framework usage considering the peculiarities of each DBMS. Tags [entity framework](https://blog.devart.com/tag/entity-framework) [Oracle](https://blog.devart.com/tag/oracle) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog. 
[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework Tips and Tricks, Part 2 By [dotConnect Team](https://blog.devart.com/author/dotconnect) July 30, 2010 In this article we
continue our series of publications about non-trivial situations encountered by our users. Some of these situations stem from restrictions of the platforms used, while others simply require a nonstandard approach to the problem. We consider the following questions in this article:

1. How can I execute native SQL using Entity Framework?
2. How can I use Self-Tracking Entities (STE) or POCO in the Devart Entity Model?
3. How can I use Oracle stored procedures with a boolean parameter in Entity Framework?
4. How to work with UNION in Entity Framework?

1. How can I execute native SQL using Entity Framework?

Issue: I don't seem to be able to use CreateQuery. I have the following code:

```csharp
using (Entities ent = new Entities())
{
    var results = ent.CreateQuery<Product>("SELECT * FROM \"Products\"");

    foreach (var row in results)
    {
        Console.WriteLine("id: " + row.ID.ToString());
    }
}
```

This results in the error: "The query syntax is not valid, near term '*', line 1, column 11." What should I use to execute native SQL?

Solution: The CreateQuery method takes an Entity SQL query, not a SQL query. To execute a SQL query, you can use the [ObjectContext.ExecuteStoreQuery](http://msdn.microsoft.com/en-us/library/dd487208.aspx) method. Please note that it is available in Entity Framework v4 only. There is one more way to execute native SQL in both Entity Framework v1 and Entity Framework v4: use the StoreConnection property of the [ObjectContext.Connection](http://msdn.microsoft.com/en-us/library/system.data.objects.objectcontext.connection.aspx) object. Here is an Oracle-specific sample:

```csharp
using (OracleConnection connection
    = (entities.Connection as EntityConnection).StoreConnection as OracleConnection)
{
    OracleCommand command
        = new OracleCommand("SELECT * FROM \"Products\"", connection);

    connection.Open();
    OracleDataReader reader = command.ExecuteReader();

    // Materialize your query results here
}
```

2.
How can I use Self-Tracking Entities (STE) or POCO in the Devart Entity Model?

Issue: I am using EF 4.0 on top of your dotConnect for Oracle provider. I want to use Self-Tracking Entities. Can I rename the .edml file to .edmx and use STE with this file? Can I do the same when using POCO entities?

Solution: The structure of a Devart model is similar to that of a Microsoft model, with some customizations. There should be no trouble with changing the extension (.edml -> .edmx). There is one more approach to using STE or POCO with the .edml file:

1. Open %ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\Extensions\Microsoft\Entity Framework Tools\Templates\Includes\EF.Utility.CS.ttinclude
2. Change

```csharp
if (extension.Equals(".edmx", StringComparison.InvariantCultureIgnoreCase))
```

to

```csharp
if (extension.Equals(".edmx", StringComparison.InvariantCultureIgnoreCase)
    || extension.Equals(".edml", StringComparison.InvariantCultureIgnoreCase))
```

3. Save the file.

In case you are using VB.NET, perform these operations on the %ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\Extensions\Microsoft\Entity Framework Tools\Templates\Includes\EF.Utility.VB.ttinclude file. Change the following line

```vb
If (extension.Equals(".edmx", StringComparison.InvariantCultureIgnoreCase))
```

to

```vb
If (extension.Equals(".edmx", StringComparison.InvariantCultureIgnoreCase) _
    Or extension.Equals(".edml", StringComparison.InvariantCultureIgnoreCase))
```

We plan to add support for Self-Tracking Entities and POCO in one of the future builds.

3. How can I use Oracle stored procedures with a boolean parameter in Entity Framework?

Issue: I am trying to execute a stored procedure with a PL/SQL BOOLEAN parameter in Entity Framework but always get an "ORA-06550: PLS-00306: wrong number or types of arguments in call to 'EF_BOOLEAN_SP'" error. Is it a bug, and is there any workaround for this situation?
Solution: PL/SQL BOOLEAN parameters are not supported directly in Entity Framework, but there is a workaround for this case. Here are the steps to solve the problem:

1. Add the procedure to the Devart Entity model (you can perform all these actions with an ADO.NET Entity Data Model as well, but then they should be performed in XML using the XML Editor).
2. Change the type of the PL/SQL BOOLEAN parameters to bool.
3. Edit the CommandText of the procedure: write a simple PL/SQL block calling the procedure. Don't forget to pass the parameters, and convert the necessary parameters from NUMBER to PL/SQL BOOLEAN using the SYS.DIUTIL.INT_TO_BOOL function. After the changes, the CommandText contains a PL/SQL block like the following:

```sql
BEGIN
  EF_BOOLEAN_SP(:PK, SYS.DIUTIL.INT_TO_BOOL(:FLAG));
END;
```

4. How to work with UNION in Entity Framework?

Issue: I have created the following union of two queries:

```csharp
string productName = "Spotlight on Britain's economy";
var query1 = context.Products.Where(t => t.Productname == productName);

productName = "Carroll Lewis. Alice's Adventures in Wonderland";
var query2 = context.Products.Where(t => t.Productname == productName);

var resultQuery = query1.Union(query2);
var ResultSets = resultQuery.ToList();
```

For some reason I get only one result: the book by Lewis Carroll. Is this designed behaviour or a bug?

Solution: This is designed behaviour. The reason is deferred execution. The actual materialization of query1 and query2 occurs only in the following line:

```csharp
var ResultSets = resultQuery.ToList();
```

The parameter values are passed at this moment as well (productName points to the book by Lewis Carroll), and that's why the results of query1 and query2 coincide. The solution is to use different variables for the parameters, or to materialize each query just after its parameter value is assigned.
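The same late-binding behaviour can be reproduced outside LINQ. Here is a minimal, illustrative Python sketch (the data and function names are hypothetical) showing that two deferred filters sharing one variable both see only its final value at execution time:

```python
def deferred_union_pitfall():
    rows = ["economy book", "wonderland book"]

    name = "economy book"
    query1 = lambda: [r for r in rows if r == name]  # captures the variable, not its value

    name = "wonderland book"
    query2 = lambda: [r for r in rows if r == name]

    # Execution is deferred until this point; both filters now read
    # name == "wonderland book", so both "queries" return the same row,
    # mirroring the Entity Framework example above.
    return query1() + query2()
```

Introducing a second variable for the second filter fixes the problem in the same way in both languages.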
In the first case, change your code in the following way:

```csharp
string productName = "Spotlight on Britain's economy";
var query1 = context.Products.Where(t => t.Productname == productName);

string productName2 = "Carroll Lewis. Alice's Adventures in Wonderland";
var query2 = context.Products.Where(t => t.Productname == productName2);

var resultQuery = query1.Union(query2);
var ResultSets = resultQuery.ToList();
```

In this case the UNION statement is translated as part of the LINQ to Entities query. In the second case both queries are materialized:

```csharp
string productName = "Spotlight on Britain's economy";
var query1 = context.Products.Where(t => t.Productname == productName);
var list1 = query1.ToList();

productName = "Carroll Lewis. Alice's Adventures in Wonderland";
var query2 = context.Products.Where(t => t.Productname == productName);
var list2 = query2.ToList();

var resultSet = list1.Union(list2);
```

In this case both lists are materialized before UNION is applied, so there is no need to materialize the union itself; it is performed as a LINQ to Objects operation. We hope this material is useful for you. Please provide your feedback and suggestions in the comments to the article.

Tags [entity framework](https://blog.devart.com/tag/entity-framework) [linq](https://blog.devart.com/tag/linq) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/)
[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework: Tips and Tricks, part 3 By [dotConnect Team](https://blog.devart.com/author/dotconnect) October 19, 2010 This article continues our series of publications about non-trivial Entity Framework situations encountered by our users.
We consider the following questions in this article:

1. Why does Oracle raise an exception about too many local sessions participating in a global transaction when using TransactionScope?
2. How can I optimize SQL queries using Entity Framework?
3. How can I perform cascade deleting of objects in Entity Framework?
4. How can I get the default value of a database field in my Entity Framework project?

1. Why does Oracle raise an exception about too many local sessions participating in a global transaction when using TransactionScope?

Issue: I have run the following test code:

```csharp
public void TestMaxNumberOfObjectContextsInTransactionScope()
{
    using (TransactionScope ts = new
        TransactionScope(TransactionScopeOption.Required,
        IsolationLevel.ReadCommitted))
    {
        for (int i = 0; i < 50; i++)
        {
            using (TestObjectContext context = new TestObjectContext())
            {
                TestData newTestData = TestData.CreateTestData(i);
                newTestData.Description = description;
                context.AddToTestDataEntity(newTestData);
                context.SaveChanges();
            }
        }
        ts.Complete();
    }
}
```

and it failed after 32 iterations. The error I get is "ORA-02045: too many local sessions participating in global transaction". One more strange thing is that everything works like a charm in Direct mode. Could you please explain this behaviour?

Solution: The maximum number of branches in a distributed Oracle transaction is 32. We use distributed Oracle transactions in the code that implements TransactionScope support, so as soon as the number of transactions inside the scope exceeds this number, this error appears. Direct mode works because it has no distributed transaction support (local transactions are used instead). So the workaround is to use TransactionScope with a smaller number of transaction branches.
In most scenarios it is enough to use a local transaction, as in the following example:

```csharp
context.Connection.Open();
DEPT department = context.DEPT.Where(d => d.DEPTNO == 10).First();
department.LOC = "TEST";

using (System.Data.Common.DbTransaction transaction =
    context.Connection.BeginTransaction())
{
    context.SaveChanges();
    department.DNAME = "TEST";
    context.SaveChanges();

    if (flag)
        transaction.Commit();
    else
        transaction.Rollback();
}
```

Note: Starting with the latest version of dotConnect for Oracle, we solve this problem with the help of promotable single-phase transaction support: only one active session with an internal local transaction is used while several connections are opened. You can activate this option by setting the "Transaction Scope Local=true" parameter in the connection string.

2. How can I optimize SQL queries using Entity Framework?

Issue: When I am using native SQL I can easily use hints for MySQL optimization. Can I use anything like this with Entity Framework?

Solution: You have two alternatives: the first is to create views in your database implementing the desired behaviour, and the second is to manually create DefiningQuery elements in the Storage model and entities in your conceptual model corresponding to these DefiningQuery elements. In both cases you will be able to use the MySQL hints.

3. How can I perform cascade deleting of objects in Entity Framework?

Issue: Do I really have to delete all child entities separately? Is there any possibility to use cascaded deletes?

Solution: Add the Cascade action for all associations you want to be cascaded. This action should be added in both the CSDL and SSDL parts of the .edmx file: define an OnDelete element with Action="Cascade" on the principal end of each such association in both parts.

4. How can I get the default value of a database field in my Entity Framework project?
Issue: I have a varchar2 "MyProperty" column in my Oracle database with a default value. How can I get this value, e.g., to populate a TextBox in a user input form?

Solution: You need to query the MetadataWorkspace, for example, in the following way:

```csharp
MetadataWorkspace workspace = context.MetadataWorkspace;
var item = workspace.GetItems<EntityType>(DataSpace.CSpace)
    .Where(i => i.Name == "MyEntity").Single();
object value = item.Properties
    .Where(p => p.Name == "MyProperty").Single().DefaultValue;
```

We hope this material is useful for you. Please provide your feedback and suggestions in the comments to the article.

Tags [entity framework](https://blog.devart.com/tag/entity-framework) [linq](https://blog.devart.com/tag/linq) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/)
2 COMMENTS

Sebastian, November 4, 2010 at 5:05 am: If I add the setting "TransactionScopeLocal=true" to my connection string as suggested in the note at the end of question #1 I get an exception telling me that the connection string parameter is not supported. Is this parameter really implemented in the current version?

Devart, November 25, 2010 at 10:00 am: Thank you for the comment. We have made a correction, the parameter name should be "Transaction Scope Local". This functionality is available since the latest 6.0.46 Beta build of dotConnect for Oracle.
[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework: Tips and Tricks, part 4 By [dotConnect Team](https://blog.devart.com/author/dotconnect) February 4, 2011 We continue our series of posts about Entity Framework usage. We consider the following questions in this article:

1. MS SQL Server and Oracle: how to use both in one EF project?
2. DDL is generated with wrong keywords.
3. What is the difference between .edml and .edmx models?
4. Why does setting the StoreGeneratedPattern attribute to "Identity" or "Computed" in Visual Studio 2010 not work as expected?
5. How can I avoid loading an entire child collection?

1. MS SQL Server and Oracle: how to use both in one EF project?

Issue: I am using Entity Framework with SQL Server and I need to migrate to an Oracle database. How can I do this? And is it possible to have both Oracle and SQL Server databases in production simultaneously without duplicating the project code?

Solution: To solve this situation, perform the following steps:

1. Create a copy of your .edmx file, set a valid Oracle connection for it, and run the Generate Database from Model wizard. As a result you will obtain a new valid EF Oracle model.
2. Change the Metadata Artifact Processing property to CopyToOutputDirectory for both models.
3. Change the connection strings accordingly (pointing to the same .csdl and .msl resources and different .ssdl resources).

Basically, that's all.
As for users' feedback, Paul Reynolds, one of our users, has created a series of posts based on his experience with a similar task: [Preparing for Multiple Databases](http://web.archive.org/web/20160410162242/http://blogs.planetsoftware.com.au/paul/archive/2010/09/04/ef4-part-5-preparing-for-multiple-databases.aspx) [SSDL Adjustments](http://web.archive.org/web/20160410202156/http://blogs.planetsoftware.com.au/paul/archive/2010/09/11/ef4-part-7-ssdl-adjustments.aspx) [Database Agnostic LINQ to Entities](http://web.archive.org/web/20160410145811/http://blogs.planetsoftware.com.au/paul/archive/2010/09/24/ef4-part-8-database-agnostic-linq-to-entities.aspx) If you are interested in Model First, take a look at [this post by Vagif Abilov](http://web.archive.org/web/20130321054345/http://bloggingabout.net/blogs/vagif/archive/2010/10/26/migrating-database-schema-from-microsoft-sql-server-to-oracle-using-entity-framework-and-devart-dotconnect.aspx). Don't miss our comment.

2. DDL is generated with wrong keywords.

Issue: I have tried to generate Oracle-specific DDL from a conceptual model (initially generated from SQL Server) and have set the DDL Generation template to "Devart SSDLToOracle.tt", but the script contains some types not supported by Oracle (like "varchar(max)" or "datetime"). Is it a bug?

Solution: Please comment out the context connection string in the application configuration file (App.Config/Web.Config) and run the Generate Database from Model wizard again. Provide a correct connection on the first step of the wizard, and the DDL script will be generated successfully.

3. What is the difference between .edml and .edmx models?

Issue: Are there any significant differences between Devart Entity models and Microsoft Entity models? In particular, I failed to add an Oracle stored procedure having a reference cursor as an output parameter in an .edmx model.

Solution: The structure of Microsoft and Devart entity models is almost identical.
There are some additional mapping elements that enable extra functionality, for example TPC and TPT inheritance, File per Class code generation, View Pregeneration, and stored procedure handling. As for Oracle cursors, read our blog article, Working with Oracle cursors and stored procedures in Entity Framework. There is no way to handle this situation at design time using the Entity Data Model designer.

4. Why does setting the StoreGeneratedPattern attribute to "Identity" or "Computed" in Visual Studio 2010 not work as expected?

Issue: I have a field with a default value specified in the database. I have set StoreGeneratedPattern to "Identity" for the corresponding property in the Entity Data Model designer, but it does not have any effect: Oracle gives an error stating that null is inserted into a non-nullable field. What is the reason for the problem?

Solution: Please make sure that you have changed the StoreGeneratedPattern attribute in the Store part of the model. It is a known issue with the Visual Studio Entity Data Model Designer: it does not change the SSDL StoreGeneratedPattern attribute when it is edited at design time (only the annotation:StoreGeneratedPattern conceptual attribute is modified). The solution is to open the model in any XML Editor and add StoreGeneratedPattern="Identity" to the necessary property in the SSDL part of the model.

5. How can I avoid loading an entire child collection?

Issue: I am trying to get a subset of the child entity properties in a query like the following:

```csharp
var parents = from p in context.Parents select p;
foreach (var parent in parents)
{
    var children = from c in parent.Children
                   select new {
                       c.ID,
                       c.Name
                   };
}
```

The issue is that the generated SQL I can see in OracleMonitor is too aggressive. It loads the complete child entity, and only after that does the LINQ to Objects filtering occur. Is it a bug or am I missing something?
Solution: When you call parent.Children, the EF lazy loading mechanism loads the entire collection; the select new clause is only a LINQ to Objects query and is executed in memory. As a workaround you can use, for example, the following syntax:

```csharp
var parents = from p in context.Parents select p;
foreach (var parent in parents)
{
    var children = from c in context.Children
                   where c.ParentID == parent.ParentID
                   select new {
                       c.ID,
                       c.Name
                   };
}
```

This statement will produce the query you expect. We hope this material is useful for you. Please provide your feedback and suggestions in the comments to the article.

Tags [entity framework](https://blog.devart.com/tag/entity-framework) [linq](https://blog.devart.com/tag/linq) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/)
1 COMMENT

Saman Pirooz, October 9, 2011 at 1:32 am: Tanx for the identity bit in SSDL. Really helped alot.

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Entity Framework Tips and Tricks, Part 1 By [dotConnect Team](https://blog.devart.com/author/dotconnect) July 9, 2010 This article will
open a series of publications about non-trivial situations encountered by our users. Some of these situations stem from restrictions of the platforms used, while others simply require a nonstandard approach to the problem. Our first post covers the following user questions concerning Entity Framework:

1. How to provide the current datetime for record filtering?
2. How can I use XML types from Oracle tables in the Entity Framework model?
3. Is ORA_ROWSCN useful for concurrency handling in Entity Framework?
4. What is the "outer apply" statement and why isn't it supported in Entity Framework statements?

1. How to provide the current datetime for record filtering?

Issue: I'm trying to filter results in a LINQ to Entities query and show all orders which are older than today. I have built two queries, and each of them returns a different record set. This query sometimes returns incorrect results:

```csharp
var query = from c in db.Orders where c.OrderDate <= DateTime.Now select c;
```

The corresponding SQL statement is the following:

```sql
SELECT * FROM Orders WHERE OrderDate <= CURRENT_DATE
```

This query always works fine:

```csharp
DateTime date = DateTime.Now;
var query = from c in db.Orders where c.OrderDate <= date select c;
```

The corresponding SQL statement is the following:

```sql
SELECT * FROM Orders WHERE OrderDate <= :p0
```

The :p0 parameter is bound to the current datetime value, for example '6/10/2010'. Could you please explain the reasons for this behaviour? Am I doing something the wrong way?

Response: If there is a difference between the server and local machine dates, it is better to use a local parameter in your LINQ to Entities queries. It guarantees that the values from your local machine will be used.

Note: This situation isn't a problem or a bug. If DateTime.Now is used as a LINQ to Entities query parameter, it is mapped to the function that returns the current date and time in the server format.
When you pass a local variable as a parameter of a LINQ to Entities query, the current date and time are passed in the local machine format.

2. How can I use XML types from Oracle tables in the Entity Framework model?

Issue: I'm using Entity Framework and trying to access an Oracle table which contains a field of the XMLTYPE data type. When reading data from the table I get the following exception: "ORA-03115 unsupported network datatype or representation". Can I use XMLTYPE in Entity Framework with dotConnect for Oracle?

Solution: You can use the XML type directly only in the OCI mode, because Direct mode has a set of limitations. Please see the [Using Direct mode](http://www.devart.com/dotconnect/oracle/docs/DirectMode.html) topic of our documentation for detailed information about the Direct mode limitations. As a workaround, you can use Oracle CLOBs. To do this, you will need to make the following changes in your .edmx (or .edml) file:

1. SSDL. Add a DefiningQuery for your table:

```sql
SELECT
  "Extent1".ID AS ID,
  "Extent1".XMLFIELD.GetClobVal() AS XMLFIELD
FROM SCOTT.XML_TYPE "Extent1"
```

Add the definitions of the table's properties as well.

2. CSDL part. Here it's enough to declare the class and its properties.

3. MSL part. It's enough to just specify the mapping.

These operations let you read the XMLTYPE columns in your code. To save XMLTYPE objects to the database, you'll have to write a set of stored procedures (for inserting, updating, and deleting) and map these procedures as Insert/Update/Delete actions in the Configure Behavior form.

3. Is ORA_ROWSCN useful for concurrency handling in Entity Framework?

Issue: Can I use the Oracle ORA_ROWSCN column for concurrency control purposes in Entity Framework?

Solution: In fact, the ORA_ROWSCN column is not very helpful in this situation.
This column value is refreshed only after the transaction commit that takes place at the end of SaveChanges(). If the first user performs an insert (ORA_ROWSCN will be null after this operation), then refreshes the object from the database (ObjectContext.Refresh()) and performs an update, there will be no problem. But if the first user saves changes and then updates the values he has just inserted (without refreshing them explicitly), a possibly unnecessary concurrency error arises. The same problem arises when this user performs two consecutive updates separated by a SaveChanges call: after the first update, the old ORA_ROWSCN value is still held. When a second user updates the record that was just inserted by the first user, everything goes smoothly. So, the only possible way to refresh the value of this column is to refresh the entire object (fetch it from the database), which seems to be overkill.

4. What is the "outer apply" statement and why isn't it supported in Entity Framework statements?

Issue: I am using FirstOrDefault with a condition for the first time. I have tried a few permutations, always with the same result: failure. I have the following tables:

- Jobs (list of jobs)
- Names (list of names able to do the jobs)
- JobNames (list of name(s) that are associated with a specific job)

All jobs have at least one name associated with them, so for every row in Jobs there is at least one row in JobNames. For every job there is one main name (the person responsible). JobNames has the columns Surname, Forename, and IsResponsible, and for every job there is one row in JobNames that has the IsResponsible flag set to true. dotConnect for PostgreSQL was used to build an entity model for queries. I want a list of jobs to include the surname of the person responsible for the job.
The code in the query:

Surname = Jobs.JobNames.FirstOrDefault().Names.Surname;

It works, but I cannot guarantee that the surname returned is that of the person responsible. Looking at examples of the syntax on the web, I presumed:

Surname = Jobs.JobNames.FirstOrDefault(n => n.IsResponsible == true).Names.Surname;

Sometimes it fails with the error:

base {System.Data.EntityException}
  = {"An error occurred while preparing the command definition.
     See the inner exception for details."}
InnerException = {"OUTER APPLY is not supported by PostgreSQL"}

Does this mean that the issue is with PostgreSQL and I can’t do it? Is the problem with the Devart interface, and if so, will there be a fix? Is there a workaround for the issue?

Solution: The OUTER APPLY operator allows you to invoke a table-valued function for each row returned by an outer table expression of a query. It is supported only in Microsoft SQL Server 2005 and higher, but the Entity Framework query in C# code is transformed into an expression tree by Microsoft code, and only after this transformation do we generate the provider-specific SQL query. Sometimes an OUTER APPLY statement is included in your SQL query even when you use other database servers. This is the reason for the error described in this issue. We have added a NotSupportedException for the case of an OUTER APPLY occurrence in the expression tree.

This material will be extended as new requests are resolved by our support team. We hope that this article is useful and helps you in your Entity Framework work.

Tags [entity framework](https://blog.devart.com/tag/entity-framework) [linq](https://blog.devart.com/tag/linq) [tips and tricks](https://blog.devart.com/tag/tips-and-tricks-2) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers.
They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

1 COMMENT

Monducci Marco, August 3, 2010 at 2:33 pm: Hi, I’m writing about “3. Is ORA_ROWSCN type useful for concurrency handling in Entity Framework?”. I got your answer, but now I would like to know if you have any suggestion….
Thanks in advance, Monducci Marco

Comments are closed."} {"url": "https://blog.devart.com/entitydac-for-rad-studio-11-alexandria.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) EntityDAC for RAD Studio 11 Alexandria By [DAC Team](https://blog.devart.com/author/dac) September 29, 2021 We are glad to announce that EntityDAC is now available for the new RAD Studio 11 Alexandria and includes support for the ARM architecture (Apple Silicon M1). EntityDAC is an ORM tool for Delphi and C++Builder that provides advanced object-relational mapping capabilities for database applications. The product offers a visual ORM designer and supports common approaches to database application development (database-first, model-first, and code-first), LINQ queries, caching of entities loaded from the database, metadata, and more. RAD Studio 11 Alexandria is a big update that brings a lot of exciting features, including support for Apple Silicon M1. With this release, we enable developers to build their EntityDAC-based applications in the most advanced IDE.
Please feel free to download the new version: [EntityDAC 3.0](https://www.devart.com/entitydac/) [Download](https://www.devart.com/entitydac/download.html) [Revision History](https://www.devart.com/entitydac/revision_history.html) Tags [entitydac](https://blog.devart.com/tag/entitydac) [entitydac what's new](https://blog.devart.com/tag/entitydac-whats-new) [rad studio](https://blog.devart.com/tag/rad-studio) [DAC Team](https://blog.devart.com/author/dac)"} {"url":
"https://blog.devart.com/entitydac-now-supports-android-64-bit.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) EntityDAC Now Supports Android 64-bit By [DAC Team](https://blog.devart.com/author/dac) January 21, 2020 EntityDAC 2.3 was released this week with support for the Android 64-bit platform, a fully functional trial version for macOS and Linux, and the latest fixes and improvements. The key update in this release is support for 64-bit Android app development, following the release of RAD Studio 10.3.3, which includes a new Delphi compiler for the ARM platform and allows developers to build 64-bit applications from a single Delphi codebase. In RAD Studio 10.3.3, you can build Google Play Store-ready Android 64-bit applications, complete with Android App Bundle support — the official publishing format for Android applications. Note that starting August 1, 2019, apps published on Google Play need to support Android devices with 64-bit architectures: all publishers are required to provide 64-bit versions in addition to 32-bit versions on Google Play. The trial editions of EntityDAC for macOS and Linux are now fully functional: we removed the limitation on the number of ORM metatypes that you can create with the trial version of EntityDAC on these platforms. EntityDAC is a Delphi ORM framework for fast object-relational mapping of database objects to Delphi classes using LINQ. For the full list of updates in this release, see the [revision history](https://www.devart.com/entitydac/revision_history.html) page. You are welcome to [download](https://www.devart.com/entitydac/download.html) and try out the new version of the framework for free.
If you have any thoughts on the product you would like to share with us, you are welcome to join our [forum](https://forums.devart.com/viewforum.php?f=42). Tags [delphi](https://blog.devart.com/tag/delphi) [entitydac](https://blog.devart.com/tag/entitydac) [entitydac what's new](https://blog.devart.com/tag/entitydac-whats-new) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/entitydac-supports-macos-64bit.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi
DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) EntityDAC now supports macOS 64-bit By [DAC Team](https://blog.devart.com/author/dac) July 30, 2019 Version 2.2 is a minor release of EntityDAC that includes a few improvements and new features. The most notable update in this release is support for the macOS 64-bit operating system. We have also added support for the Bytes attribute for the GUID generator. There is a good chance that the applications you have developed are still 32-bit. Now you can create 32-bit or 64-bit versions of your database applications for macOS to align the technology with your business needs. This update has arrived just in time, considering Apple’s plans to drop support for 32-bit applications in future versions of macOS. Apple confirmed this June that the new macOS Catalina, which the company plans to release in the fall, will bar users from installing 32-bit applications. EntityDAC is a Delphi ORM framework for fast object-relational mapping of database objects to Delphi classes using LINQ. Note that EntityDAC 2.2 requires installation of Embarcadero RAD Studio 10.3 Release 2. For the full list of updates in this release, see the [revision history](https://www.devart.com/entitydac/revision_history.html) page. You are welcome to [download](https://www.devart.com/entitydac/download.html) and try out the new version of the framework for free. If you have any thoughts on the product you would like to share with us, you are welcome to join our [forum](https://forums.devart.com/viewforum.php?f=42).
Tags [entitydac](https://blog.devart.com/tag/entitydac) [entitydac what's new](https://blog.devart.com/tag/entitydac-whats-new) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/entitydac-with-support-for-delphi-10-4.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [What’s New](https://blog.devart.com/category/whats-new) New in EntityDAC: Support for Delphi 10.4 By [DAC Team](https://blog.devart.com/author/dac) June 15, 2020
We are delighted to announce the release of Devart EntityDAC with support for the recently released Delphi 10.4 Sydney. Our object-relational mapping tool now supports the latest version of Delphi. You are welcome to [download](https://www.devart.com/entitydac/download.html) the new version of EntityDAC. Tags [entitydac](https://blog.devart.com/tag/entitydac) [entitydac what's new](https://blog.devart.com/tag/entitydac-whats-new) [DAC Team](https://blog.devart.com/author/dac)"} {"url": "https://blog.devart.com/enum-in-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) ENUM Data Type in MySQL: Syntax, Examples and Best Practices By [Victoria Shyrokova](https://blog.devart.com/author/victorias) March 27, 2025

ENUM in MySQL is a powerful way to handle string data without the clutter. Whether you’re tracking order statuses or user roles, ENUM lets you lock in a predefined list of string values right in your database schema. It’s a favorite among developers because it stores those options as compact integers internally, speeding up queries and ensuring your data stays consistent when used correctly. In this article, we’ll take a deep dive into what ENUM in MySQL is, exploring its syntax, practical examples, and best practices to help you make the most of it in your database designs.

Table of contents
What is ENUM in MySQL?
Why use ENUM in MySQL?
How to use ENUM in MySQL
Alternative to ENUM: VARCHAR + CHECK constraint
Best practices and correct usage of ENUM in MySQL
Common mistakes when using ENUM in MySQL
How dbForge Studio for MySQL can help with ENUM in MySQL
Conclusion

What is ENUM in MySQL? ENUM is a string-based data type that restricts a column’s values to a predefined list. Unlike VARCHAR, which allows arbitrary strings, or INT, which stores numeric values, the ENUM data type in MySQL gives you human-readable strings that MySQL then efficiently stores as integers. It enforces strict data integrity by accepting only values you define, so it’s great for representing fixed sets of options, such as status codes, categories, or boolean-like values with more descriptive labels.
For example, you could use the MySQL ENUM data type for a status column with values like “pending”, “processing”, and “completed”.

Why use ENUM in MySQL? ENUM offers several benefits:
- It’s storage-efficient, converting strings into integers using just 1-2 bytes versus VARCHAR’s variable length.
- It enhances readability for queries and reports, with descriptive options (e.g., “high,” “medium,” “low”) instead of cryptic INT codes.
- It guarantees clean and predictable data without invalid entries.
- It speeds up sorting and indexing, as integer comparisons are generally faster than string comparisons.

However, the correct usage of ENUM in MySQL hinges on stability. It’s perfect for static or rarely changing lists, such as product categories or sizes. If your value set is dynamic or you’re doing heavy string manipulation (e.g., substring searches), VARCHAR offers more flexibility. You can also use TINYINT or INT with a key for numeric mapping without ENUM’s strict rules.

How to use ENUM in MySQL Let’s illustrate how to use ENUM in MySQL through an example. When creating a table, you define an ENUM column with a list of allowed values, such as:

status ENUM('active', 'inactive', 'pending')

Internally, MySQL assigns each ENUM value an integer index, starting from 1. In this ENUM in MySQL example, that’d be 1 for “active”, 2 for “inactive”, and 3 for “pending”. This cuts down on space and speeds up performance. Then, when you query the table, MySQL translates these integers back into their corresponding string values.

Creating a table with ENUM in MySQL The syntax of the ENUM data type in MySQL looks like this:

CREATE TABLE table_name (
  column1 datatype,
  column2 ENUM('value_1', 'value_2', 'value_3', ...),
  column3 datatype
);

Now, say you’re managing a product catalog.
You’d write:

CREATE TABLE products (
  product_id INT PRIMARY KEY AUTO_INCREMENT,
  product_name VARCHAR(255),
  size ENUM('small', 'medium', 'large', 'extra-large') DEFAULT 'medium',
  color ENUM('red', 'blue', 'green', 'yellow') NOT NULL,
  availability ENUM('in stock', 'out of stock', 'backordered') DEFAULT 'in stock',
  status ENUM('pending', 'shipped', 'delivered')
);

Here, we’re defining four ENUM columns: size, color, availability, and status. We’re keeping the value lists short because tight sets are easier to manage and avoid clutter, which is key for the correct use of ENUM in MySQL. Adding constraints like NOT NULL on color ensures no blanks slip through, while DEFAULT 'medium' and DEFAULT 'in stock' set handy fallbacks — both great practices for tighter control and cleaner data.

Inserting and retrieving ENUM values To demonstrate how to use ENUM in MySQL for inserting and retrieving data, let’s use the products table from our previous example. Suppose you’re adding a few items to your catalog. Here’s how you’d insert them:

INSERT INTO products (product_name, size, color, availability)
VALUES ('t-shirt', 'large', 'blue', 'in stock'),
       ('jeans', 'medium', 'red', 'out of stock'),
       ('socks', 'small', 'green', 'backordered');

You can also insert data referencing the index of the ENUM value. For instance, instead of 'in stock', 'out of stock', and 'backordered', you could simply write 1, 2, and 3:

INSERT INTO products (product_name, size, color, availability)
VALUES ('hat', 'small', 'red', 1),
       ('scarf', 'large', 'blue', 2),
       ('gloves', 'medium', 'green', 3);

If you insert 0 (or any value not in the list) while strict SQL mode is disabled, MySQL stores the special error value — an empty string with index 0 — because valid ENUM indices start from 1. Now, need to check what’s available? Simply run:

SELECT product_name, size, color, availability
FROM products
WHERE availability = 'in stock';

This grabs all in-stock items.
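The handling of out-of-range values is worth seeing in action. A minimal sketch against the products table above, assuming strict SQL mode is disabled (under STRICT_TRANS_TABLES the invalid INSERT would fail with an error instead of being coerced):

```sql
-- Sketch: an out-of-range ENUM value (non-strict SQL mode assumed).
-- 0 is not a valid ENUM index, so MySQL stores the special error value:
-- an empty string whose internal index is 0.
INSERT INTO products (product_name, size, color, availability)
VALUES ('belt', 'medium', 'red', 0);

-- Rows that received the error value can be found explicitly:
SELECT product_name FROM products WHERE availability = '';
```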
You can also filter by the numeric index (e.g., WHERE availability = 1) or tack on a LIMIT clause to cap results:

SELECT product_name, size, color, availability
FROM products
WHERE size = 1 LIMIT 3;

Updating and modifying ENUM values in MySQL Updating ENUM values in MySQL can get tricky because it’s not as simple as changing data in a row. To update an ENUM value in MySQL for a specific record, you’d use a standard UPDATE like:

UPDATE products SET status = 'shipped' WHERE product_id = 1;

This is easy enough if the new value’s already in your ENUM list. But if you need to add or remove options from the list itself (say, adding “canceled” to status ENUM('pending', 'shipped', 'delivered')), you’re stuck with a challenge: ENUM’s definition is baked into the table structure. You’ll have to use an ALTER TABLE statement to modify it. For instance:

ALTER TABLE products MODIFY COLUMN status ENUM('pending', 'shipped', 'delivered', 'canceled');

This redefines the list, keeps existing data intact (old values keep their integer indexes), and lets you use the new option. The catch? It’s a structural change, so it can lock the table and slow things down on big datasets.

Alternative to ENUM: VARCHAR + CHECK constraint MySQL 8.0.16 introduced full support for CHECK constraints, so you can now enforce a list of allowed values without ENUM’s rigid structure (i.e., no column redefinitions when your options change). VARCHAR lets you store any string up to its length limit, and CHECK keeps it in line, giving you room to adapt as data needs evolve, unlike ENUM’s locked-in list. Here’s an example:

CREATE TABLE products (
  product_id INT PRIMARY KEY AUTO_INCREMENT,
  product_name VARCHAR(255),
  status VARCHAR(20) CHECK (status IN ('pending', 'shipped', 'delivered'))
);

This sets status as a VARCHAR with a CHECK constraint, restricting it to three values. You can insert “shipped”, but “canceled” gets rejected, just like ENUM.
However, updating the allowed values only needs an ALTER TABLE to tweak the constraint, not redefine the column.

Advantages of VARCHAR + CHECK over ENUM When weighing ENUM vs VARCHAR, the VARCHAR + CHECK combo has some clear wins:
- It’s way simpler to tweak your allowed values (no need to overhaul the table structure like you do with ENUM).
- It leans on MySQL’s native constraint support, which means your setup is likely to play nicer with future updates.
- It works smoothly across other SQL databases like PostgreSQL or SQL Server, where ENUM isn’t directly supported.

Best practices and correct usage of ENUM in MySQL ENUM is a handy data type, but you have to know when it fits. Here are a few tips to nail the correct usage of ENUM in MySQL:
- Use it for stable, short lists. Think 10-15 values max, where options won’t shift much, since adding more means altering the table.
- Index ENUM columns. Create indexes on ENUM columns you frequently use in WHERE clauses or JOIN conditions to significantly improve query performance.
- Avoid numeric dependencies. While you can query by index, it’s best to stick to strings — index shifts if you reorder values can mess up your logic.
- Keep it clear for the long haul. Use descriptive and meaningful ENUM values to keep your data clear and maintainable over time.
- Avoid overuse. If your data is always changing, use VARCHAR + CHECK instead of ENUM for better scalability, cross-database compatibility, and long-term flexibility.

Common mistakes when using ENUM in MySQL When working with ENUM in MySQL, it’s easy to stumble into a few common pitfalls. First, remember that modifying ENUM values later involves an ALTER TABLE, which locks the table and drags performance on big datasets. Also, ENUM’s 1-2 byte storage sounds efficient, but if you’re pushing dozens of options, it’s less readable and harder to maintain than VARCHAR, which scales without structural rework. Another slip? Assuming ENUM’s tiny footprint always beats alternatives.
VARCHAR with CHECK might use more space, but it’s way more flexible for evolving data.

How dbForge Studio for MySQL can help with ENUM in MySQL Managing ENUM fields in MySQL can be a headache, especially when creating, modifying, or updating them in large databases. dbForge Studio for MySQL can take some of that pain away. It’s a robust database management tool that simplifies the process with a visual [MySQL Query Builder](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html) that makes it easy to define or edit ENUMs without digging through code. Plus, its schema comparison and refactoring features help you safely update ENUM lists across tables, catching issues before they bite. Additionally, it supports importing and exporting ENUM values in various formats, such as CSV, Excel, JSON, and XML. If you want to streamline your MySQL workflow, [download the free trial](https://www.devart.com/dbforge/mysql/studio/) and see how it helps.

Conclusion ENUM in MySQL is perfect for fixed, compact value sets, providing efficient storage and fast performance. However, modifying it requires table alterations, which can slow you down and tax big databases, so stick to stable data. dbForge Studio for MySQL streamlines ENUM management, cutting through the complexity with a visual query builder and reliable schema tools. To fully understand ENUM’s strengths and limitations, check out more guides or download [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) and test it on a real project.

FAQ

What is the difference between ENUM and SET in MySQL? ENUM allows only one value from a predefined list, while SET permits multiple values. ENUM is stored as a single integer, whereas SET uses a bitmap representation, making it suitable for multiple-choice fields.

How does the ENUM data type in MySQL store values internally? ENUM values are stored as integers, with each string option assigned a numeric index starting from 1.
This compact storage improves efficiency compared to VARCHAR.

What is the correct usage of ENUM in MySQL for storing predefined values? ENUM works best for stable, small sets of predefined values like status labels or categories. It ensures data consistency and reduces storage, but frequent modifications require altering the table structure.

How to use ENUM in MySQL to optimize database performance? ENUM improves performance by storing values as integers, making comparisons and sorting faster. It also reduces storage requirements and enforces consistency, avoiding invalid data entries.

Can I update ENUM values in MySQL after creating a table? You can modify ENUM values using ALTER TABLE, but this locks the table and may impact performance, especially in large databases. Planning ENUM values carefully helps avoid frequent changes.

Does ENUM in MySQL support indexing, and how does it affect query performance? Yes, ENUM columns support indexing, making lookups and filtering more efficient. Since ENUM values are stored as integers, indexed searches run faster than equivalent VARCHAR-based queries.

How can dbForge Studio for MySQL help with managing ENUM data types in MySQL? dbForge Studio simplifies ENUM management by providing a visual query builder, schema comparison tools, and safe refactoring options, reducing the complexity of modifying ENUM values.

Can I visually create and modify ENUM columns in MySQL using dbForge Studio? Yes, dbForge Studio allows you to create and modify ENUM columns through an intuitive visual interface, eliminating the need to manually write and execute ALTER TABLE statements.
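The ENUM vs SET distinction from the FAQ above can be illustrated with a short sketch; the shirts table here is hypothetical, not part of the article's examples:

```sql
-- ENUM stores exactly one value per row; SET stores any combination
-- of the listed values, represented internally as a bitmap.
CREATE TABLE shirts (
  shirt_id INT PRIMARY KEY AUTO_INCREMENT,
  size     ENUM('small', 'medium', 'large'),
  features SET('pocket', 'v-neck', 'long-sleeve')
);

-- A SET value is written as a single comma-separated string:
INSERT INTO shirts (size, features)
VALUES ('medium', 'pocket,long-sleeve');
```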
Tags [dbForge for MySQL](https://blog.devart.com/tag/dbforge-for-mysql) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow."} {"url": "https://blog.devart.com/enumeration-support.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ORM
Solutions](https://blog.devart.com/category/products/orm-solutions) Enumeration Support By [dotConnect Team](https://blog.devart.com/author/dotconnect) May 17, 2010

Introducing
Enumeration support in databases
Enumeration support in LINQ to SQL
Case management
Entity Developer
User interface for enumerations
Creating enumerations
Naming rules
The model validation
Update Model From Database
Entity Framework

Introducing We keep receiving requests about enumeration support in [Entity Developer](http://www.devart.com/entitydeveloper/) from our customers. That’s why we plan to implement this functionality in the second half of 2010. In this article, we describe the enumeration support functionality we plan to add. It is only a draft, and these features are subject to change. It depends largely on customer suggestions, so if you have any suggestions or requirements concerning this functionality, please write them as comments to this article or in any other way convenient for you.

Enumeration support in databases Some databases have native enumeration support:
- [MySQL](http://dev.mysql.com/doc/refman/5.0/en/enum.html)
- [PostgreSQL](http://www.postgresql.org/docs/current/static/datatype-enum.html) supports enumerations since version 8.3

Microsoft SQL Server, Oracle, and SQLite store enumerations as regular string or numeric columns. When data is stored in numeric columns, a dictionary table with a foreign key is often used. When a dictionary table isn’t used, the SQL CHECK constraint is usually applied to the column values.

Enumeration support in LINQ to SQL Surely, enumeration usage makes sense only when the column values on which the enumeration is based are static and constant while the client application runs. Enumerations in C#/Visual Basic have an explicitly or implicitly specified integer value for each element, and each element has a name.
This enumeration can be stored in the database as an integer or a string value. LINQ to SQL fully supports both numeric and string enumerations; see, for example, [LINQ to SQL Mapping Enum from string](http://stackoverflow.com/questions/211875/linq-to-sql-mapping-enum-from-string). We support LINQ to SQL in the following products: [dotConnect for Oracle](http://www.devart.com/dotconnect/oracle/), [dotConnect for MySQL](http://www.devart.com/dotconnect/mysql/), [dotConnect for PostgreSQL](http://www.devart.com/dotconnect/postgresql/), and [dotConnect for SQLite](http://www.devart.com/dotconnect/sqlite/). Recently we released a beta version of a new product that integrates LINQ to SQL support for Oracle, MySQL, PostgreSQL, SQLite, and SQL Server – [Devart LINQ Connect](http://www.devart.com/linqconnect/). The main aspects of enumerations in LINQ to SQL are described in the next chapters.

Case management

Sometimes the letter case of values stored in the database does not match the case you want for enumeration elements in LINQ to SQL. For example, the string values 'man' and 'woman' are stored in the database, but the customer wants to use standard capitalized names in C#:

```csharp
public enum Gender {
    Man,
    Woman
}
```

String values in the database can also appear as abbreviations, words written in uppercase, and so on. A public case-rule enumeration preserves the readability of .NET enumeration element names and provides flexibility when mapping an enumeration to the database. It could look like the sample below:

```csharp
public enum Devart.Data.Linq.EnumMappingCaseRule {
    CaseSensitive,
    UpperCase,
    LowerCase,
    CaseInsensitive
}
```

It is better to use a flag to manage this behaviour, but adding such a flag at the DataQuery level may be insufficient. It could be configured at the DataContext level; otherwise, materialization of objects returned from stored procedures and from executed native SQL statements could be incomplete. In that case, however, mapping flexibility can be lost.
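Plain .NET parsing already illustrates the case-mapping problem described above. The sketch below is a hypothetical illustration (not the proposed Devart.Data.Linq API): it maps the lowercase database values 'man'/'woman' to the capitalized Gender names using case-insensitive parsing.

```csharp
using System;

public enum Gender { Man, Woman }

public static class EnumCaseDemo
{
    // Map a lowercase database value ("man") to the capitalized C# name,
    // using case-insensitive parsing.
    public static Gender FromDbValue(string dbValue)
    {
        if (Enum.TryParse<Gender>(dbValue, ignoreCase: true, out var result))
            return result;
        throw new ArgumentException($"Unknown value: {dbValue}");
    }

    // Store the value back as lowercase, matching the database's
    // 'man'/'woman' convention (i.e. the LowerCase rule).
    public static string ToDbValue(Gender g) => g.ToString().ToLowerInvariant();

    public static void Main()
    {
        Console.WriteLine(FromDbValue("man"));      // prints "Man"
        Console.WriteLine(ToDbValue(Gender.Woman)); // prints "woman"
    }
}
```

This works per call, but it cannot express a per-enumeration rule, which is why a declarative attribute such as the one proposed here is more convenient.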
So the best approach is to configure case sensitivity separately for each enumeration. The mapping rules for an enumeration in C# can be described with the following attribute:

```csharp
[AttributeUsage(AttributeTargets.Enum)]
public sealed class Devart.Data.Linq.EnumMappingAttribute
    (Devart.Data.Linq.EnumMappingCaseRule caseRule) : Attribute { }
```

So our resulting enumeration will look like this:

```csharp
[EnumMapping(EnumMappingCaseRule.LowerCase)]
public enum Gender {
    Man,
    Woman
}
```

Entity Developer

Let's consider enumeration support in Entity Developer for LINQ to SQL models.

User interface for enumerations

The Enumerations node will be added to Model Explorer. A node is created for each enumeration in the model, and it can be expanded to display its elements. Each enumeration has its own Access Modifier (public/internal). When an enumeration is created from a column or a dictionary table, information about it can be saved in the Documentation property. This description will be included in the generated C# code as an XML comment and shown to the developer by Visual Studio IntelliSense:

```csharp
/// <summary>
/// This enum was generated for
/// the "ReasonType" column of the "AdventureWorks.Sales.SalesReason" table.
/// </summary>
public enum ReasonType {
    ...
}
```

Support for EnumMappingCaseRule will be added at design time as well. Each enumeration can be:
- a simple link to an external .NET enum type described in another part of the application or in one of the included references (the full type name with the namespace is required);
- an enumeration described in the model that has its own list of elements; such an enumeration will be emitted during code generation.

Each element of an enumeration has a required Name property and a set of optional properties. The integer value of an enumeration element can be stored in the Integer Value property. The Documentation property can store the full name (with spaces and special characters) of the element.
The user can also put their own comments about the elements of the enumeration here. The Documentation property will be saved as an XML comment on the enumeration during code generation:

```csharp
public enum GlobalRegion {
    /// <summary>
    /// The Americas, being North, Central, and South America
    /// </summary>
    AMER = 1,
    /// <summary>
    /// Europe, the Middle East and Africa
    /// </summary>
    EMEA = 2,
    /// <summary>
    /// Asia Pacific, and Japan
    /// </summary>
    JAPAC = 3
}
```

Creating enumerations

An enumeration can be created manually (just choose Add New Enum from the popup menu of the Enumerations node and then Add New Element from the popup menu of the enumeration). Some proposals concerning enumeration creation are described below (we don't yet know which of them will be implemented). For MySQL and PostgreSQL, enumerations can be generated at once for table columns of the enumeration data type. Other ways of creating enumerations are listed below.

Generating enumerations for a string column

An example of a string enumeration is the ReasonType column in the Sales.SalesReason table of the AdventureWorks database. To generate an enumeration from a string property, use the Convert to Enum menu item or drag the property to the Enumerations node. During this operation, a database call will be executed to fetch data from the column. Depending on the selected data, a new enumeration will be created, or an existing one will be detected and used. The user will receive a message box warning about the database call. If no records are returned, the enumeration won't be created, and the user will be informed about that too. Also note that string enumeration element names can be invalid for C#/Visual Basic; the user should be informed about this as well. The property type will be changed if the enumeration was successfully created.

Generating enumerations for integer columns

An example of an integer enumeration is the Status column in the Production.Document table of the AdventureWorks database.
Everything written about string columns also applies to integer columns, but it is less convenient: in this case, a dictionary table probably exists in the database. If it doesn't, the user has to manually rename all enumeration elements from numeric to string values.

Generating enumerations for a dictionary table

The Person.AddressType and Person.ContactType tables are examples of dictionary tables in the AdventureWorks database. After an integer column migrates to an enumeration that represents the values from the dictionary table, the user no longer needs to join these tables. Moreover, the user can delete the entity corresponding to the dictionary table from the model and leave only the enumeration. The Create Enum Based on This Entity popup menu item is available for an entity with two properties (an integer primary key and a string column), if this entity participates in an association as the parent table. The same action can be triggered by moving the entity to the Enumerations node. If the entity has other columns, the user should delete them first; for example, the AdventureWorks database has a Modified date column in all tables. After this operation, a database call will be executed to fetch data from the table, and a new enumeration will be created (or an existing one will be detected and used) depending on the selected data. If the enumeration was created successfully, [Entity Developer](http://www.devart.com/entitydeveloper/) checks all foreign keys and changes the types of the related properties. It may also be a good idea to implement a Convert Entity to Enum popup menu item; in this case, the entity and all related associations would be deleted.

Naming rules

The naming rules of [Entity Developer](http://www.devart.com/entitydeveloper) require enhancement. The user will be able to set up rules for converting enumeration element names during enumeration creation.
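The dictionary-table conversion and the name-conversion rules described above can be sketched as a tiny generator. This is a hypothetical illustration (the row data and helper names are invented; it is not Entity Developer's actual code): it turns (id, value) rows into C# enum source, sanitizing values into valid capitalized identifiers.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

static class EnumFromDictionary
{
    // Turn a dictionary-table value like "Main Office" into a valid,
    // capitalized C#/VB identifier like "MainOffice".
    public static string ToIdentifier(string dbValue)
    {
        var name = new string(dbValue.Where(char.IsLetterOrDigit).ToArray());
        if (name.Length == 0 || char.IsDigit(name[0])) name = "_" + name;
        return char.ToUpperInvariant(name[0]) + name.Substring(1);
    }

    // Emit C# source for an enum from hypothetical (id, value) rows.
    public static string Generate(string enumName, IEnumerable<(int Id, string Value)> rows)
    {
        var sb = new StringBuilder().AppendLine($"public enum {enumName} {{");
        foreach (var (id, value) in rows)
            sb.AppendLine($"    {ToIdentifier(value)} = {id},");
        return sb.AppendLine("}").ToString();
    }

    static void Main()
    {
        // Invented rows standing in for a two-column dictionary table.
        var rows = new[] { (1, "Billing"), (2, "Home"), (3, "Main Office") };
        Console.Write(Generate("AddressType", rows));
    }
}
```

The original full value ("Main Office") is exactly what the Documentation property described earlier would preserve, since it cannot survive in the identifier itself.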
All names will be capitalized by default.

Model validation

Model validation will check the following points:
- Are enumeration names and their element names valid for C#/Visual Basic?
- Are element names unique within the bounds of the enum?
- Are explicit integer values unique among the enumeration elements?

Update Model From Database

When the Update Model From Database wizard is used, the element collection of an enumeration built from a column or a dictionary table must be updated. Perhaps the wizard should not offer to re-add a dictionary table that was deleted from the model.

Entity Framework

The possibility to work with enumerations in Entity Framework is important for users too. Enumeration support at the EF run-time level was planned in the first [CTP](http://msdn.microsoft.com/en-us/library/aa697428(VS.80).aspx) in 2006; however, it wasn't implemented in EF v4. Entity properties in Entity Framework have a limited set of types, so all possible solutions rely on a wrapper property, while LINQ to Entities queries can only refer to the primary property that is mapped to a database column. There are some samples on this point: [How do I replace an Int property with an Enum in Entity Framework?](http://stackoverflow.com/questions/353190/how-do-i-replace-an-int-property-with-an-enum-in-entity-framework), [Enum in ADO.NET Entity Framework v1](http://weblogs.asp.net/alexeyzakharov/archive/2008/09/18/enum-in-ado-net-entity-framework-v1.aspx). Therefore, the necessity of enumeration support for Entity Framework is not obvious: such support is potentially possible but noticeably limited compared to LINQ to SQL. For example, it is possible to support storing enumerations in the model and generating enumerations automatically from database values, but later we would run into trouble linking an enumeration to the specific mapped properties when creating wrapper properties.
It would be possible to provide a way to create wrapper properties, but this seems too difficult. Perhaps it is necessary to allow setting up the enumeration mapping for specific tables/columns and saving it in the enumeration itself. That would help with the Update Model From Database task and would allow updating the element collection of an enumeration from several tables, which would ease the problem for Entity Framework models.

Tags [entity framework](https://blog.devart.com/tag/entity-framework) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.

4 COMMENTS

Rui Marques, May 4, 2011, at 4:50 am: Currently I'm using a simple solution that mimics Enum support in entity classes. Having a way to do this task in the designer, with the abilities of the T4 templates, would be a killer feature.

Shalex, May 6, 2011, at 4:37 am: Rui, please describe your approach. How should it work (user interface, etc.)? Best regards

Rui Marques, May 9, 2011, at 5:52 am: I use this approach: [http://landman-code.blogspot.com/2010/08/adding-support-for-enum-properties-on.html](http://landman-code.blogspot.com/2010/08/adding-support-for-enum-properties-on.html). To be honest, this is the most widely used approach if you Google for it. In terms of user interface, something like right-clicking a property and having a "Map to Enum type" option would be great. Then having the choice to browse from an assembly would be just perfect. I think that full support within Entity Developer would please everyone if some model-specific settings are permitted: include the enum in the current class, make the enum global, etc.

[Shalex](http://www.devart.com), September 7, 2011, at 5:06 am: Entity Developer supports ENUM types now: [https://www.devart.com/forums/viewtopic.php?t=21949](https://www.devart.com/forums/viewtopic.php?t=21949).
Comments are closed."} {"url": "https://blog.devart.com/estimating-coverage-of-projects-source-code-with-code-review.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) Estimating Coverage of Project’s Source Code with Code Review By [ALM Team](https://blog.devart.com/author/alm) November 27, 2014 [0](https://blog.devart.com/estimating-coverage-of-projects-source-code-with-code-review.html#respond) 4741

Summary: This article describes how to estimate how much of a project's source code is covered by code review. It also outlines how to make the most of Review Assistant's Code Coverage report.

As of version 2.6, [Review Assistant](https://www.devart.com/review-assistant/), Devart's code review tool, provides the new Code Coverage report. Developed in response to numerous requests from our customers, the report provides better quality control over the code review process. In this article, we would like to show how to use the tool effectively. In particular, we will elaborate on how to:
- Exclude excessive data from a report with filtering
- Group report data
- Interpret report results

Overview of the Code Coverage Report

First, let's create the Code Coverage report:
1. In Visual Studio, click the Review Assistant Reports button on the Review Assistant toolbar.
2. In the Report box, select Code Coverage.
3. In the Project box, select a project.
4. Use the Date Picker control to select the report time period.
5. Click View Report.

The opened Code Coverage report is shown in Figure 1 (Initial Code Coverage Report). As we can see, the report combines data from the project repository with data on performed reviews. The Status column shows whether the given revision was reviewed or not.
Also, the Linked Reviews column allows you to quickly navigate to the Code Review Board and view the details of a particular review.

Preparing Code Coverage Data for Analysis

Looking at Figure 1, it becomes obvious that not all revisions should be taken into consideration when estimating review coverage of the project. Even this small commit range includes two commits that were automatically generated by Git source control. Therefore, to prepare the data for analysis, we need to:
- Exclude revisions with automatic merges – usually, such revisions do not include users' edits.
- Eliminate revisions that were created by a Continuous Integration server. As a rule, these revisions represent the project version change after a nightly build.

To achieve these goals, we will use filtering. Click Edit Filter and create a filter similar to the one shown in Figure 2 (Report Filter). The filter in Figure 2 can be interpreted as follows: the commit comment should not include the word "Merge", and the Author can be anyone except AlmBuildBot (a virtual user created for the CI server's needs).

Estimating the Total Number of Reviewed Revisions

Now our report contains only meaningful revisions, so we can estimate how many of them passed through code review. To do this, we need to group the report by the Status column: right-click the header of the column and select Group By This Column from the popup menu (see Figure 3, Report Grouping). After grouping, the report looks as in Figure 4 (Volume of Unreviewed Revisions). As you can see in Figure 4, the selected time period includes revisions that are currently under review. However, it is clear that 40.6% of them remain unreviewed; therefore, roughly 60% of the code in our example has been reviewed.

Important Note

When it comes to the Code Coverage report, it is important to note that we estimate the number of reviewed revisions, not lines of code.
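The coverage percentage comes down to simple revision counting. A minimal sketch with hypothetical status data (the report aggregates this for you; the 19/13 split below is invented to reproduce the article's 40.6% figure):

```csharp
using System;
using System.Linq;

static class CoverageMath
{
    // Percentage of unreviewed revisions, as in the report's Status grouping.
    public static double UnreviewedPercent(bool[] reviewed) =>
        100.0 * reviewed.Count(r => !r) / reviewed.Length;

    static void Main()
    {
        // Hypothetical data: 19 reviewed and 13 unreviewed revisions.
        bool[] statuses = Enumerable.Repeat(true, 19)
                                    .Concat(Enumerable.Repeat(false, 13))
                                    .ToArray();

        Console.WriteLine($"{UnreviewedPercent(statuses):F1}% unreviewed");
        // prints "40.6% unreviewed"
    }
}
```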
Our estimation of the reviewed code volume is based on the assumption that all revisions are approximately the same size. We also assume that the project code is reviewed on a regular basis, not just from time to time (in the latter case, there is no point in estimating statistics at all).

Conclusion

We have reviewed the main features and benefits of the Code Coverage report implemented in Devart's code review tool, as well as ways to analyze the report data. We hope it helps you improve control over the code review process. [Download Review Assistant](https://www.devart.com/review-assistant/download.html) and start reviewing code today.

Tags [code review](https://blog.devart.com/tag/code-review) [review assistant](https://blog.devart.com/tag/review-assistant) [ALM Team](https://blog.devart.com/author/alm)"} {"url": "https://blog.devart.com/evaluating-developers-performance-in-code-review-process.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) Evaluating Developer’s Performance in Code Review Process By [ALM Team](https://blog.devart.com/author/alm) December 22, 2014 [0](https://blog.devart.com/evaluating-developers-performance-in-code-review-process.html#respond) 5015

Summary: This article describes how to estimate code review coverage of code written by individual developers. It builds upon [Estimating Coverage of Project’s Source Code with Code Review](https://blog.devart.com/estimating-coverage-of-projects-source-code-with-code-review.html).

As of version 2.6, [Review Assistant](https://www.devart.com/review-assistant/), Devart's code review tool, provides the new Code Coverage report. Developed in response to numerous requests from our customers, the report provides better quality control over the code review process. In this article, we would like to show how to:
- Evaluate team performance in the code review process
- Evaluate an individual developer's performance

Disclaimer: From the very beginning, I would like to clarify that this article uses the term "performance evaluation", yet no one should abuse it. Review Assistant provides a supplementary tool for evaluating the performance of software developers, and you have to be very careful when drawing conclusions based on the reports.
Preparing Code Coverage Data for Analysis

For demo purposes, we will use the Code Coverage report provided by Review Assistant. As we have previously demonstrated, before analysing the data we need to weed out the "rubbish data", as described in [Estimating Coverage of Project’s Source Code with Code Review](https://blog.devart.com/estimating-coverage-of-projects-source-code-with-code-review.html).

Evaluating the Team's Performance in the Code Review Process

Once we have filtered out irrelevant data, we can start the analysis. For example, let's check which developers have the most unreviewed code. To do this:
1. Group the report by the Status column: right-click the header of the column and select Group By This Column from the popup menu.
2. Similarly, group the report by the Author column.
3. Right-click the header of any column and select Show Group Panel from the popup menu.

The report should now look as in Figure 1 (Grouped Code Coverage Report). We want to know whose code is least reviewed, so we take one more step:
4. On the group panel, right-click the Author column, and then select Sort By Summary in the popup menu (Figure 2 – Sorting the report by a summary).

Now the report is sorted within each group. After sorting, we can easily see the largest number of unreviewed revisions (Figure 3 – The least reviewed author). In this particular case, almost half of those revisions belong to a user named ArtemA.

Evaluating an Individual Developer's Performance

Let's have a look at a single developer. For this:
1. Filter the report so that it contains only AlexeyN's revisions (for details, see [Estimating Coverage of Project’s Source Code with Code Review](https://blog.devart.com/estimating-coverage-of-projects-source-code-with-code-review.html)).
2. Group the report by the Author column.
3. Group the report by the Status column.

The report should now look as in Figure 4.
Figure 4 – Code Coverage report filtered for one author. Looking at this report, we can draw the following conclusions:
- The author made 2 commits per working day on average (43 commits over 20 working days in November).
- 72% of the revisions were reviewed (31 revisions out of 43).

If we look at the Authors Statistics report for the same time period, we can get additional information (Figure 5 – Authors Statistics report). The report shows that AlexeyN:
- Issued 19 review requests.
- Got 2 comments per review on average.
- Had 1 defect per 2 reviews (roughly 1 defect per 3 commits).

We can conclude that there are 3–4 undiscovered defects in the 12 revisions that were not reviewed.

Conclusion

We have reviewed the main features and benefits of the Code Coverage report implemented in Devart's code review tool, as well as ways to analyze the report data. We hope it helps you improve control over the code review process. [Download Review Assistant](https://www.devart.com/review-assistant/download.html) and start reviewing code today.
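The defect extrapolation above is plain arithmetic over the report's figures; a small sketch:

```csharp
using System;

static class DefectEstimate
{
    // Expected number of defects hiding in unreviewed commits,
    // given the defect density observed in reviewed ones.
    public static double UncoveredDefects(int totalCommits, int reviewedCommits,
                                          double defectsPerCommit)
        => (totalCommits - reviewedCommits) * defectsPerCommit;

    static void Main()
    {
        // Figures from the report: 43 commits, 31 of them reviewed,
        // roughly 1 defect per 3 commits.
        double expected = DefectEstimate.UncoveredDefects(43, 31, 1.0 / 3.0);
        Console.WriteLine($"~{expected:F0} uncovered defects");
        // prints "~4 uncovered defects"
    }
}
```

This assumes, as the article does, that unreviewed commits have the same defect density as reviewed ones.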
Tags [code review](https://blog.devart.com/tag/code-review) [review assistant](https://blog.devart.com/tag/review-assistant) [ALM Team](https://blog.devart.com/author/alm)"} {"url": "https://blog.devart.com/excel-2021-support-and-more-updates-in-excel-add-ins-2-6.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) [What’s New](https://blog.devart.com/category/whats-new) Excel 2021 Support and More Updates in Excel
Add-ins 2.6 By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 20, 2021 [0](https://blog.devart.com/excel-2021-support-and-more-updates-in-excel-add-ins-2-6.html#respond) 3451

Devart is glad to announce the release of [Excel Add-ins 2.6](https://www.devart.com/excel-addins/) with support for Microsoft Excel 2021, a renamed and updated Excel Add-in for G Suite, and more cloud objects supported in other add-ins.

Microsoft Excel 2021 Support

The main feature of the update is support for the newest Microsoft Excel version – Excel 2021. With the improved collaboration tools in Microsoft Excel 2021, it's even more convenient to share reports built from cloud and database data with our Excel Add-ins.

Excel Add-in for Google Workspace

Excel Add-in for G Suite was renamed to Excel Add-in for Google Workspace. We have also added support for the People API, so now you can query the OtherContacts object and work with the list of people who are not in the contact list of the connected Google account but have interacted with it.

Other Updates

We have added support for the DealsStageHistory object in Excel Add-in for Zoho CRM and the ProductCustomsInformation object in Excel Add-in for BigCommerce. We have also added support for modern encryption algorithms in the Excel Add-ins for MySQL and PostgreSQL. You are welcome to download the updated version of our [Excel Add-ins](https://www.devart.com/excel-addins/universal-pack/download.html) and send [feedback](https://www.devart.com/excel-addins/universal-pack/feedback.html?pn=Devart%20Excel%20Add-ins).

Tags [excel addins](https://blog.devart.com/tag/excel-addins) [what's new excel addins](https://blog.devart.com/tag/whats-new-in-excel-addins) [dotConnect Team](https://blog.devart.com/author/dotconnect)"} {"url": "https://blog.devart.com/excel-add-ins-2-1-with-postgresql-12-support-hubspot-web-login-and-other-improvements.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Excel
Add-ins](https://blog.devart.com/category/products/excel-addins) [What’s New](https://blog.devart.com/category/whats-new) Excel Add-ins 2.1 with PostgreSQL 12 Support, HubSpot Web Login, and Other Improvements By [dotConnect Team](https://blog.devart.com/author/dotconnect) November 19, 2019 [0](https://blog.devart.com/excel-add-ins-2-1-with-postgresql-12-support-hubspot-web-login-and-other-improvements.html#respond) 4724

Devart is glad to announce the release of [Excel Add-ins 2.1](https://www.devart.com/excel-addins/) with support for PostgreSQL 12, HubSpot web login, and a number of other improvements for cloud applications.

PostgreSQL 12 Support

Now you can use Excel Add-ins with the latest version of the open-source PostgreSQL database.

HubSpot Web Login Support

You no longer need to obtain a HubSpot API key and enter it to connect to HubSpot. Just select the RefreshToken authentication and click Web Login, then log in to HubSpot and grant Excel Add-ins access to it. Besides, the LineItems object is now available for HubSpot, allowing you to work with deal line item data.

Server-to-Server Authentication for Salesforce Marketing Cloud

Now you can use server-to-server authentication to connect to Salesforce Marketing Cloud. For this authentication, open the Advanced connection parameters and select ServerToServer in the Authentication Type parameter. Then specify your Marketing Cloud subdomain, client ID, and client secret from the Salesforce Marketing Cloud application center in the corresponding parameters.

Other Improvements

Additionally, two more parameters have been added for all connections to cloud apps: Failover Retries and Local SQL Engine. To access them, click the Advanced button when editing a connection. Failover Retries specifies the number of attempts to automatically retry a command that failed for what is probably just a temporary reason.
In this case, the command will probably be re-run successfully after a small delay, without any error messages being shown to the user. The Local SQL Engine parameter allows you to turn off local SQL processing, which is used when complex queries cannot be translated directly into calls to the corresponding cloud source. While local SQL processing is beneficial in most cases and allows running complex queries, in some cases it may cause performance issues and take a long time; if you would prefer not to use it, you may turn it off. In that case, if you run a query that is too complex for your data source, you will get a message saying that the query cannot be executed directly by the service. Feel free to [download the new versions of Devart Excel Add-ins](https://www.devart.com/excel-addins/download.html), try the new functionality, and [leave feedback](https://www.devart.com/excel-addins/feedback.html?pn=Devart%20Excel%20Add-ins)!

Tags [excel addins](https://blog.devart.com/tag/excel-addins) [what's new excel addins](https://blog.devart.com/tag/whats-new-in-excel-addins) [dotConnect Team](https://blog.devart.com/author/dotconnect)
"} {"url": "https://blog.devart.com/excel-add-ins-2-10-are-coming-soon.html", "product_name": "Unknown", "content_type":
"Blog", "content": "[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) [What’s New](https://blog.devart.com/category/whats-new) Excel Add-ins 2.10 Are Coming Soon By [DAC Team](https://blog.devart.com/author/dac) October 8, 2024 [0](https://blog.devart.com/excel-add-ins-2-10-are-coming-soon.html#respond) 1002 We recognize how crucial Excel Add-ins are for businesses, enabling them to streamline processes, enhance data analysis, and integrate seamlessly with other essential tools. That’s why we are excited to announce that a new release of our Excel Add-ins is planned for October 8, 2024. The [Devart Excel Add-ins](https://www.devart.com/excel-addins/) 2.10 release focuses on improving performance, ensuring compatibility with the latest Salesforce and Shopify APIs, expanding data integration with Shopify, HubSpot, and FreshBooks, and adding new safeguards for data integrity, helping businesses streamline operations and improve reporting. New features and improvements:
- Significantly improved the performance of data import to Excel from relational databases and some fast cloud services.
- Added support for Salesforce Web Services API version 59.0.
- Added support for Shopify API version 2024-04.
- Introduced a new "Warn about tables without columns that uniquely identify rows" parameter, available under Options > Edit > General.
- Introduced a new "Show connection string" option (default value: True), available under Options > Edit > Error Handling.
- Introduced a new "Show sensitive info in connection string" option (default value: False), available under Options > Edit > Error Handling.
- Added a new "Copy Log" button in the Execution window.
- For Shopify, added the GiftCards table.
- For HubSpot, added new read-only tables: Invoices, InvoiceLineItems, LineItemInvoices, Teams, and Users.
- For FreshBooks, added new tables: BillVendor, Bill, BillLine, and BillPayment.
Tags [excel add-ins](https://blog.devart.com/tag/excel-add-ins) [new release](https://blog.devart.com/tag/new-release) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/excel-add-ins-version-2-7-released.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) [What’s New](https://blog.devart.com/category/whats-new) Excel Add-ins: Version 2.7 Released By [DAC Team](https://blog.devart.com/author/dac) September 21, 2022
[0](https://blog.devart.com/excel-add-ins-version-2-7-released.html#respond) 2813 We are pleased to announce the release of Excel Add-ins version 2.7. The update includes official Windows 11 support along with multiple security and performance improvements, additional encryption algorithms for SQLite databases, and bug fixes. The list of changes is as follows:
- Windows 11 official support: Excel Add-ins are now fully compatible with the newest Windows version.
- Excel Add-in for Microsoft Dynamics CRM has been renamed to Excel Add-in for Microsoft Dynamics 365.
- Implemented built-in support for multiple encryption algorithms for SQLite databases: AES-128, AES-192, AES-256, Blowfish, CAST-128, RC4, and Triple DES.
- Replaced the OAuth web login with the user’s default external browser for the following components: Microsoft Dynamics 365, Google Workspace, HubSpot, Salesforce, and Zoho CRM.
- Improved performance during data import: implemented the Buffer Size setting, located in Options -> Import -> Data -> General.
Please see the complete list of changes on the [official Devart website](https://www.devart.com/excel-addins/universal-pack/revision_history.html). The newest Excel Add-ins version may be downloaded [here](https://www.devart.com/products.html#excel).
Tags [excel addins](https://blog.devart.com/tag/excel-addins) [new version](https://blog.devart.com/tag/new-version) [ready for Windows 11](https://blog.devart.com/tag/ready-for-windows-11) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/excel-data-visualization.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ODBC](https://blog.devart.com/category/odbc) Excel Data Visualization: The Best Tools and Techniques for Stunning Insights By [Victoria
Shyrokova](https://blog.devart.com/author/victorias) March 27, 2025 [0](https://blog.devart.com/excel-data-visualization.html#respond) 242 Let’s say you have a big table full of sales numbers, and you need to turn it into something clear and easy to understand. Instead of staring at endless rows of data, you can use Excel to create charts and graphs that make the information visually appealing and useful. This is why data visualization is so important: it helps you see patterns, spot trends, and make smarter decisions. Excel is a go-to tool for this because it’s easy to use and packed with great features for creating charts, graphs, and dashboards. Why does Excel data visualization matter in decision-making, and how do you visualize data in Excel? We’ll explain step by step how to display data visually in Excel and which other tools deserve your attention. Improve your data visualization in Excel and turn raw data into compelling stories! Table of contents Why data visualization in Excel is essential for analysts Built-in Excel data visualization tools: The essentials Using PivotTables & PivotCharts for interactive dashboards Conditional formatting: Visualizing data without charts Integrating external data for advanced Excel visualizations Data visualization tools for Excel: Beyond built-in features Best practices for effective data visualization in Excel Conclusion: Unlocking the full potential of Excel for data visualization Why data visualization in Excel is essential for analysts Looking at rows and rows of numbers can be exhausting. Data analysis and visualization with Excel makes it easier to spot trends, patterns, and key insights without digging through endless spreadsheets. If you struggle with raw data, charts and graphs can bring the numbers to life. Excel is arguably the most popular tool for data analysis because it’s simple yet powerful, and it offers many customization options.
For example, you can change chart titles, adjust the data, and much more. If you really want to delve deeper into the subject, you can sign up for a comprehensive course on charts and dashboards. Relying only on tables can slow you down and make reports harder to understand. Using Excel visualizations helps make data clearer and more engaging for teams, clients, and decision-makers. Visualizing data in Excel not only simplifies complex information but also improves communication and makes it easier to act on important insights. Built-in Excel data visualization tools: The essentials Excel has everything you need to turn plain numbers into clear, easy-to-read charts. Whether you’re making a simple bar graph or a detailed financial report, Excel helps you spot trends and patterns without the headache. With just a few clicks, your raw data becomes something useful. You can also tweak your charts however you like: change colors, add labels, or adjust the layout to make them look just right. Whether it’s a quick report or a full dashboard, Excel gives you the tools to make your data stand out. How to create basic charts in Excel Making a chart in Excel is simple. Follow these steps to create bar charts, line graphs, and pie charts. Bar charts: great for comparing numbers Select your data in the Excel table. Click the Insert tab. Choose Column or Bar Chart and select a style (e.g., Clustered Column). The chart will appear with your data. When to use Bar Charts Bar charts are ideal for illustrating how data is distributed across categories and highlighting differences between them. They work well for comparing integers or percentages, especially when the data is grouped into distinct categories. When to avoid Bar Charts Despite their popularity, bar charts aren’t always the best choice.
Avoid using them for summarizing continuous data, analyzing long time series, or identifying correlations; other visualization types are better suited for these tasks. Line graphs: help you see trends over time Select your data in the Excel table. Go to the Insert tab. Choose Line Chart and pick a 2D or 3D style. The line graph will display trends over time. When to use Line Charts Line charts are great for visualizing trends within a dataset, showing overall patterns, or comparing multiple trends simultaneously. They are especially useful for making forecasts based on historical data. When to avoid Line Charts If your dataset contains only a few values over a short time span, a bar chart may be a better choice. Additionally, if the data consists of discrete values with no logical progression between points, it’s best to use a chart type that doesn’t rely on continuous connections. Pie charts: show how different pieces make up a whole Select your data in the Excel table. Click Insert and choose Pie or Doughnut Chart. Pick a 2D or 3D pie chart style. The pie chart will show proportions visually. Once your chart is ready, you can make it look even better. A few tweaks to colors, labels, and styles can instantly turn a basic chart into a clearer, more professional way to share your insights! When to use Pie Charts Pie charts illustrate part-to-whole relationships. Since these charts rely on color coding, they can display multiple proportions simultaneously, making them useful for emphasizing overall distribution rather than precise differences. When to avoid Pie Charts Pie charts are not suited to pinpointing exact values or making direct comparisons. Advanced chart types for data analysts For more detailed analysis, Excel offers several advanced chart types and visualization techniques.
Scatter plots and bubble charts help you see relationships between different factors, making them perfect for spotting trends or correlations. Scatter Chart (X, Y Plot) Select your dataset with two numerical columns. Click the Insert tab and choose Scatter (X, Y) Chart under the Charts group. Pick a Standard Scatter Chart to see individual data points. Your chart will now display relationships between the two variables. Bubble Chart A bubble chart is a variation of a scatter chart in which the size of each data point (bubble) represents a third variable. Follow Steps 1 & 2 from the Scatter Chart to create a Bubble Chart. Choose the Bubble Chart option. The third variable determines the size of each bubble. Your chart will now display three data dimensions. Histograms Histograms and box plots show how data is spread out, which is helpful for understanding distributions and patterns. Select your numerical data. Click Insert and choose Statistic Chart > Histogram. Excel automatically groups data into bins; you can adjust the bin size by right-clicking the X-axis and selecting Format Axis. The histogram will display the data distribution. Box Chart Select your dataset with numerical values. Click Insert > Statistic Chart > Box & Whisker. The chart will display quartiles (box), the median (line inside the box), and outliers (dots). If you’re dealing with financial data, waterfall charts are often the best way to visualize data in Excel. They show how different amounts add up to a total, helping you break down things like profits, expenses, or changes in your budget. Select your dataset with starting values, increments, and final values. Click Insert and select Waterfall Chart. You’ll see a column-based visualization of positive and negative contributions. The chart will show how values increase or decrease over time. These advanced charts give you a clearer picture of more complicated data.
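Excel’s histogram chart groups values into bins for you, but the underlying binning logic is simple. Here is a quick sketch in Python (a hypothetical helper, not Excel code) showing how fixed-width bins count values:

```python
def histogram_bins(values, bin_width):
    """Group numeric values into fixed-width bins, as Excel's histogram chart does."""
    bins = {}
    for v in values:
        # The bin start is the largest multiple of bin_width not exceeding v.
        start = (v // bin_width) * bin_width
        bins[start] = bins.get(start, 0) + 1
    return dict(sorted(bins.items()))

counts = histogram_bins([3, 7, 8, 12, 14, 15, 21], bin_width=10)
# counts: {0: 3, 10: 3, 20: 1}
```

Adjusting the bin size in Excel’s Format Axis dialog corresponds to changing `bin_width` here: wider bins smooth the distribution, narrower bins reveal more detail.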
Using PivotTables & PivotCharts for interactive dashboards PivotTables and PivotCharts are a great foundation for interactive dashboards in Excel. A pivot table structures data in tabular form with minimal effort, summarizes it, and enables its analysis. Step 1. Set up your data. Step 2. Create a PivotTable: select your data, go to Insert > PivotTable, choose New Worksheet, and click OK. Drag fields into Rows, Columns, and Values to organize your data. Step 3. Add a PivotChart: click inside your PivotTable, go to Insert > PivotChart, pick a chart type, and click OK. Step 4. Add slicers: click the PivotTable, go to Insert > Slicer, select the fields you want to filter, and place them on your dashboard. Step 5. Format the charts for a clean look and move everything onto one sheet for easy viewing. Pivot tables and pivot charts eliminate the need to create complex links or formulas within tables, and changes in a pivot table do not alter the original data, which helps keep it consistent. Pivot tables are especially useful when a data source (e.g., a large table with a lot of data) needs to be analyzed and not all of its columns (fields) are needed. Example: Building a sales performance dashboard Want a real example of how to visualize Excel data? You can create a sales performance dashboard using PivotTables and PivotCharts to track things like revenue, top-selling products, and regional sales. Start by creating a PivotTable to organize sales by category, region, or salesperson. (See the steps above if you need a refresher.) Then, add a PivotChart to show trends visually. Want to make your dashboard easier to use? Add slicers and filters! As we’ve mentioned before, these tools let you focus on specific fields, like time periods, products, or customer segments, with just a click. Instead of staring at a massive spreadsheet, quickly narrow down the data you need.
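Conceptually, a PivotTable is just grouping plus aggregation. A minimal Python sketch (with made-up sales data) of summing amounts by region, which is what dragging Region into Rows and Amount into Values with a Sum aggregation does:

```python
from collections import defaultdict

# Hypothetical source rows, standing in for a worksheet range.
sales = [
    {"region": "West", "amount": 120},
    {"region": "East", "amount": 80},
    {"region": "West", "amount": 50},
]

# Group rows by region and sum the amounts.
pivot = defaultdict(int)
for row in sales:
    pivot[row["region"]] += row["amount"]

print(dict(pivot))  # {'West': 170, 'East': 80}
```

A slicer is then nothing more than filtering `sales` before the loop; the aggregation itself stays the same.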
Conditional formatting: Visualizing data without charts Excel also offers Conditional Formatting, a feature that helps you highlight important patterns without using charts. It automatically changes the look of cells based on their values, which makes it easier to spot trends, outliers, and key insights, especially when you’re working with large sets of data. Data visualization in Excel shouldn’t be boring. Instead of scrolling through rows of numbers, you can use colors, bars, or icons to make important data pop. For example, you could set high values to show in green, low values in red, and medium values in yellow. These visual cues help you quickly focus on what matters most. How to use conditional formatting? Highlight important numbers, e.g., color cells that are too high, too low, or meet a rule. Add mini bars or color scales to compare values easily. Add icons, like arrows, checkmarks, or flags, to show trends fast. Set custom conditions to format data your way. Best practices: When to use color scales, data bars, and icon sets Each type of Conditional Formatting suits different tasks. Color scales are great for showing ranges, like temperature or sales trends over time. Data bars give a quick visual comparison of numbers, making it easy to see which values are high or low at a glance. Icon sets are useful for visually categorizing data; for example, you could use arrows to show trends or symbols to mark important items. These tools make your data clearer, more engaging, and easier to interpret. When used correctly, they help ensure your data is easy to understand at a glance. Integrating external data for advanced Excel visualizations Think of visualizing data in Excel like using a supercharged notebook. Normally, you type in numbers and update them yourself, which can be a hassle.
But if you connect Excel to outside sources (like a database, a website, or an online service), it can pull in fresh data automatically. Your charts and reports update themselves, with no extra work needed! With data visualization in Excel, you can keep an eye on things like sales, stock prices, or customer trends as they change in real time. Plus, you can pull in data from different sources and see everything in one place, making it easier to spot patterns and make smart decisions. Methods to import data from SQL, APIs, and cloud databases Excel offers easy ways to import external data. For example, you can use Power Query to pull data from SQL databases, which store and organize your data for easy analysis. If you want up-to-the-minute data, you can link Excel to online sources like stock market updates or customer databases. This way, the latest info flows in automatically instead of you typing it in manually. For really big data, cloud tools like Google BigQuery or Azure keep things running smoothly without slowing Excel down. Once your data is in, you can use PivotTables, charts, and other tools to make sense of it. And if you set up automatic updates, your reports will always stay fresh without you having to do a thing. Use case: Combining multiple data sources into one Excel dashboard Excel data visualization tools are a good fit for many niches. Let’s say you need to track sales from different regions. With Excel, you can pull customer data from your CRM system, sales figures from a SQL database, and marketing data from a Google Analytics API, and bring it all together in one dashboard. With Power Query and PivotTables, you can clean, organize, and visualize all your data in one place. You can also add buttons and filters to let people easily sort through the data and focus on what matters most.
This makes it simple to break things down (like seeing sales by region or customer type) without messing with tables. Data visualization tools for Excel: Beyond built-in features Excel’s built-in charts are great, and many analysts rely on them alone. But if you want to take your data visuals even further, you can use extra tools. You can connect to real-time data, automate your analysis, and create more advanced reports like a professional visualizer. Just by adding these external plugins and tools, you get a Marvel superhero called Excel whose superpowers are data analysis and visualization. Let’s check out some powerful tools you can add to Excel: Devart’s ODBC Driver, Power Query, Power Pivot, Power BI, and a few free add-ins. Excel data visualization with Devart’s ODBC driver The [Excel ODBC Driver](https://www.devart.com/odbc/excel/) connects to databases like SQL Server, MySQL, and PostgreSQL, as well as cloud storage. You don’t need to manually copy and paste data because Excel pulls in live updates automatically. This way, your charts and dashboards always show the latest info without extra work. It’s especially useful for analysts dealing with big datasets from multiple sources. No more constant updates, just real-time data whenever you need it. You can [try out the Devart ODBC Driver with a free trial](https://www.devart.com/odbc/) to see how it works. Microsoft Power Query & Power Pivot Power Query helps you bring in data from different places, clean it up, and get it ready for analysis, all without writing a single line of code. This is quite handy for handling large amounts of information without hassle. Power Pivot handles complex calculations and custom formulas; it can work with millions of data rows and helps you build interactive dashboards. Together, Power Query and Power Pivot manage and analyze big data and turn Excel into a tool that competes with full business intelligence software.
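The import workflow these tools automate boils down to running a query against a database and handing the rows to Excel. A minimal stand-in using Python’s built-in sqlite3 module (as a placeholder for a real SQL source reached via ODBC or Power Query) that exports query results to a CSV file Excel can open; the table, column, and file names are made up for illustration:

```python
import csv
import sqlite3

# In-memory database standing in for a real SQL source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("West", 120.0), ("East", 80.0)])

# Run an aggregation query and export the rows to a CSV file Excel can open.
cursor = conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
with open("orders.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["region", "total"])
    writer.writerows(cursor)
conn.close()
```

With a live connection instead of a one-off export, Excel re-runs the query on refresh, which is exactly the "reports stay fresh without you doing a thing" behavior described above.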
Power BI integration with Excel Power BI takes your Excel data visuals to the next level by offering interactive dashboards and real-time analytics. You can create detailed reports in Power BI and connect them directly to your Excel data. This lets you tell a clearer story with your data and get more comprehensive insights. The easiest way to [connect Power BI to Excel](https://www.devart.com/odbc/excelonline/integrations/excelonline-powerbi-connection.html) is by using the Power BI Publisher add-in. It allows you to share your Excel reports within Power BI and keeps your data connected across platforms. Know businesses that need interactive, shareable dashboards that update in real time? It’s a great option for them! Free add-ins for data visualization What is the best way to visualize data in Excel? There are free add-ins that can improve Excel’s data visualization. For example: People Graph: turns raw data into infographics. Bing Maps: makes interactive maps from geographical data. ChartExpo: creates advanced charts that Excel doesn’t support natively. Colorizer: applies consistent color schemes to charts and tables. Zebra BI: adds intuitive AI visuals to financial reporting. These free data visualization tools for Excel help create better charts, dashboards, and reports with minimal effort. Best practices for effective data visualization in Excel Want your Excel charts to actually make sense? Here’s how to keep them clear and useful. Pick the right chart type: bar charts are great for comparisons, line graphs are best for trends over time, and pie charts help show proportions. Keep it simple and avoid clutter: too many labels, gridlines, or colors make it hard to focus on what matters. Stick to the key insights and skip the extra noise. Stay consistent: don’t mix too many colors or chart styles, as it can be distracting. Use a simple color scheme and the same fonts if you want everything to look clean and professional.
Use live data: if your numbers change often, connect Excel to an external source. This way, your charts update automatically, and you don’t have to worry about outdated info. These small changes can make a big difference, making your Excel data visualizations more attractive and reliable. Conclusion: Unlocking the full potential of Excel for data visualization Excel is really powerful, as you can see. However, you can make it even better with the right tools. Excel’s built-in features, advanced tools like Power Query and Power Pivot, and external integrations like Power BI and Devart’s ODBC Driver will take your data analysis to the next level. Experiment with different visualization techniques. Try PivotCharts, interactive dashboards, and more to find what works best for your data. The more you practice, the better your visual storytelling will become. For even more advanced data visualization, consider Devart’s ODBC Driver. It integrates real-time data into Excel, saves you time, and eliminates manual updates. It’s a miracle, isn’t it? Download [the free trial](https://www.devart.com/odbc/) today to make your Excel dashboards even better! FAQ What is Excel data visualization, and why is it important for data analysis? Excel data visualization refers to the use of charts, graphs, and other visual tools to represent data in a clear and insightful way. It helps identify trends, patterns, and correlations, making data-driven decisions more effective. How to visualize data in Excel using built-in charts and graphs? Excel provides various built-in chart types, including bar charts, pie charts, scatter plots, and pivot charts. You can create visualizations by selecting your data, navigating to the Insert tab, and choosing a suitable chart type. Advanced options like conditional formatting and sparklines can enhance the clarity of data representation. What are the best ways to visualize data in Excel for reports and dashboards?
To create impactful reports and dashboards, it’s crucial to choose the right type of visualization for the data at hand. PivotCharts and PivotTables are excellent for summarizing large datasets dynamically, while conditional formatting can help highlight important trends directly in data tables. Combining multiple chart types, such as bar and line charts, is useful for comparing different data series. Additionally, tools like slicers and timelines can add interactivity, allowing users to filter and adjust the data being presented. What are some Excel data visualization examples for business and financial analysis? In business and financial analysis, Excel data visualizations can take many forms. For example, sales performance dashboards can visually represent revenue trends, helping to identify the highest-performing products or regions. Financial statement visualizations might display income and expense trends, while customer segmentation analysis helps categorize customers by various criteria. Inventory tracking visuals can show stock levels and trends, making it easier to manage supply chains. Which Excel data visualization tools are best for creating interactive dashboards? Excel offers powerful tools for building interactive dashboards, such as the integration of Power BI for more advanced visualizations. Power Query and Power Pivot can be used to manage and analyze large datasets, while slicers and timelines help users interact with and filter data within dashboards. These features make it easy to create dynamic reports that update automatically based on changing data. How do I visualize data in Excel using real-time SQL database connections? To visualize data in Excel using real-time SQL database connections, you can link Excel to an external database through an ODBC driver. This allows you to import live data directly into your spreadsheets, ensuring your charts and reports are always up-to-date. 
Tools like Devart’s ODBC Driver make it possible to connect Excel with various databases such as MySQL, PostgreSQL, and SQL Server, allowing you to pull the latest data directly into your visualizations. Does Devart’s ODBC Driver support automating Excel data visualization workflows? Devart’s ODBC Driver supports the automation of Excel data visualization workflows by establishing real-time connections between Excel and SQL databases. This enables automatic updates of data without manual intervention, ensuring that your visualizations remain current. Can I create dashboards in Excel using data from MySQL, PostgreSQL, or SQL Server with Devart’s ODBC Driver? Yes, Devart’s ODBC Driver allows you to create dashboards in Excel using live data from databases like MySQL, PostgreSQL, and SQL Server. This direct integration ensures that your dashboards are always up-to-date, reflecting real-time changes in the data. Tags [odbc](https://blog.devart.com/tag/odbc) [ODBC driver for Excel](https://blog.devart.com/tag/odbc-driver-for-excel) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow. 
"} {"url": "https://blog.devart.com/execute-process-task-in-ssis.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SSIS Components](https://blog.devart.com/category/products/ssis-components) SSIS Execute Process Task Tutorial with Examples By [dotConnect Team](https://blog.devart.com/author/dotconnect) January 19, 2022 [0](https://blog.devart.com/execute-process-task-in-ssis.html#respond) 13544 What’s the big deal with SSIS Execute Process Task? Among the many [SSIS components](https://www.devart.com/ssis/ssis-components-and-tools.html), Data Flow and SQL tasks are enough for simple tasks.
Using SSIS to compare two tables is one example. But sometimes those task components alone are not enough. Running a batch file, a PowerShell script, or an executable file as the next step often makes sense, so why not do that from [SSIS](https://www.devart.com/ssis/what-is-ssis.html)? The [SSIS Execute Process Task](https://www.devart.com/ssis/ssis-tasks.html) can run these useful programs for you. This article discusses how to use the Execute Process Task in SSIS with examples. You’ll learn the following:

- What the Execute Process Task is.
- The different Execute Process Task properties and arguments.
- Running batch files with the SSIS Execute Process Task.
- Using the SSIS Execute Process Task with PowerShell.
- Running executable (.EXE) files with the SSIS Execute Process Task.
- Unzipping Zip files with the SSIS Execute Process Task.
- Finally, how to make your SSIS Execute Process Task flexible by using variables.

Before we begin, the following tools are used in the examples:

- SQL Server Integration Services 2019
- Visual Studio 2019
- Windows 10 21H2

Now, let’s dive in.

What is the Execute Process Task in SSIS?

The SSIS Execute Process Task runs apps or batch files as part of the SSIS package workflow. Though you can run any executable app, batch file, or PowerShell command, you typically use it for data integration purposes. An example is unzipping a Zip file that contains CSV files, which another SSIS data flow component then opens and processes. You use this component by dragging the Execute Process Task from the toolbox into your SSIS package’s Control Flow. There are a few settings you need to know to configure it properly. To see them, right-click the Execute Process Task component; the Execute Process Task Editor window will appear, as shown in Figure 1.

Figure 1. The Execute Process Task Editor window.

Let’s describe each property.
Execute Process Task Properties

- RequireFullFileName: When True, the task will fail if the app is not found at the specified path.
- Executable: The app or batch file you want to run. You can also specify the full path along with the app or batch file name.
- Arguments: Apps and batch files can have parameters or arguments. You pass the values from the SSIS package here. The next section discusses this further with an example.
- WorkingDirectory: The exact working folder location for your app’s input/output.
- StandardInputVariable: Sometimes a console app requires input from the user, but SSIS packages typically run unattended. Use this property to pass the value a user would enter. With a variable supplying that value, execution continues without user input. This is explained further with an example in the next section.
- StandardOutputVariable: You may want to get the output from the app and handle the values in your SSIS package. Enter your preferred variable here so you can work with the value in a later task.
- StandardErrorVariable: An app can go wrong, so you need to capture the error to handle the problem. The app can pass the error into the variable you specify here.
- FailTaskIfReturnCodeIsNotSuccessValue: Defaults to True. When True, the Execute Process Task will fail if the app’s return code is not the same as the value of SuccessValue.
- SuccessValue: Defaults to zero, but the app may return a different value, like 1. Specify the success value here. It is also used as the basis for FailTaskIfReturnCodeIsNotSuccessValue.
- TimeOut: The number of seconds until time-out. Zero means indefinite. When you provide a non-zero value, you can let the app be terminated by setting TerminateProcessAfterTimeOut to True.
- WindowStyle: Possible values are Normal, Maximized, Minimized, or Hidden. This indicates how the app’s window will display when invoked.
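Several of these properties mirror plumbing that any process-spawning API exposes. The following is a rough Python analogy (not SSIS itself, and the child process is a made-up failing app) of how SuccessValue, FailTaskIfReturnCodeIsNotSuccessValue, and TimeOut relate to a child process’s exit code:

```python
import subprocess
import sys

# Analogy of SuccessValue / FailTaskIfReturnCodeIsNotSuccessValue:
# run a child process, wait up to a time-out, then compare its exit
# code against the expected success value.
SUCCESS_VALUE = 0  # the SSIS default

# The child exits with code 1, simulating a failing app.
proc = subprocess.run(
    [sys.executable, "-c", "raise SystemExit(1)"],
    timeout=10,  # analogue of the TimeOut property (seconds)
)
task_failed = proc.returncode != SUCCESS_VALUE
print(task_failed)
```

If the return code matched SUCCESS_VALUE, `task_failed` would be False, just as the SSIS task succeeds when the app’s return code equals SuccessValue.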
Execute Process Task Arguments

Arguments passed to the app or batch file are specified in a single string value. This string can be entered as a literal value or through an argument variable. It accepts single or multiple arguments within the string, and you may include double quotes when needed. Let’s say you want to copy a Zip file to another folder:

CMD.EXE /C COPY PAY202201.ZIP C:\WorkingFolder

The executable is CMD.EXE. The arguments are the rest of the command text. So, in the Execute Process Task Editor, enter /C COPY PAY202201.ZIP C:\WorkingFolder for Arguments.

Arguments vs StandardInputVariable – What’s the Difference?

We’re going to use an example to answer this. But first, see the settings in Figure 2.

Figure 2. Execute Process Task with both Arguments and StandardInputVariable.

As you can see in Figure 2, Arguments and StandardInputVariable are not the same. So, what’s the functional difference? Arguments are for app parameters. Meanwhile, StandardInputVariable is for user input within the app. Command prompt commands like DATE and TIME wait for user input: without the /T parameter, the DATE command waits for the user to enter a new system date. Now consider the example above. It invokes the CHOICE console command, which asks if you want coffee or not; you can answer either Y or N. The value of Arguments is the CHOICE command and its parameter values. Meanwhile, the StandardInputVariable is User::response. If User::response equals Y, the SSIS task continues without waiting for user input, as if a user had pressed Y. But without the StandardInputVariable, the SSIS task would pause until Y or N is pressed, and we don’t want that in an unattended execution. Here’s the point: don’t put parameter values in StandardInputVariable, and don’t put supposed-to-be user input under Arguments. It won’t work. Note, though, that this example is for console apps with one user input.
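The same split exists in any process-spawning API, which can make the distinction easier to see. Here is a minimal Python sketch (an analogy only, not SSIS; the argument value is invented for the example): the command-line argument plays the role of Arguments, while the text fed to stdin plays the role of StandardInputVariable.

```python
import subprocess
import sys

# The command-line argument plays the role of the Arguments property;
# input= plays the role of StandardInputVariable (canned stdin text).
child_code = "import sys; print('arg:', sys.argv[1]); print('stdin:', input())"
result = subprocess.run(
    [sys.executable, "-c", child_code, "coffee"],  # "coffee" is an argument
    input="Y\n",          # the canned user response, like User::response = Y
    capture_output=True,
    text=True,
)
print(result.stdout)
```

The child receives "coffee" as a parameter and "Y" as if a user had typed it, so it never blocks waiting for a keyboard, which is exactly what StandardInputVariable buys you in an unattended SSIS run.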
By the looks of the property setting, it only accepts one input.

How to Run .EXE Files from the Command Line with the SSIS Execute Process Task

Earlier, we ran the command prompt through the SSIS Execute Process Task. Let’s have another example to clarify it further: encrypting a folder and the files within it. Check out Figure 3.

Figure 3. Sample configuration settings for running an executable file.

The executable used is CIPHER.EXE. It is pre-installed on Windows 10 systems but works only in Windows 10 Pro and Enterprise. The /E argument tells Cipher to encrypt the files and folder specified in WorkingDirectory. That’s easy. Next, let’s try it with batch files.

How to Run a Batch File with the SSIS Execute Process Task

At this point, the idea should be easy, but let’s try the Execute Process Task in SSIS with batch files. Consider Figure 4 below for the batch file example.

Figure 4. Sample batch file.

Our example in Figure 4 does the following for demo purposes:

- Accepts 2 batch file parameters and displays them.
- Displays the Windows version.
- In the end, pauses and waits for any key press so we can take a screenshot of the result. You wouldn’t do this in production batch files, though.

Then, see Figure 5 for the settings.

Figure 5. Sample configuration settings for running a batch file in SSIS.

Figure 5 shows the settings for running a batch file with 2 parameters. The Executable is ExecBatch.bat with the full path indicated. The Arguments are “Tada!” and “Surprise!”. See a screenshot of the result in Figure 6.

Figure 6. The output of our sample batch file.

The batch file functioned as expected after executing the task.

Run a PowerShell Script from the SSIS Execute Process Task

If SSIS can run apps, SSIS and PowerShell can work together. So, we’re going to try executing a PowerShell script block from within the SSIS Execute Process Task. The concept is basically the same.
In our example below, SSIS runs a PowerShell script block with arguments to search for a string within a text file:

- Executable: powershell.exe
- Arguments: -NoProfile -ExecutionPolicy ByPass -Command "if (select-string -pattern BOMB -path AUDIT202201.txt) {exit 0} else {exit 1}"
- WorkingDirectory: c:\users\edwin\documents\
- FailTaskIfReturnCodeIsNotSuccessValue: False

The text file is AUDIT202201.txt, and we look for the word “BOMB” inside that file. The script returns 0 if the word exists; otherwise, 1. The -NoProfile and -ExecutionPolicy arguments are supplied so the script won’t break if server configuration changes would otherwise affect it. Figure 7 shows how the whole Control Flow and the Execute Process Task Editor window look.

Figure 7. Executing a PowerShell script block from the SSIS Execute Process Task.

The working directory setting ensures that PowerShell can find the file in that folder. Also, setting FailTaskIfReturnCodeIsNotSuccessValue to False ensures that the whole package won’t fail if the string is not found. The result leads to Show Failure, as the search string does not exist in my copy of the text file.

How to Unzip .ZIP Files Using the Execute Process Task

Another data integration task is unzipping a file that contains data files like CSV and Excel files. Let’s try this using the bsdtar archiving tool, TAR.EXE. It is pre-installed in Windows 10 Build 17063 or newer, so there’s no need to download a Zip/Unzip tool like 7-Zip to try this example yourself. First, drag the Execute Process Task to the Control Flow of your SSIS package in Visual Studio. Then, check out Figure 8 for the property settings.

Figure 8. Configuration settings for unzipping a Zip file using TAR.EXE.

Extracting a Zip file needs the -xf argument and the Zip file name itself. You also need to specify the WorkingDirectory so the files are extracted there.
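What the task does here can be reproduced outside SSIS in a few lines. The sketch below uses Python’s standard zipfile module as a stand-in for TAR.EXE; the archive and folder names are made up for the example:

```python
import os
import zipfile

# Build a small Zip archive to stand in for the one the task extracts
# (file and folder names here are hypothetical).
os.makedirs("WorkingFolder", exist_ok=True)
with zipfile.ZipFile(os.path.join("WorkingFolder", "data202201.zip"), "w") as zf:
    zf.writestr("pay202201.csv", "id,amount\n1,100\n")

# Equivalent of running `tar -xf data202201.zip` with WorkingDirectory
# pointed at WorkingFolder: extract the archive's files into that folder.
with zipfile.ZipFile(os.path.join("WorkingFolder", "data202201.zip")) as zf:
    zf.extractall("WorkingFolder")

print(sorted(os.listdir("WorkingFolder")))
```

After extraction, the CSV sits next to the archive in the working folder, ready for a downstream data flow component to pick up.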
SSIS Execute Process Task with Variables

So far, the configurations you saw use hardcoded values. This way, you can see the samples straight and clear. But this time, let’s add a little flexibility by using [SSIS variables](https://docs.microsoft.com/en-us/sql/integration-services/integration-services-ssis-variables?view=sql-server-ver15). Some of the properties of the Execute Process Task in SSIS require variables. This includes StandardInputVariable, which you saw in an earlier section, as well as StandardOutputVariable and StandardErrorVariable. All of these are useful for creating a defensive control flow design. You can set a variable using the SSIS Script Task based on factors like year and month, or you can use expressions if your desired output is simpler. Let’s change the configuration for unzipping Zip files from the previous section.

Create the Variables

First, let’s have the variables. We need 4 variables, as seen in Figure 9.

Figure 9. List of variables for the modified unzip file task.

Instead of using a script, we used expressions, as seen in Figure 9. With expressions, you can use functions and operators to get your desired value. You can observe the following from the above variables:

- A fixed working folder in the workingFolder variable.
- A string that holds the year and month (yearMonthString). We need this to form the values of the next variables. The expression is defined as (DT_WSTR, 4)YEAR(GETDATE()) + RIGHT("00" + (DT_WSTR, 2)MONTH(GETDATE()), 2). DT_WSTR converts numeric values, like month and year, to strings. Meanwhile, YEAR, MONTH, and GETDATE are date functions used to get the year, month, and current date, respectively.
- targetDataFolderName, which holds the flexible target folder name to extract the files to. This is based on the yearMonthString.
The expression is defined as "data" + @[User::yearMonthString]. Lastly, zipFullPathFileName holds a flexible name for the Zip file with the full path; its expression is defined as @[User::workingFolder] + "data" + @[User::yearMonthString] + ".zip". Now create the user variables as you see them in Figure 9 above. We will use them in the next section.

Set the Properties to Variables

To use these variables in the SSIS Execute Process Task, check out Figure 10 and follow the steps after it.

Figure 10. Assigning variables to properties of the SSIS Execute Process Task.

Here are the steps to configure the task as in Figure 10:

1. Click Expressions in the left pane of the Execute Process Task Editor.
2. Under Misc, click the ellipsis button of the Expressions property.
3. Add and select the Arguments property. Then, click the ellipsis button. The Expression Builder window will open.
4. Type the expression "-xf " + @[User::zipFullPathFileName].
5. Click the Evaluate Expression button to check the syntax. If there’s an error, correct it.
6. Click the OK button to close the Expression Builder window.

Then, continue with the WorkingDirectory property. Here’s the next thing to do:

1. Add another property and select WorkingDirectory. Then, click the ellipsis button. The Expression Builder window will appear again.
2. This time, set the expression to @[User::workingFolder] + @[User::targetDataFolderName].
3. Click the OK button to close the Expression Builder window.
4. Back on the Process pane of the Execute Process Task Editor, change the Executable property to C:\Windows\system32\tar.exe.
5. Finally, click the OK button to close the Execute Process Task Editor window.

You can now perform a test run. It should behave like the example in the previous section. Our design is a simple setup with one task component. It also doesn’t involve capturing the app’s output value or any error.
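To sanity-check what the variable expressions above produce, here is a rough Python equivalent. The working folder path is hypothetical, and GETDATE() is replaced by a fixed date so the output is reproducible:

```python
from datetime import date

# Rough Python equivalent of the SSIS variable expressions, with a fixed
# date standing in for GETDATE() (variable names mirror the package).
working_folder = "C:\\WorkingFolder\\"   # workingFolder (hypothetical path)
today = date(2022, 1, 19)

# (DT_WSTR, 4)YEAR(GETDATE()) + RIGHT("00" + (DT_WSTR, 2)MONTH(GETDATE()), 2)
year_month_string = f"{today.year:04d}{today.month:02d}"

target_data_folder_name = "data" + year_month_string     # targetDataFolderName
zip_full_path_file_name = (                              # zipFullPathFileName
    working_folder + "data" + year_month_string + ".zip"
)
print(year_month_string, target_data_folder_name, zip_full_path_file_name)
```

The RIGHT("00" + …, 2) trick in the SSIS expression pads single-digit months to two characters, which the `:02d` format specifier does here.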
But the point of using variables inside the SSIS Execute Process Task is the same, and you can apply this to other properties as well.

Conclusion

That’s it. The SSIS Execute Process Task is easy to understand and configure, and you will find it a valuable component in your data integration projects. If you have questions, please fire away in the comments section below.

SQL Query History in SQL Complete
By [dbForge Team](https://blog.devart.com/author/dbforge), May 2, 2019

[dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) includes a whole range of options for powerful optimization and a great performance increase. SQL history is an important part of this list; it is included in the Standard and Express editions. The feature stores a history of queries run on a SQL Server from SSMS for a certain period. With it, the user can easily view, edit, and search for queries that were executed against the database. However, that describes only the basic functionality. Let’s discover what else this feature can do.

Main Features

Success verification check

This option allows checking whether a SQL statement was executed successfully or not. Each query in the table has a field with an icon that displays the result of the execution. You can also sort the table entries by this parameter.

Log output range

Query history provides the ability to set the date interval for which queries are shown. You can view the entire list for any period; just set the time range you need.

Group by column

This option allows grouping the completed statements by any desired column from the table.
To do this, just drag the selected column to the panel above the list. SQL Query History will display the results immediately.

Open in a new SQL window or copy to the clipboard

Each user can access any SQL statement from the archive to view its code and make changes afterward. Just right-click the desired query and select the “To New SQL Window” or “To Clipboard” option.

Statements sorting

The feature provides various ways of sorting executed statements. You can sort the archive by any column present in the table; just click the header of the required one.

Detailed item info

The executed-queries view has a convenient interface for getting all the necessary information about each statement. You can sort the info by each parameter field as well as customize the displayed columns: right-click the Query History grid, select Column Chooser, and then clear or select the required checkboxes to customize the layout.

Search filter

The feature allows filtering executed statements by a text key. This option is very useful for DBAs who often work with a large number of queries. Simply enter characters in the “Search” field and Query History will immediately display the matching results.

Conclusion

Executed SQL statements history is a powerful feature that provides a whole set of tools for working with already executed statements. It allows DBAs to monitor historical queries that have been run from SSMS in order to work with a database backlog or its backups.
Execution Notifications and Transaction Reminder in SQL Complete
By [dbForge Team](https://blog.devart.com/author/dbforge), July 16, 2019

dbForge SQL Complete is an advanced tool providing a batch of code
completion instruments that free its users from memorizing and typing complex parts of code. The tool has won recognition from the public and has been constantly developed and enhanced ever since. Two popular features, Execution Notifications and Transaction Reminder, appeared in recent SQL Complete updates. When developers are overloaded with tasks, it is important to know quickly when a query completes and whether it succeeded, and it is crucial to be reminded that a certain transaction has not yet been committed. To support these important use cases and improve the visibility of the development process, the Execution Notifications and Transaction Reminder features were added to SQL Complete.

Execution Notifications

The feature is particularly useful when developers need to execute long-running queries and switch to other windows within SSMS or even to other applications. You can enable notifications and change settings on the Notifications tab of the Options menu. You can also set a query execution time limit to get notified about queries that exceed it. By default, notifications are on and the notification duration is 7 seconds. The notification contains the following information:

- The name of the document
- The duration of the query
- The execution status: query completed with errors, query executed successfully, or query canceled

Note that if a query runs longer than 23:59:59, days are added to the time in the notification. You can close the notification window by clicking the cross button in the upper-right corner. If you click the notification window, the SQL document will open.

Transaction Reminder

Envision quite a common situation: you open a transaction, run some statements, and then for some reason forget to commit it. The Transaction Reminder feature of SQL Complete solves that problem.
With this feature enabled, whenever there are open transactions during query execution, a reminder pops up informing you about the number of uncommitted transactions. By default, the notifications are on. To disable them, uncheck Notify if execution contains open transactions on the Notifications tab of the Options menu. The reminder alerts you that a transaction has not yet been completed when you execute a script by pressing F5 in SSMS. The Transaction Reminder window contains the document name and the number of open transactions. You can close the window by clicking the cross button in the upper-right corner. If you click the notification window, the SQL document will open.

Conclusion

Devart launched the Execution Notifications and Transaction Reminder features in SQL Complete v5.8 and has enjoyed seeing customers put them to use in their everyday work. The notification and reminder system assists users in their routine tasks and aims to optimize their work by preventing specific incidents.
Expand Your Connectivity With New ODBC Drivers for Dynamics 365 Business Central, Trello, and ClickUp
By [DAC Team](https://blog.devart.com/author/dac), March 13, 2025

Our team is excited to announce the release of three new Devart ODBC Drivers that provide easy access to data stored in Dynamics 365 Business Central, Trello, and ClickUp using the ODBC standard. These drivers are robust solutions that allow developers, analysts, and business intelligence professionals to connect to live data on cloud platforms and databases directly from any ODBC-compliant application. Devart ODBC Drivers provide reliable connectivity to a wide range of databases and cloud applications and boast features like connection pooling, metadata caching, and advanced security options. These ensure efficient data access, improved performance, and secure integration for developers. With this release, Devart is expanding its [ODBC Driver lineup](https://www.devart.com/odbc/universal-bundle/) by adding three new drivers, broadening integration capabilities, and providing even more connectivity options.

ODBC Driver for Dynamics 365 Business Central

The Dynamics 365 Business Central ODBC Driver enables connections to live Dynamics 365 Business Central data from any application that supports ODBC connectivity. It allows you to read, write, and update data like Items, Sales Orders, and Purchase Orders through a standard ODBC interface. With support for SQL-92 operations, complex JOINs, and data aggregation, the driver offers seamless integration with various data integration tools and applications.
ODBC Driver for Trello

The ODBC Driver for Trello allows ODBC-compatible applications to connect directly to Trello via HTTPS and interact with Trello data, including Lists, Cards, and Boards, as if it were a custom database. With advanced metadata querying and extended SQL syntax, you can work with Trello data in real time using SQL-92-compatible SELECT statements.

ODBC Driver for ClickUp

The ODBC Driver for ClickUp enables SQL-like access to live ClickUp data from any application that supports the ODBC interface. It comes as a standalone installation file that does not require users to deploy or configure any additional software, such as a vendor library. The driver is fully compatible with third-party data analysis tools like Microsoft Excel and Tableau and integrates with a variety of IDEs and platforms, including Visual Studio and RAD Studio.

Test-Drive Our New ODBC Drivers Today!

Start your 30-day free trial today and experience seamless data connectivity with our powerful ODBC Drivers! Unlock full access and see how easy it is to integrate your data across various platforms. Interested in more ODBC Drivers? Our [ODBC Integration Universal Bundle](https://www.devart.com/odbc/universal-bundle/) offers all Devart ODBC Drivers in one comprehensive package, enabling you to integrate data from 87 databases and cloud applications into ODBC-compliant reporting, analytics, BI, and ETL tools. The Universal Bundle lets you save over 95% compared to purchasing the drivers separately!
Explanation of Microsoft POCO and Self-Tracking Templates
By [dotConnect Team](https://blog.devart.com/author/dotconnect), July 13, 2011

If a developer needs to introduce certain changes, adjust the behavior of Microsoft’s POCO or Self-Tracking templates for an EntityFramework model, or write a custom template from scratch based on Microsoft’s standard templates, he or she will have to learn how these templates work. When studying Microsoft’s POCO and Self-Tracking generation templates, I came across significant difficulties in understanding exactly how they work, since they are far from trivial. In this blog article, I shall endeavor to describe their basic functions, explain some points that might not be so easy to understand, and cover the structure of the templates. Hopefully, this information will prove useful and save time and effort for all developers who need to work with Microsoft’s generation templates for EntityFramework models. Microsoft’s standard templates are written in the T4 template language. You can find their description, procedures for adding them, and a small customization sample here: [http://msdn.microsoft.com/en-us/gg558520](http://msdn.microsoft.com/en-us/gg558520).
That article provides the basic information on T4 generation templates for Entity Framework that is required for their use, as well as details on how to work with metadata, generate several files, and some other aspects. Now let's see how Microsoft's POCO and Self-Tracking templates are designed and how they can be used in application development.

Template Files

Both templates consist of two files: .Context.tt and .tt. The .Context.tt template file generates the ObjectContext code of the Entity Framework model, while .tt generates classes for entities and the model's complex types, as well as additional classes required for the generated model to work properly. The POCO .tt template also generates a .cs file that contains the FixupCollection class required for change notification in POCO models. The Self-Tracking template likewise generates additional files containing the classes and interfaces the model needs to work properly (for more information, see [http://msdn.microsoft.com/en-us/library/ff477604.aspx](http://msdn.microsoft.com/en-us/library/ff477604.aspx)).

Classes, Functions and Structures Most Frequently Used in Templates

The WriteHeader function generates the header of each template's output file; it is responsible for emitting the same boilerplate at the beginning of each file: comments that describe the file and the declarations of the namespaces in use. The CodeRegion class is used in templates to insert regions into generated code and to manage code indentation. Its Begin(string) and End() methods emit #region and #endregion into the generated C# code, and the region is omitted if the template generates no code inside it. The name of the region is passed to Begin(string) as a parameter, and the class itself calculates the indentation for the current region. Templates can also use CodeRegion.GetIndent(int), the static method of this class, to obtain a string of spaces equivalent to the desired number of indents.
The result returned by CodeRegion.GetIndent(int) is passed to the PushIndent(string) template function. This function is responsible for setting the indentation of generated code: it receives the indent value that will be prepended as a prefix to each line of the template's output. Thus, we can manage the formatting of generated code. The PopIndent() template method resets the current indent value and returns the previous one, to be used for formatting purposes. The EntityFrameworkTemplateFileManager class is used in the template to manage output files. Its functionality is well described in the MSDN article (see the link at the beginning of this article), so I will not dwell on this topic any further.

CodeGenerationTools Class

This class is used most often in templates. At the beginning of the template, a local variable named "code" is created; this variable is then used frequently throughout the template. Escape, its most frequently used function, has many overloads and transforms the string representation of an object into a form that is safe for the compiler. For example, suppose your model has an entity named "class", so "class" becomes the name of the generated class. It is impossible to use the "class" identifier directly, since the C# compiler will not process code like public class class. To use the reserved word "class" as the name of a class, it has to be transformed into the compiler-legal form public class @class. The Escape function checks whether the identifier needs to be transformed into the safe form, performs the transformation if necessary, and always returns a legal identifier. The EscapeNamespace function works the same way and always returns an identifier that can legally be used as the name of a namespace.
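The behavior of Escape can be illustrated with a simplified stand-in (a minimal sketch, not the actual template code; the real helper covers the full C# keyword list and other cases):

```csharp
using System;
using System.Collections.Generic;

static class IdentifierHelper
{
    // A tiny subset of C# reserved words, for illustration only.
    static readonly HashSet<string> Keywords =
        new HashSet<string> { "class", "namespace", "event", "object" };

    // Mirrors the idea behind CodeGenerationTools.Escape: prefix
    // reserved words with '@' so they become legal identifiers.
    public static string Escape(string name) =>
        Keywords.Contains(name) ? "@" + name : name;
}

class Program
{
    static void Main()
    {
        Console.WriteLine(IdentifierHelper.Escape("class"));    // @class
        Console.WriteLine(IdentifierHelper.Escape("Customer")); // Customer
    }
}
```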
The second most frequently used function of this class, FieldName, returns the name of the internal backing field for a class member; this name is formed from the class member name with the prefix "_". The AbstractOption function checks whether the class is defined as abstract in the model metadata and, if so, returns the compiler-legal abstract modifier. The CreateLiteral(object value) function transforms an object into the source-code string that initializes it. For example, if a value of the System.Guid type is passed to the function, it returns a string of the form new Guid("..."). This function is used to generate the code that initializes the default values of entity properties, if default values are defined in the metadata. I will not describe the other functions of the class, since they are seldom used and intuitively understandable.

Accessibility Class

This class is also used in templates often enough; it is a set of static functions that return a compiler-legal representation of an object's access modifier.

Other Structures

When working with navigation properties, one often comes across the construct ((AssociationType)navigationProperty.RelationshipType).IsForeignKey, which returns True if the association has a referential constraint. The navigationProperty.ToEndMember.GetEntityType() call returns the entity type referenced by the navigation property. The GetDependentProperties() method gets the collection of properties on the dependent end of the referential constraint for the specified navigation property, and the GetPrincipalProperties() method gets the collection of properties on its principal end.
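The helpers described above can be approximated in a few lines (an illustrative sketch; the real FieldName also adjusts casing, and CreateLiteral handles many more types):

```csharp
using System;

static class CodeGenHelpers
{
    // Mirrors FieldName: the backing-field name is the member name
    // with the "_" prefix (simplified).
    public static string FieldName(string memberName) => "_" + memberName;

    // Mirrors CreateLiteral for a couple of types: turn a runtime value
    // into the C# source text that would re-create it.
    public static string CreateLiteral(object value) => value switch
    {
        Guid g   => $"new Guid(\"{g}\")",
        string s => $"\"{s}\"",
        int i    => i.ToString(),
        _        => value.ToString()
    };
}

class Program
{
    static void Main()
    {
        Console.WriteLine(CodeGenHelpers.FieldName("FirstName")); // _FirstName
        Console.WriteLine(CodeGenHelpers.CreateLiteral(Guid.Empty));
    }
}
```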
Peculiarities of the .Context.tt Template

The .Context.tt template contains no aspects that have not been described above; it generates the rather trivial code of the context class, which includes ObjectSet properties and function imports. For Self-Tracking generation, this template also contains code that produces the additional file .Context.Extensions.cs:

```
fileManager.StartNewFile(Path.GetFileNameWithoutExtension(Host.TemplateFile) + ".Extensions.cs");
BeginNamespace(namespaceName, code);
WriteApplyChanges(code); // generates the main content of the file
EndNamespace(namespaceName);
```

Peculiarities of the .tt Template

The .tt template is significantly more complex. It is worth saying from the outset that one should not be dismayed by the large volume of code in the Self-Tracking template – this code is not as difficult as it might seem at first glance. The code that generates the additional .cs file is placed at the very beginning of the template. In the POCO template, it looks like this:

```
WriteHeader(fileManager);
BeginNamespace(namespaceName, code);
WriteCustomObservableCollection(); // generates the main content of the file
EndNamespace(namespaceName);
```

The generated file contains the FixupCollection class that is required for change notification in POCO models. In the Self-Tracking template, it looks like this:

```
WriteHeader(fileManager);
BeginNamespace(namespaceName, code);
WriteObjectChangeTracker();
WriteIObjectWithChangeTracker();
WriteCustomObservableCollection();
WriteINotifyComplexPropertyChanging();
WriteEqualityComparer();
EndNamespace(namespaceName);
```

The generated file contains all the classes and interfaces required for the model to work properly. For more information on the subject, see [http://msdn.microsoft.com/en-us/library/ff477604.aspx](http://msdn.microsoft.com/en-us/library/ff477604.aspx).

Entity Generation

The template iterates over the model's entities, generating a class for each entity.
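For orientation, a class generated for an entity is organized into regions roughly like this (an illustrative skeleton, not verbatim template output; entity and property names are made up):

```csharp
using System;
using System.Collections.Generic;

// Illustrative skeleton of a generated POCO entity class.
public class Employee
{
    #region Primitive Properties
    public int Id { get; set; }
    public string Name { get; set; }
    #endregion

    #region Navigation Properties
    public ICollection<Project> Projects { get; set; }
    #endregion
}

public class Project { }

class Program
{
    static void Main()
    {
        var e = new Employee { Id = 7, Name = "Ann", Projects = new List<Project>() };
        Console.WriteLine($"{e.Id} {e.Name}"); // 7 Ann
    }
}
```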
The generation of the class definition takes possible inheritance into account. Class generation is divided into the following regions: Primitive Properties, Complex Properties (properties whose type is complex), Navigation Properties, and Association Fixup. For the Self-Tracking template, a ChangeTracking region is also generated.

Primitive Properties Region

The Primitive Properties region contains the generated entity properties of simple (non-complex) types. When a default value is set for a property, it is used to generate the initialization code, which is produced by the CreateLiteral method of the CodeGenerationTools class. If the property is a foreign key property – that is, if there is a navigation property referencing the master class of the entity, and the generated primitive property participates in the referential constraint of that navigation property's association – additional code is generated in the property's setter to keep the association fixed up. In the Self-Tracking template, the IsOriginalValueMember(EdmProperty edmProperty) method of the OriginalValueMembers class returns True if the ConcurrencyMode of the generated property is set to Fixed or if the generated property is a foreign key property.

Complex Properties Region

The Complex Properties region contains the generated entity properties of complex types. For the class field, the POCO template generates initialization code with a new value of the complex type. The Self-Tracking template does not do this, since in the code it generates, the complex type field is initialized in the setter of the corresponding property.

Navigation Properties Region

The Navigation Properties region contains the generated navigation properties of the entity.
Depending on the multiplicity of the navigation property, the generated property contains either an instance of the master class of the entity or a collection of detail classes.

Peculiarities of MetadataTools Class Methods for Self-Tracking Templates

When generating a navigation property with Multiplicity = Many, the IsCascadeDeletePrincipal(NavigationProperty navProperty) method of the MetadataTools class returns True if the navigation property's OnDelete action has the Cascade value, or if the collection of properties on the principal end of the referential constraint for the specified navigation property contains at least one entity key property. Likewise, the IsPrincipalEndOfIdentifyingRelationship(AssociationEndMember associationEnd) method of the MetadataTools class returns True if the collection of properties on the dependent end of the referential constraint for the specified navigation property contains at least one entity key property.

ChangeTracking Region in the Self-Tracking Template

The ChangeTracking region, which is available only in the Self-Tracking template, contains the generated ChangeTracker property and the methods that deal with relationship synchronization logic. In this region, it is worth paying attention to the template's own IsSaveReference(MetadataTools tools, NavigationProperty navProperty) method. It returns True if this is a foreign key association that was added without adding foreign key properties, that is, if a referential constraint is not used in its mapping.

Association Fixup Region

The Association Fixup region contains the generated methods used to update associations and navigation properties when changes are introduced. Beyond the functions described above, the remaining aspects of this region are few and intuitively understandable.
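The interplay of the Navigation Properties and Association Fixup regions can be sketched as follows (a simplified one-to-many reconstruction with made-up entity names; real generated code also routes collection changes through FixupCollection for change notification):

```csharp
using System;
using System.Collections.Generic;

public class Customer
{
    public List<Order> Orders { get; } = new List<Order>();
}

public class Order
{
    private Customer _customer;

    // Navigation Properties region: the setter triggers fixup so that
    // both ends of the association stay in sync.
    public Customer Customer
    {
        get { return _customer; }
        set
        {
            if (!ReferenceEquals(_customer, value))
            {
                var previous = _customer;
                _customer = value;
                FixupCustomer(previous);
            }
        }
    }

    // Association Fixup region: update the other end of the association.
    private void FixupCustomer(Customer previous)
    {
        if (previous != null) previous.Orders.Remove(this);
        if (_customer != null && !_customer.Orders.Contains(this))
            _customer.Orders.Add(this);
    }
}

class Program
{
    static void Main()
    {
        var c = new Customer();
        var o = new Order { Customer = c };
        Console.WriteLine(c.Orders.Count); // 1
        o.Customer = null;
        Console.WriteLine(c.Orders.Count); // 0
    }
}
```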
Complex Type Generation

After generating entities, the iteration proceeds to the model's complex types, for which classes are generated as well. This includes generating the class definition along with its Primitive and Complex properties. For the Self-Tracking template, a ChangeTracking region is additionally generated. Hopefully, this article will help you master Microsoft's standard POCO and Self-Tracking templates for generating Entity Framework models.

Exploring Dependency Injection in ASP.NET 6

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), October 5, 2023

Introduction

Both Dependency Injection (DI) and Inversion of Control (IoC) are widely used architectural patterns in software engineering for building applications that are loosely coupled, testable, scalable, and maintainable. Dependency injection is a first-class citizen in ASP.NET Core: support for it is built into the framework, and you can leverage it to build applications that are loosely coupled, testable, and extensible. This article discusses Dependency Injection, why it is important, its types, and how it can be implemented in ASP.NET Core applications. It builds a simple application to demonstrate the concepts covered, using PostgreSQL as the database to store data and dotConnect for PostgreSQL as the data provider.
In this article, we'll connect to PostgreSQL using [dotConnect for PostgreSQL](https://www.devart.com/dotconnect/postgresql/download.html), a high-performance, enhanced data provider for PostgreSQL that is built on top of ADO.NET and can work in both connected and disconnected modes.

Pre-requisites

You'll need the following tools to follow the code examples:

- Visual Studio 2022 Community Edition
- PostgreSQL
- dotConnect for PostgreSQL

You can download PostgreSQL from here: [https://www.postgresql.org/download/](https://www.postgresql.org/download/)

You can download a trial version of dotConnect for PostgreSQL from here:

What Are We Building Here?

In this article, we'll explore the Inversion of Control (IoC) and Dependency Injection (DI) architectural principles and examine how we can implement them in ASP.NET 6. Here are the steps we'll follow throughout this article:

- Gain an understanding of the IoC and DI principles and why they are important
- Create an ASP.NET 6 Core Web API project in Visual Studio 2022
- Add the Devart.Data.PostgreSql NuGet package to the project
- Create a table named customer in a PostgreSQL database and insert a few records into it
- Create a repository class named CustomerRepository to retrieve data from the database table
- Configure the Program.cs file so that an instance of CustomerRepository can be injected
- Create an ApiController class named CustomerController that leverages dependency injection to retrieve data from the database table using the CustomerRepository class
- Execute the application

What is Inversion of Control (IoC)?

Inversion of Control (IoC) is a software architecture design principle that fosters coding against interfaces instead of implementations by inverting the control flow in an application. Per the IoC principle, high-level modules or components in an application should never depend on low-level modules or components; instead, they should depend on abstractions.
In this case, instead of a component controlling the program flow itself, the control flow is inverted – "controlled" by a framework or container. Per the IoC design principle, an object should not create instances of the objects it depends on; instead, the dependent objects should be created by a framework or container.

What is Dependency Injection?

Dependency injection (DI) is a technique for implementing IoC in which a class's dependencies (i.e., the objects on which it depends) are provided to it externally, rather than the class creating them itself. By letting you inject the dependencies externally, it removes internal dependencies from the implementation. Dependency injection facilitates loose coupling between the components of an application, making your source code modular and easier to maintain.

Types of Dependency Injection

There are three types of dependency injection: constructor injection, method injection, and setter injection. With constructor injection, an instance of a class is injected into another class through a constructor parameter. The following code snippet shows how you can implement constructor injection in ASP.NET Core (the generic type arguments, stripped by the original page formatting, have been restored):

```csharp
public class DemoController : Controller
{
    private readonly ILogger<DemoController> _logger;

    public DemoController(ILogger<DemoController> logger)
    {
        _logger = logger;
    }
}
```

In method injection, you can use the parameters of a method to pass the required dependencies.
The following code snippet illustrates this:

```csharp
public class DemoService
{
    private ILogger _logger;

    public void SetLogger(ILogger logger)
    {
        _logger = logger;
    }

    public void Log(string data)
    {
        _logger.LogInformation(data);
    }
}
```

You can pass the dependency using the SetLogger method, as shown in the following piece of code:

```csharp
ILogger logger = loggerFactory.CreateLogger<DemoService>();
DemoService demoService = new DemoService();
demoService.SetLogger(logger);
demoService.Log("This is a sample text.");
```

In setter injection, the dependency is passed via a property. The code snippet given below demonstrates setter injection:

```csharp
public class DemoService
{
    private ILogger _logger;

    public ILogger Logger
    {
        get { return _logger; }
        set { _logger = value; }
    }

    public void Log(string data)
    {
        _logger.LogInformation(data);
    }
}
```

You can pass the dependency using the following piece of code:

```csharp
using var loggerFactory = LoggerFactory.Create(loggingBuilder => loggingBuilder
    .SetMinimumLevel(LogLevel.Trace)
    .AddConsole());
DemoService demoService = new DemoService();
demoService.Logger = loggerFactory.CreateLogger<DemoService>();
demoService.Log("This is a sample text.");
```

Understanding Dependency Injection Lifetimes in ASP.NET Core

The dependency injection lifetime refers to the lifespan of an instance, i.e., how long the instance stays alive after it is created in the container. ASP.NET Core supports the following lifetimes:

Transient

This specifies that a new instance is created each time the service is requested from the container.
So, when you register a service with the container using a transient lifetime, a new instance of the service is created each time you inject it. The following code snippet shows how you can register a service with the container using a transient lifetime (the type arguments were lost in the original formatting; IMyService and MyService are placeholders): builder.Services.AddTransient<IMyService, MyService>();

Singleton

This specifies that only one instance is created for the entire application. The following code snippet shows how you can register a service with the container using a singleton lifetime: builder.Services.AddSingleton<IMyService, MyService>();

Scoped

This specifies that only one instance is created per scope. Once the server receives an HTTP request from a client, it creates an HttpContext that includes details about the request; at this point, the application creates a scope for the current request. A scoped lifetime is thus analogous to the lifetime of an HTTP request. When you register a service with a scoped lifetime, all components handling the same request use the same instance. The following code snippet shows how you can register a service with the container using a scoped lifetime: builder.Services.AddScoped<IMyService, MyService>();

Start a New ASP.NET 6 Core Web API Project

In this section, we'll create a new ASP.NET 6 Core Web API project in Visual Studio 2022. Follow the steps outlined below:

- Open Visual Studio 2022.
- Click Create a new project.
- Select ASP.NET Core Web API and click Next.
- Specify the project name and the location to store the project on your system. Optionally, check the Place solution and project in the same directory checkbox. Click Next.
- In the Additional information window, select .NET 6.0 (Long-term support) as the project version.
- Uncheck the Configure for HTTPS and Enable Docker Support options.
- Since we'll not be using authentication in this example, set the Authentication type to None.
- Since we won't use Open API in this example, uncheck the Enable OpenAPI support checkbox.
- Since we'll not be using minimal APIs in this example, ensure that the Use controllers (uncheck to use minimal APIs) checkbox is checked.
- Leave the Do not use top-level statements checkbox unchecked.
- Click Create to finish the process.

We'll use this project throughout the article.

Install NuGet Package(s) into the API Project

In the API project you just created, install the dotConnect for PostgreSQL package. dotConnect for PostgreSQL is a high-performance data provider for PostgreSQL built on ADO.NET technology that provides a comprehensive solution for building PostgreSQL-based database applications. You can install this package either from the NuGet Package Manager tool inside Visual Studio or from the NuGet Package Manager console using the following command:

```
PM> Install-Package Devart.Data.PostgreSql
```

Create the Database

You can create a database using the pgAdmin tool. To create a database with it, follow the steps given below:

- Launch the pgAdmin tool
- Expand the Servers section
- Select Databases
- Right-click and select Create -> Database…
- Specify the name of the database and leave the other options at their default values
- Click Save to complete the process

Alternatively, you can use the following script to create the database:

```sql
-- Database: demo
DROP DATABASE IF EXISTS demo;

CREATE DATABASE demo
    WITH
    OWNER = postgres
    ENCODING = 'UTF8'
    LC_COLLATE = 'English_India.1252'
    LC_CTYPE = 'English_India.1252'
    TABLESPACE = pg_default
    CONNECTION LIMIT = -1
    IS_TEMPLATE = False;
```

Create a Database Table

- Select and expand the database you just created
- Select Schemas -> Tables
- Right-click on Tables and select Create -> Table…

The table script is given below for your reference:

```sql
CREATE TABLE customer (
    customer_id serial NOT NULL,
    first_name VARCHAR (255) NOT NULL,
    last_name VARCHAR (255) NOT NULL,
    address VARCHAR (255) NOT NULL,
    email VARCHAR (255) NOT NULL,
    phone VARCHAR (255) NOT NULL,
    CONSTRAINT customer_pk PRIMARY KEY (customer_id)
);
```

We'll use this database in the subsequent sections of this article to demonstrate how to work with dependency injection in ASP.NET Core using dotConnect for PostgreSQL.

Add a Few Records to the customer Table

Now, run the following script against your database to insert a few records into the customer table:

```sql
INSERT INTO customer (
    first_name,
    last_name,
    address,
    email,
    phone
)
VALUES
('Joydip', 'Kanjilal', 'Hyderabad, India', 'joydipkanjilal@yahoo.com', '1234567890'),
('Debanjan', 'Banerjee', 'Kolkata, India', 'debanjan.mediaguru@gmail.com', '0987654321'),
('Rohit', 'Sharma', 'Bangalore, India', 'rohitms@yahoo.com', '5566778899');
```

Figure 1 below illustrates the pgAdmin editor where you can write and execute your scripts:

Figure 1: Displaying the records inserted into the customer table

Create the Model Class

Create a solution folder in the Solution Explorer window and name it Models. Next, create a .cs file called Customer.cs with the following code in it:

```csharp
public class Customer
{
    public int Customer_Id { get; set; }
    public string First_Name { get; set; }
    public string Last_Name { get; set; }
    public string Address { get; set; }
    public string Email { get; set; }
    public string Phone { get; set; }
}
```

Create the CustomerRepository Class

The ICustomerRepository interface would look like this:

```csharp
public interface ICustomerRepository
{
    public List<Customer> GetCustomers();
}
```

The CustomerRepository class implements the GetCustomers method of the ICustomerRepository interface and encapsulates all database operations.
```csharp
public class CustomerRepository : ICustomerRepository
{
    public List<Customer> GetCustomers()
    {
        List<Customer> customers = new List<Customer>();
        using (PgSqlConnection pgSqlConnection = new PgSqlConnection(
            "User Id=postgres;Password=Specify the Db password here;" +
            "host=localhost;database=demo;" +
            "License Key=Specify your license key here"))
        {
            using (PgSqlCommand pgSqlCommand = new PgSqlCommand())
            {
                pgSqlCommand.CommandText = "Select * From customer";
                pgSqlCommand.Connection = pgSqlConnection;
                if (pgSqlConnection.State != System.Data.ConnectionState.Open)
                    pgSqlConnection.Open();
                using (var pgSqlReader = pgSqlCommand.ExecuteReader())
                {
                    while (pgSqlReader.Read())
                    {
                        Customer customer = new Customer();
                        customer.Customer_Id = int.Parse(pgSqlReader.GetValue(0).ToString());
                        customer.First_Name = pgSqlReader.GetValue(1).ToString();
                        customer.Last_Name = pgSqlReader.GetValue(2).ToString();
                        customer.Address = pgSqlReader.GetValue(3).ToString();
                        customer.Email = pgSqlReader.GetValue(4).ToString();
                        customer.Phone = pgSqlReader.GetValue(5).ToString();
                        customers.Add(customer);
                    }
                }
            }
        }
        return customers;
    }
}
```

Create the CustomerController Class

Next, right-click on the Controllers solution folder and create a new controller class called CustomerController with the following code in it:
```csharp
[Route("api/[controller]")]
[ApiController]
public class CustomerController : ControllerBase
{
    private readonly ICustomerRepository _customerRepository;

    public CustomerController(ICustomerRepository customerRepository)
    {
        _customerRepository = customerRepository;
    }

    [HttpGet]
    public List<Customer> Get()
    {
        return _customerRepository.GetCustomers();
    }
}
```

Note how an instance of type ICustomerRepository is injected into the constructor of the CustomerController class. Remember that you must register ICustomerRepository with the services container using the following piece of code in the Program.cs file: builder.Services.AddScoped<ICustomerRepository, CustomerRepository>(); When you execute the application and call the HTTP GET endpoint of the CustomerController class, you'll see the records of the customer database table displayed in the web browser:

Figure 2: The records of the customer database table are displayed

License Key Validation Error

When you execute the application, you might run into license validation errors if no valid license key is available. If license key validation fails, you will encounter a Devart.Common.LicenseException. To resolve this error, you must either already be a licensed user with a valid license key, or run the dotConnect installer, which installs a trial key into the system.

Summary

Both Inversion of Control and Dependency Injection facilitate building applications that are loosely coupled, flexible, and easy to maintain. However, while DI can help you create classes with well-separated responsibilities, it adds complexity, and there is a learning curve involved before one can start using IoC and DI effectively. Additionally, DI introduces a runtime penalty which, although negligible in most applications, can become a challenge in performance-critical systems.
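To close, the lifetime behaviors discussed earlier can also be observed outside ASP.NET with a small console sketch (assumes the Microsoft.Extensions.DependencyInjection NuGet package; the marker service types are made up for illustration):

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;

// Marker services, purely illustrative.
public class TransientService { }
public class SingletonService { }

class Program
{
    static void Main()
    {
        var services = new ServiceCollection();
        services.AddTransient<TransientService>();
        services.AddSingleton<SingletonService>();
        using var provider = services.BuildServiceProvider();

        // Transient: a new instance on every resolution.
        bool transientSame = ReferenceEquals(
            provider.GetRequiredService<TransientService>(),
            provider.GetRequiredService<TransientService>());

        // Singleton: one instance for the container's lifetime.
        bool singletonSame = ReferenceEquals(
            provider.GetRequiredService<SingletonService>(),
            provider.GetRequiredService<SingletonService>());

        Console.WriteLine($"transient same: {transientSame}"); // False
        Console.WriteLine($"singleton same: {singletonSame}"); // True
    }
}
```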
"content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Export and Import JSON Data Via dbForge Data Pump for SQL Server By [dbForge Team](https://blog.devart.com/author/dbforge) July 23, 2020 [0](https://blog.devart.com/export-and-import-json-data-via-dbforge-data-pump-for-sql-server.html#respond) 3554 In the previous articles, we studied [the basic rules of SQL database design](https://blog.devart.com/sql-database-design-basics-with-example.html) , created a database schema diagram for a recruitment service, and filled [the newly-created database with test data](https://blog.devart.com/generate-test-data-with-sql-data-generator.html) on employees. Img.1. The database structure diagram for a recruitment service The database contains the following entities: Employee Company Position Project Skill This time, let’s consider ways to transfer data from one SQL Server database to another through export and import. This comes in handy when the client’s database management system is an older version than the data source, which rules out restoring from a backup, or when only a portion of the data has to be transferred because the full set is too large. Export data from SQL Server to JSON files First of all, the data export functionality is available both in [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) and in dbForge Data Pump. When it comes to filling SQL databases with external source data and migrating data between systems, [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) is an add-in that is always of great help. Let us first consider its use in SSMS, starting with data export. 
To do this, in [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-2017) , right-click the required database and, in the drop-down menu, select the Data Pump\\ Export Data… command: Img.2. SQL data export using Data Pump The export settings window will then appear. On the “Export format” tab, you can choose among different formats. We select the JSON format in our case, to transfer data and import it into another database. Let’s then press the “Next” button: Img.3. Setting the “Export format” tab Next, on the “Source” tab, you need to select the tables for export. In this case, we select all user tables and press the “Next” button: Img.4. Setting the “Source” tab Following that, on the “Data formats” tab, you need to select the required columns on the “Columns” tab and set up data formats on the “Formats” tab. In the present case, we choose all columns and leave the “Formats” settings at their defaults. Then, press the “Next” button: Img.5. Setting the “Data formats” tab On the “Output settings” tab, we need to configure the output parameters of the exported data. Here, we leave the default settings and click the “Next” button: Img.6. The “Output settings” tab On the “Exported rows” tab, you can configure the number of exported rows, depending on whether you need to export all rows or only a certain part of them. In this case, we leave the settings at their defaults, that is, we will export all rows. Now press the “Next” button: Img.7. Setting the “Exported rows” tab Next, on the “Errors handling” tab, you can specify how runtime errors are reported during export. Leave the settings at their defaults. Note that for later use, you can save all the specified settings as a template by clicking the “Save Template…” button in the bottom-left corner. Now, press the “Export” button to start the data export: Img.8. 
Setting the “Errors handling” tab While the export is running, you can view the data extraction process on the “Export” tab. Upon completion, you will see the “Finish” tab, where you can either close the window by pressing the “Finish” button or go straight to the folder that contains the generated files by pressing the “Open result folder…” button: Img.9. The result of data export Finally, you can see the folder with the generated SQL database data in JSON files: Img.10. The folder with the resulting data files Let’s open the contents of the dbo_Employee file: Img.11. The contents of the dbo_Employee file To sum up, we exported the data of the new database to JSON files. Note that export can be useful when exchanging data between two different database management systems. It is also quite handy to export data to Excel in order to provide analysts with data for research. Import JSON data to a SQL Server database Now, let us copy the resulting data to the necessary server and restore the JobEmpl2 database there. 
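The exported files are plain JSON, so before importing them you can sanity-check their contents from any scripting language. A minimal Python sketch (the sample records below are hypothetical; the exact structure produced by Data Pump may differ):

```python
import json

# Hypothetical sample mirroring an exported dbo_Employee file;
# the exact structure produced by Data Pump may differ.
exported = '''
[
  {"EmployeeID": 1, "FirstName": "John", "LastName": "Smith"},
  {"EmployeeID": 2, "FirstName": "Jane", "LastName": "Doe"}
]
'''

rows = json.loads(exported)
print(f"{len(rows)} rows")
for row in rows:
    print(f'{row["EmployeeID"]}: {row["FirstName"]} {row["LastName"]}')
```

Now, back to restoring the database on the target server.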
To do this, we need to create a new database, JobEmpl2, with the same schema as JobEmpl, using the following script (the empty JobEmpl2 database must exist beforehand; the script only creates the objects in it):

```sql
SET CONCAT_NULL_YIELDS_NULL, ANSI_NULLS, ANSI_PADDING, QUOTED_IDENTIFIER, ANSI_WARNINGS, ARITHABORT, XACT_ABORT ON
SET NUMERIC_ROUNDABORT, IMPLICIT_TRANSACTIONS OFF
GO

USE [JobEmpl2]
GO

IF DB_NAME() <> N'JobEmpl2' SET NOEXEC ON
GO

--
-- Start Transaction
--
BEGIN TRANSACTION
GO

--
-- Create table [dbo].[Skill]
--
CREATE TABLE [dbo].[Skill] (
  [SkillID] [int] IDENTITY,
  [SkillName] [nvarchar](255) NOT NULL,
  CONSTRAINT [PK_Skill_SkillID] PRIMARY KEY CLUSTERED ([SkillID])
)
ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[Project]
--
CREATE TABLE [dbo].[Project] (
  [ProjectID] [int] IDENTITY,
  [ProjectName] [nvarchar](255) NOT NULL,
  [Description] [nvarchar](max) NOT NULL,
  CONSTRAINT [PK_Project_ProjectID] PRIMARY KEY CLUSTERED ([ProjectID])
)
ON [PRIMARY]
TEXTIMAGE_ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[ProjectSkill]
--
CREATE TABLE [dbo].[ProjectSkill] (
  [ProjectID] [int] NOT NULL,
  [SkillID] [int] NOT NULL,
  CONSTRAINT [PK_ProjectSkill] PRIMARY KEY CLUSTERED ([ProjectID], [SkillID])
)
ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_ProjectSkill_ProjectID] on table [dbo].[ProjectSkill]
--
ALTER TABLE [dbo].[ProjectSkill] WITH NOCHECK
  ADD CONSTRAINT [FK_ProjectSkill_ProjectID] FOREIGN KEY ([ProjectID]) REFERENCES [dbo].[Project] ([ProjectID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_ProjectSkill_SkillID] on table [dbo].[ProjectSkill]
--
ALTER TABLE [dbo].[ProjectSkill] WITH NOCHECK
  ADD CONSTRAINT [FK_ProjectSkill_SkillID] FOREIGN KEY ([SkillID]) REFERENCES [dbo].[Skill] ([SkillID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[Position]
--
CREATE TABLE [dbo].[Position] (
  [PositionID] [int] IDENTITY,
  [PostitionName] [nvarchar](255) NOT NULL,
  CONSTRAINT [PK_Position_PositionID] PRIMARY KEY CLUSTERED ([PositionID])
)
ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[Employee]
--
CREATE TABLE [dbo].[Employee] (
  [EmployeeID] [int] IDENTITY,
  [FirstName] [nvarchar](255) NOT NULL,
  [LastName] [nvarchar](255) NOT NULL,
  CONSTRAINT [PK_Employee_EmployeeID] PRIMARY KEY CLUSTERED ([EmployeeID])
)
ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[Company]
--
CREATE TABLE [dbo].[Company] (
  [CompanyID] [int] IDENTITY,
  [CompanyName] [nvarchar](255) NOT NULL,
  [Description] [nvarchar](255) NOT NULL,
  CONSTRAINT [PK_Company_CompanyID] PRIMARY KEY CLUSTERED ([CompanyID])
)
ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create table [dbo].[JobHistory]
--
CREATE TABLE [dbo].[JobHistory] (
  [EmployeeID] [int] NOT NULL,
  [CompanyID] [int] NOT NULL,
  [PositionID] [int] NOT NULL,
  [ProjectID] [int] NOT NULL,
  [StartDate] [date] NOT NULL,
  [FinishDate] [date] NULL,
  [Description] [nvarchar](max) NOT NULL,
  [Achievements] [nvarchar](max) NULL,
  [ReasonsForLeavingTheProject] [nvarchar](max) NULL,
  [ReasonsForLeavingTheCompany] [nvarchar](max) NULL,
  CONSTRAINT [PK_JobHistory] PRIMARY KEY CLUSTERED ([EmployeeID], [CompanyID], [PositionID], [ProjectID])
)
ON [PRIMARY]
TEXTIMAGE_ON [PRIMARY]
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_JobHistory_Company_CompanyID] on table [dbo].[JobHistory]
--
ALTER TABLE [dbo].[JobHistory] WITH NOCHECK
  ADD CONSTRAINT [FK_JobHistory_Company_CompanyID] FOREIGN KEY ([CompanyID]) REFERENCES [dbo].[Company] ([CompanyID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_JobHistory_Employee_EmployeeID] on table [dbo].[JobHistory]
--
ALTER TABLE [dbo].[JobHistory] WITH NOCHECK
  ADD CONSTRAINT [FK_JobHistory_Employee_EmployeeID] FOREIGN KEY ([EmployeeID]) REFERENCES [dbo].[Employee] ([EmployeeID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_JobHistory_Position_PositionID] on table [dbo].[JobHistory]
--
ALTER TABLE [dbo].[JobHistory] WITH NOCHECK
  ADD CONSTRAINT [FK_JobHistory_Position_PositionID] FOREIGN KEY ([PositionID]) REFERENCES [dbo].[Position] ([PositionID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Create foreign key [FK_JobHistory_Project_ProjectID] on table [dbo].[JobHistory]
--
ALTER TABLE [dbo].[JobHistory] WITH NOCHECK
  ADD CONSTRAINT [FK_JobHistory_Project_ProjectID] FOREIGN KEY ([ProjectID]) REFERENCES [dbo].[Project] ([ProjectID])
GO
IF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END
GO

--
-- Commit Transaction
--
IF @@TRANCOUNT>0 COMMIT TRANSACTION
GO

--
-- Set NOEXEC to off
--
SET NOEXEC OFF
GO
```

This script can be obtained in a number of ways, for instance, with the [Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/) tool. Next, right-click the JobEmpl2 database and, in the drop-down menu, select the Data Pump\\ Import Data… command: Img.12. Data import command in Data Pump Similarly to export, on the “Source file” tab, select the JSON format and the file itself. Then press the “Next” button: Img.13. 
Setting the “Source file” tab On the “Destination” tab, pick the existing Employee table in the dbo schema of the JobEmpl2 database and then press “Next”: Img.14. Setting the “Destination” tab Next, on the “Options” tab, you can set the encoding and preview what the inserted data will look like. Leave the settings at their defaults and press the “Next” button: Img.15. Setting the “Options” tab On the “Data formats” tab, you can configure the format of the inserted data. Here, we leave the default values and press the “Next” button: Img.16. Setting the “Data formats” tab On the “Mapping” tab, you can configure the mapping of the source and target fields. Let’s leave everything as it is and press the “Next” button: Img.17. Setting the “Mapping” tab Following that, on the “Modes” tab, you can set the data import mode. In our case, we keep the default mode, which appends data via bulk insert without deleting the existing data. Then, press “Next”: Img.18. Setting the “Modes” tab On the “Output” tab, you can configure the destination for the data to be imported. In this case, we select importing data directly into the database and click “Next”: Img.19. Setting the “Output” tab Thereafter, on the “Errors handling” tab, you can configure how runtime errors are reported during the import process. You can also save the specified settings as a template by clicking the “Save Template…” button in the bottom-left corner. We leave the default settings and press the “Import” button to start the import: Img.20. Setting the “Errors handling” tab During the import process, you can track the execution progress on the “Import” tab. As soon as the import is complete, the result is displayed on the “Finish” tab. Press the “Finish” button to close the window: Img.21. The data import result When you query the destination database, you can make sure that the data was successfully imported into the Employee table: Img.22. 
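If the add-in is not available on a machine, a file like this can also be loaded programmatically by turning each JSON object into a parameterized INSERT. A rough sketch in Python (table and column names are assumed; an actual run would execute the statement through an ODBC/ADO.NET client, and inserting explicit values into an IDENTITY column would additionally require SET IDENTITY_INSERT ON):

```python
import json

def build_insert(table, record):
    # Build a parameterized INSERT and its value tuple for one JSON record.
    columns = ", ".join(f"[{name}]" for name in record)
    placeholders = ", ".join("?" for _ in record)
    return f"INSERT INTO {table} ({columns}) VALUES ({placeholders})", tuple(record.values())

# Hypothetical record mirroring one row of the exported dbo_Employee file
record = json.loads('{"EmployeeID": 1, "FirstName": "John", "LastName": "Smith"}')
sql, params = build_insert("[dbo].[Employee]", record)
print(sql)
print(params)
```

The statement and parameters could then be passed to any SQL Server client library; the point is simply that the exported JSON maps one-to-one onto table rows.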
The imported data in the Employee table By the same token, you can import the remaining files into the corresponding tables. You can also perform data export and import with the standard functionality of [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-2017) by right-clicking the necessary database and selecting the corresponding command in the drop-down menu: Img.23. Data export and import with the standard SSMS tools Conclusion To put it briefly, data export and import allow you not only to move data between databases and to provide data in a convenient form for subsequent processing, but also to import just the necessary part of the data into the required database. In the next part, we will examine how to search for objects and data in the created database. Tags [data export](https://blog.devart.com/tag/data-export) [data import](https://blog.devart.com/tag/data-import) [data pump](https://blog.devart.com/tag/data-pump) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/export-azure-sql-database.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Export Data from Azure SQL Database By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) January 4, 2025 [0](https://blog.devart.com/export-azure-sql-database.html#respond) 2622 We won’t break any new ground by saying that data import and export rank among the most essential operations related to database management—and it’s the latter that we’ll talk about today. Moreover, we’ll talk about it in the context of Azure SQL Database, Microsoft’s fully managed relational database service. It is flexible, secure, and easily scalable on demand, and it’s continuously kept up to date by Microsoft. Still, if you are a user of Azure SQL Database, you don’t have all that many ways of exporting your data—and not all of them are as easy and flexible as you’d like them to be. But here, we’ll show you the most convenient ones and save the best for last. 
Contents Export from the Azure portal Export using the bulk copy program utility Export via SQL Server Management Studio Export via SSMS + dbForge Data Pump Methods comparison Export from the Azure portal The first way to be discussed today is export to a BACPAC file from the Azure portal. A BACPAC file is basically a ZIP file that contains the metadata and data from your database. It can be stored in Azure Blob or in local storage and later imported back into Azure SQL Database, Azure SQL Managed Instance, or a SQL Server instance—whichever you may require. This method is quite suitable if you need to export a database for archiving or for moving to another platform, since the database schema gets exported alongside the data. However, be aware that the current maximum size of a BACPAC file is 200 GB for Azure Blob. Larger BACPAC files can be exported to local storage via [SqlPackage](https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage?view=sql-server-ver16) . Microsoft also warns that, for larger databases, BACPAC export and import may take a long time and may fail midway for various reasons. Also note that BACPAC files are not intended for backup-and-recovery operations, since Azure offers auto-backups for every database. Now let’s see how to export your database via the Azure portal. 1. Go to the [Azure portal](https://portal.azure.com/) (you must be logged in to proceed) and find the required database. 2. Click it to open the database page and select Export on the toolbar. 3. Enter the BACPAC file name, select your Azure storage account and container for the export, and specify the credentials to access your database. 4. Select OK , and the export operation will commence. If you would like to keep an eye on its progress, you may open the page for the server that contains the database that’s being exported, and under Data management , select Import/Export history . That’s it! 
It’s very easy, but you need to make sure that the abovementioned considerations are not a problem in your particular case. Export using the bulk copy program utility The second way is to use the bulk copy program utility (a.k.a. BCP), which leads us straight to the command line. 1. First, make sure that the BCP utility is installed on your machine. To do so, open the Command Prompt and run the following:

```
C:\WINDOWS\system32> bcp /v
BCP - Bulk Copy Program for Microsoft SQL Server.
Copyright (C) Microsoft Corporation. All Rights Reserved.
Version: 15.0.2000.5
```

If you don’t have the BCP utility installed on your machine, you can download the latest versions [for x64](https://go.microsoft.com/fwlink/?linkid=2142258) and [for x86](https://go.microsoft.com/fwlink/?linkid=2142257) . 2. Now, once you’re ready, run the following command after setting up valid parameters:

```
bcp database.dbo.table out C:\Path\datafile.dat -c -U username -S Azure.database.windows.net
```

This command will export table data from database.dbo.table to a file named datafile.dat under the specified path. -c refers to character data, -U specifies your username, and -S specifies the name of the Azure server that you want to connect to. 3. Hit Enter , and you’ll be prompted to enter the password. After that, you will see a report on the export, which will look similar to this:

```
Password:
Starting copy...
27 rows copied
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1797 Average : (15.59 rows per sec.)
```

4. Once it’s done, check the specified file that will contain the exported rows. 
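When this export needs to run on a schedule, the same bcp call can be scripted. A hedged Python sketch that merely assembles the argument list shown above (the server, database, login, and path are placeholders from the example, not real values):

```python
import subprocess  # used when the run() call at the bottom is enabled

def bcp_export_args(table, out_file, user, server):
    # Assemble the bcp argument list for a character-mode (-c) export.
    return [
        "bcp", table, "out", out_file,
        "-c",          # character data format
        "-U", user,    # SQL login name
        "-S", server,  # target Azure SQL server
    ]

args = bcp_export_args(
    "database.dbo.table",
    r"C:\Path\datafile.dat",
    "username",
    "Azure.database.windows.net",
)
print(" ".join(args))
# subprocess.run(args, check=True)  # uncomment to run; bcp will prompt for the password
```

Passing the arguments as a list (rather than one shell string) avoids quoting issues with paths that contain spaces.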
Export via SQL Server Management Studio The next way involves Microsoft’s SQL Server Management Studio, the gold standard for SQL Server databases, which is just as nicely compatible with Azure SQL Database. 1. Open SSMS and establish your Azure SQL connection. Then right-click the required database and go to Tasks > Export Data . In the wizard that opens, click Next . 2. On the Choose a Data Source page, specify the required data source, server name, authentication details, and, of course, the database. After that, click Next . 3. Next comes the Choose a Destination page, where you need to select Flat File Destination , enter a name for your file with the path to the folder where it will be saved, and specify the format alongside a couple of additional settings. Once you make sure it’s all correct, click Next . 4. Select whether you want to export data based on a query—or simply leave it as it is to copy data from one or more tables or views. Click Next . 5. Here, specify the source table/view and delimiters and click Next . 6. Next comes the Save and Run Package page, where you indicate whether to save the SSIS package or run it immediately. You can leave the Run immediately option as it is and click Next . 7. Finally, review the package details. Once you make sure everything is correct, click Finish . 8. Wait until the execution is completed and click Close . That’s it! Now you can go to your exported file and check its contents. Well, that was a long ride, wasn’t it? Now let us suggest an alternative that’s far more convenient and far more flexible, with quite a few available data formats and settings at your service. Export via SSMS + dbForge Data Pump We said we’d save the best for last—and we’ll keep our word. 
Now what if we take the same SQL Server Management Studio, yet enhance it with [dbForge Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/) , an add-in focused on filling SQL databases with external source data and migrating data between systems? We’ll get data export to the 14 most popular data formats with the richest customization options you can possibly find. The 14 formats in question are HTML, TXT, XLS, XLSX, MDB, RTF, PDF, JSON, XML, CSV, ODBC, DBF, SQL, and Google Sheets. Let’s take the XLS format to illustrate export with Data Pump. 1. Open SSMS. In Object Explorer , right-click a database, point to Data Pump , and then click Export Data to open the wizard. You will be greeted by the Export format page. Select the MS Excel format and click Next . 2. On the Source page, select a server connection, a database and its schema, and the table(s) and/or view(s) that you want to export, and click Next . 3. On the Output settings page, you have two main options: export data into separate files, where you specify the path to the folder that they will be saved to, or export data into a single file, where you specify the path and the file name. You will find the list of files to be exported in the Exported files preview box. Additionally, you can enable a few options: append a timestamp to the file name, auto-delete old files, and create an archive file with your exported files. 4. On the Options page, you can set the table grid options for exported data: text and background colors and fonts in Header and Rows , as well as the width and color of Borders . The results of your configuration are shown in Preview . 5. On the Data formats page, you have two auxiliary tabs. The first one is Columns , where you can select columns for export and check their aliases and data types. The second one is Formats , where you can change the default settings for various formats, as well as select the required binary encoding from the drop-down list. 6. 
On the Page print settings page, you can configure the page size, orientation, margins, and header and footer text (including the option to repeat the table header). You also get a preview of your configuration. 7. On the Exported rows page, you can choose to export all rows, export the rows selected on the Data formats page, or export a specified range of rows. 8. On the Errors handling page, you can specify the error processing behavior using one of the three available options, as well as opt to write reports to a log file with a specified path. 9. Lastly, click Export . When your data export is completed, you have several options: you can open the exported file or folder, perform another export operation, view the log file, or simply click Finish . Let’s take a look at the exported document. That’s it! Note that you can click Export at any given moment—you don’t necessarily have to go through all the pages. They only serve to make your export settings more flexible. You should also note that the set of pages and settings in the wizard may differ depending on the selected format. Finally, note that you can effortlessly [automate recurring export operations from the command line](https://docs.devart.com/data-pump/using-command-line-for-export/exporting-data-from-command-line.html) . 
Methods comparison

| Method | Key Features | Ease of Use | Limitations |
| --- | --- | --- | --- |
| Export via Azure Portal | Exports schema & data to a BACPAC file; stores in Azure Blob or local storage; imports into Azure SQL, Managed Instance, or SQL Server | Easy (UI-based) | Max file size: 200 GB for Azure Blob; slow for large databases; not suitable for backup & recovery |
| Bulk Copy Program (BCP) | Command-line utility for bulk data export; fast for large datasets; supports various file formats | Moderate (CLI-based) | Requires command-line knowledge; schema not included; manual authentication required |
| SQL Server Management Studio (SSMS) | UI-based export tool; supports multiple formats (CSV, Excel, etc.); allows selective export of tables & views | Moderate (UI-based) | Limited format options; requires SSMS installation |
| SSMS + dbForge Data Pump | Exports to 14 formats (CSV, JSON, Excel, etc.); advanced customization options; automatable via command line | Easy (UI-based) | Requires dbForge Data Pump add-in; some advanced features require a paid license |

Download dbForge Data Pump for a free 30-day trial today! How about giving the last one a go yourself? That will be rather easy. Just [download Data Pump for a free 30-day trial](https://www.devart.com/dbforge/sql/data-pump/download.html) and install it in a matter of minutes. Note that Data Pump comes as part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) , an extensive bundle of 15 standalone apps and add-ins for SSMS and Visual Studio. 
And since data export comprises only about 1% of its functionality, let us give you a list of key database-related tasks that it helps you cover:

- Context-aware SQL code completion, formatting, and refactoring
- Debugging of T-SQL scripts, stored procedures, triggers, and functions
- Visual query building on diagrams that eliminates the need for coding
- Query performance optimization
- Version control of database schemas and static table data
- Comparison and synchronization of databases
- Simplified data analysis and reporting
- Easy data migration
- Generation of database documentation
- Generation of column-intelligent, compliant, and realistic test data
- Creation and management of database unit tests
- Real-time server and database performance monitoring
- Index defragmentation
- Automation of recurring operations from the command line
- Integration of all tools into a single DevOps cycle

Well, that was long. Yet there’s even more going on, but that would make the list even longer. That said, you may want to take a closer look at the other tools included in the bundle—perhaps you will find something you’ve been looking for. FAQ How to export a database from Azure SQL Server? There are several methods you can use to export a database from Azure SQL Server: you can use the Azure Portal (BACPAC file) for database migration and archiving, choose the Bulk Copy Program (BCP) to quickly export table data via the command line, or perform the export in SSMS if you want to get flat files. Depending on your project’s needs, pick any of these methods. How do I export an entire SQL database? To export a full SQL database, use the export method featuring the BACPAC file, which includes schema and data. This method is considered best for migration. To perform this export, go to the Azure Portal, select your database, click “Export,” and specify the BACPAC file name, storage account, and credentials. After that, click “OK” to confirm you are all set, and the export will start. 
Alternatively, you can use [dbForge Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/) to export data. To do it via this tool, open SSMS, right-click the database, and select “Data Pump” > “Export Data.” Choose a format, configure the export settings, and run the export. Also, you can use [the dbForge Compare Bundle for SQL Server,](https://www.devart.com/dbforge/sql/compare-bundle/) which provides tools to compare both schema and data, along with extensive customization options (such as custom comparison configurations, selection of data sources to compare, and automation). As you perform the export, using these utilities in SSMS will let you compare whole databases, backups, and script folders to ensure you haven’t missed anything. How to export Azure SQL database data to CSV? You can export Azure SQL data to CSV using SSMS and Data Pump, especially when you need to configure the delimiter, encoding, and structure. To handle CSV export in SSMS, go to “Data Pump” > “Export Data.” Here, select the CSV format, set up the configurations for date, time, and currency formats, and define how null values should be handled and how boolean values should appear. Next, run the export with advanced settings like automation and filtering. Tags [Azure SQL](https://blog.devart.com/tag/azure-sql) [data export](https://blog.devart.com/tag/data-export) [data pump](https://blog.devart.com/tag/data-pump) [SQL Server](https://blog.devart.com/tag/sql-server) [ssms](https://blog.devart.com/tag/ssms) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/export-salesforce-data-connect-reports-analytics-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ODBC](https://blog.devart.com/category/odbc) How to Export Salesforce Data and Connect Reports to Analytics Tools By [Victoria Shyrokova](https://blog.devart.com/author/victorias) February 25, 2025 [0](https://blog.devart.com/export-salesforce-data-connect-reports-analytics-tools.html#respond) 343 Salesforce is one of the best CRM platforms available. It helps companies manage sales efficiently, letting them smoothly handle customer interactions and streamline many other business tasks. 
However, the real value of connecting Salesforce to analytics tools lies in the ability to analyze sales data and make data-driven decisions. Salesforce data is central to accurate reporting and advanced analytics, and seamless integration can be a game-changer for a business. This article explores different methods for extracting Salesforce data and shows how to connect Salesforce to analytics tools like Power BI and Tableau. To learn more about how Salesforce helps businesses grow, check our article on [what is Salesforce and how can businesses benefit from it](https://blog.devart.com/what-is-salesforce-and-how-can-businesses-benefit-from-it.html). Follow this guide to make your Salesforce data work for you.

Table of contents
Why exporting Salesforce data is essential
Methods to export data from Salesforce
Exporting Salesforce data using Devart ODBC drivers
Connecting through the Salesforce API
Connecting Salesforce data to popular analytics & data visualization tools
Using BigQuery for advanced analytics
Conclusion

Why exporting Salesforce data is essential

Exporting Salesforce data matters for several reasons, all tied to business success. Let's review them.
Data backup and security. Salesforce stores a significant amount of business data. Backing it up regularly ensures you have copies in case of accidental data loss or a system crash.
Analytics and reporting. Salesforce provides built-in reporting, but it may not be advanced enough for deep analysis. Companies can export data to reporting tools such as Tableau and Power BI to identify trends, visualize opportunities, and support better decision-making.
Integration. Exporting Salesforce data enables seamless integration with other platforms already in use, such as ERP systems and big data analytics solutions.
It makes it easy to view overall business performance at a glance.
Limitations. Salesforce's built-in capabilities come with inflexible data visualizations, restricted access to historical data, and limited options for exporting data to external tools.

Methods to export data from Salesforce

There are several Salesforce export options for different use cases. The main methods, covering both built-in and third-party tools, are:
Salesforce data export service
Salesforce data loader
Salesforce reports export
Exporting Salesforce data using Devart ODBC Drivers
Salesforce APIs

Salesforce data export service

Salesforce's built-in CSV export feature lets you export data in CSV format. The native tools allow users to create regular backups through large-scale exports, often covering extensive volumes of data for analysis, migration, or integration with other systems. Limitations. The data export service updates infrequently, which makes it unsuitable for real-time analytics. Furthermore, the exported files can become large and difficult to manage.

Salesforce data loader

Salesforce Data Loader is a bulk import and export client tool that reads data from CSV files for imports and writes data to CSV files for exports. It processes data in batches for bulk operations. By configuring the tool, users can map fields and choose the operation to perform (insert, update, delete, or export). This approach is helpful for managing large datasets. Prerequisites for using Data Loader include downloading and installing the application (which requires a compatible Java Runtime Environment), obtaining Salesforce credentials such as the username, password, and security token, and preparing correctly formatted CSV files for bulk import and export.
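Data Loader's field mapping is configured in its own UI, but the idea behind that step can be sketched in a few lines of Python. This is an illustration only: the field names in `FIELD_MAP` are hypothetical, not actual Salesforce object fields from any specific org.

```python
# Hypothetical mapping from local CSV headers to Salesforce field names,
# mirroring the field-mapping step performed in Data Loader's UI.
FIELD_MAP = {"company": "Account.Name", "contact_email": "Contact.Email"}

def map_record(record, field_map):
    """Rename a record's keys according to the mapping; unmapped keys are
    dropped, just as columns left unmapped in Data Loader are ignored."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

# A sample CSV row parsed into a dict; "internal_id" has no mapping.
row = {"company": "Acme", "contact_email": "a@acme.io", "internal_id": 7}
mapped = map_record(row, FIELD_MAP)
```

Running the mapping over every parsed CSV row before an import keeps the file and the target object schema in sync.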
Salesforce report export

Let's learn how to create Salesforce reports and export them to Excel or CSV files for small, precise datasets or manual database updates. For more information on using Excel for data manipulation, including mass updates, see our detailed article on [Salesforce records mass update using Excel Add-ins](https://blog.devart.com/how-to-mass-update-records-in-salesforce-using-excel-add-ins.html). While easy to use, this method does not scale well: exporting massive datasets is inefficient, and you may run into export limits. Creating custom reports in Salesforce and exporting them to Excel or CSV involves a few simple steps:
In Salesforce, go to Reports and click New Report.
Select the appropriate report type (e.g., Accounts).
Customize the report fields: drag fields to add columns.
Apply filters to narrow down the data.
Group data by specific fields to organize and summarize it.
Summarize numerical data; this is typically used together with grouping.
Add a chart to visualize the data.
Choose a format, such as Tabular, Summary, Matrix, or Joined.
Click Run, then Save to save the report.
Click Export, choose CSV or Excel, and hit Export.

Exporting Salesforce data using Devart ODBC drivers

[Devart ODBC Salesforce drivers](https://www.devart.com/odbc/salesforce/) provide everything you need for seamless Salesforce data export over ODBC and for connecting Salesforce to ODBC-compatible applications. With them, you can easily retrieve Salesforce data and connect it to ODBC-compliant platforms such as Power BI and Tableau. They also save time by letting you automate data exports and sync with external databases, which opens up many possibilities for data integration and analysis.
Configuring the Devart ODBC driver for Salesforce
Download and install the Devart ODBC Driver for Salesforce.
Open the ODBC Data Source Administrator.
Create a new System DSN to store the connection details.
Configure the DSN with your Salesforce credentials (username and password).
Click Test Connection.
After configuring the DSN, you can connect to Salesforce data from any ODBC-compliant application, such as Power BI or Tableau, using the DSN name.

Advantages of using Devart ODBC Drivers
SQL-based querying: use SQL syntax to retrieve and manipulate Salesforce data.
Real-time data access: access the most up-to-date Salesforce data directly from your applications.
Integration with popular tools: seamlessly integrate Salesforce data with analytical tools like Excel, Power BI, and Tableau.

Connecting through the Salesforce API

Developers use the Salesforce REST and SOAP APIs to query data via SOQL or SOSL, transform the results, and pass them to other systems. Such custom export solutions support real-time data synchronization and more complex scenarios: integration with external systems such as ERP or marketing automation platforms, automatic backups, and data warehousing. API-based exports require solid technical expertise: familiarity with API concepts and a programming language like Python (for more on connecting Salesforce data to applications with Python, read [this guide](https://blog.devart.com/connect-salesforce-python.html)), an understanding of JSON, XML, and security rules, plus experience with authentication, API limits, and debugging connectivity issues.
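To make the API route concrete, here is a minimal Python sketch of querying records over the Salesforce REST API's SOQL query endpoint (`/services/data/vXX.X/query/?q=...`). The instance URL, API version, and access token below are placeholders for your own org's values, and obtaining the OAuth token is outside the scope of this sketch.

```python
import json
import urllib.parse
import urllib.request

def build_soql_url(instance_url, soql, api_version="v59.0"):
    """Build the REST query endpoint URL for a SOQL statement.
    instance_url and api_version are placeholders for your org's values."""
    return (f"{instance_url}/services/data/{api_version}/query/"
            f"?q={urllib.parse.quote(soql)}")

def run_query(instance_url, access_token, soql):
    """Execute the query; requires a valid OAuth access token and network
    access to the org, so this part is not exercised here."""
    req = urllib.request.Request(
        build_soql_url(instance_url, soql),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["records"]

# URL construction alone, with a placeholder instance URL:
url = build_soql_url("https://example.my.salesforce.com",
                     "SELECT Id, Name FROM Account LIMIT 10")
```

The JSON response's `records` list can then be flattened into CSV or loaded into a warehouse, which is where the transformation work mentioned above comes in.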
Connecting Salesforce data to popular analytics & data visualization tools

To unlock deeper insights from Salesforce data, let's walk through connecting it to the most popular analytics and data visualization tools.

Integrating with Tableau

To start, you should have some hands-on experience with Tableau and Salesforce, Tableau installed on the host workstation, and access to a Salesforce account with all the required permissions.
Open your Tableau Desktop application. To connect to Salesforce, go to the Connect pane, click More under the To a Server menu, and choose Salesforce.
Click Sign In. In the dialog box that opens, enter your Salesforce credentials (username and password) to log in to your Salesforce account.
After you've signed in successfully, you'll be asked to grant Tableau access to your Salesforce account and data. Click Allow to permit the import of your Salesforce data into Tableau.
Once connected, select the objects or tables you need. Your Salesforce data will now be accessible in Tableau as Measures and Dimensions.
Click Sheet and start creating charts and dashboards by adding fields to the rows and columns. Tableau will extract your data source within a few moments (the speed depends on the size of the files).
Once satisfied with your customized dashboard, publish it to your Tableau server by signing in to Tableau Online/Server and selecting Publish Workbook.

Benefits of live connections vs. static data imports
Live connection
Always shows the most current Salesforce data; changes in Salesforce are immediately reflected in Tableau.
Best for dashboards and up-to-the-minute reporting.
Static data import
Lets you analyze data even without an internet connection.
It can improve performance compared to live connections on very large datasets, but the data in Tableau becomes outdated, and you need to refresh it to see the latest changes.

Connecting to Power BI

Now, let's explore how to integrate Salesforce data with Power BI.
Download and install Power BI Desktop and launch the application.
Go to the Home tab on the Power BI Desktop ribbon.
Click the Get Data button to see the available data sources, and click More to view extra options.
In the search bar, type Salesforce and choose either Salesforce Objects (for raw data from tables) or Salesforce Reports (for pre-built reports). Click Connect.
On the Salesforce Object page that opens, select the URL type, such as Production or Custom URL, and click OK to proceed.
Click Sign In, enter your Salesforce login credentials (the username and password associated with your Salesforce account) in the corresponding window, and click Login. Then click Connect.
Once you've connected, the Navigator window shows the available Salesforce reports and objects. Select the reports or tables you wish to import into Power BI and click Load.
After loading, the data appears in the Fields pane on the right side of the screen. From there, you can easily visualize it by dragging charts and graphs onto the canvas.
To export Salesforce data to Power BI for insightful reporting and dashboards, you can leverage either the direct Salesforce connector for regular data imports or more powerful middleware solutions for advanced transformations.
Using BigQuery for advanced analytics

For large-scale Salesforce data analysis, you can use BigQuery. Choose an export method (the Data Export Service, the API, or an ETL tool) to extract the data, store it temporarily, load it into BigQuery, and then transform it as needed. After that, you can analyze it with SQL. This pipeline can be automated for regular updates, and BigQuery's scalability handles massive datasets, enabling advanced analytics beyond Salesforce's own limits.

Conclusion

Exporting Salesforce data is essential for reporting and analysis. This guide has covered several popular export methods: the Data Export Service, Data Loader, Devart ODBC drivers, and API integration. The right choice depends on your business objectives, so feel free to use the one that fits you best. Integrating Salesforce with analytics tools unlocks powerful insights that help companies make data-driven decisions. Learn more about the benefits of using the Devart ODBC driver for Salesforce to export data to ODBC-compliant platforms.

By [Victoria Shyrokova](https://blog.devart.com/author/victorias)
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Export SQL Server Data to an Excel File By [dbForge Team](https://blog.devart.com/author/dbforge) November 10, 2021

If you are in search of the optimal way to export data from SQL Server to Excel spreadsheets, look no further. Just check this article. Here we have gathered several different methods, and any of them could prove useful for your particular case. Without further ado, let's get started!

CONTENTS
1. How to export data from a SQL table to Excel using the SQL Server Import and Export Wizard
2. How to export SQL query results to Excel with headers
3. How to export data from SQL Server to an Excel file using the Data Connection Wizard
4. How to export SQL Server data to an Excel file using a T-SQL statement
5. How to export data from SQL Server to Excel using dbForge Studio for SQL Server

How to export data from a SQL table to Excel using the SQL Server Import and Export Wizard
1. Launch SSMS and connect to the required database.
2. In Object Explorer, go to the database that you want to export to Excel. Invoke the shortcut menu and go to Tasks > Export Data. The wizard opens.
3. On the Choose a Data Source page, specify the data source and server name from the drop-down list. You can also select either Windows or SQL Server Authentication and, if necessary, select another database to be exported.
4. The second page is Choose a Destination, where you can set the path to the required Excel file and select an Excel version.
5.
The next page is Specify Table Copy or Query, where you need to select Copy data from one or more tables or views.
6. On the Select Source Tables and Views page, specify the required tables and views. Additionally, you can use Edit Mappings or click Preview to see what data is going to be exported to the Excel file.
7. On the Review Data Type Mapping page, you can conveniently review the mapping of the selected tables.
8. On the Save and Run Package page, select the Run immediately check box and click Next.
9. We're almost there! On the Complete the Wizard page, you will see all the settings you have configured for this export operation. If everything is correct, click Finish to watch the export progress and enjoy the results of the successful execution.
That's it! The results will be exported to a new sheet added to your Excel file. Now let's see how to export SQL query results to Excel with column headers. Additionally, to learn how to export and import data in SQL Server, feel free to watch [this video](https://youtu.be/qF1VxSbzm4g).

How to export SQL query results to Excel with headers

The workflow here is quite similar to the previous case, with steps 1-4 being identical to those above. You then proceed to step 5, the Specify Table Copy or Query page, where you need to select Write a query to specify the data to transfer. Then, on the Provide a Source Query page, enter the required SQL statement. Click Parse to quickly check whether there are any typos in your statement. Once the SQL statement is valid, click Next to move on. On the Select Source Tables and Views page, you will see the source labeled as [Query], and a sheet called Query will be added to the selected Excel file. The rest of the workflow is identical to the previous case: on the Save and Run Package page, select the Run immediately check box.
On the Complete the Wizard page, check all the settings you have configured and click Finish to complete the export. Your query results will be exported to a new sheet added to your Excel file.

How to export data from SQL Server to an Excel file using the Data Connection Wizard

Or, to be more precise, we are talking about import here, because this operation is launched from Excel on a machine connected to SQL Server. The fastest way to launch the Data Connection Wizard is to go to the search bar, enter From SQL Server, and click the suggestion in the drop-down list.
1. On the Connect to Database Server page, enter the required Server name and choose the preferred authentication mode in the Log on credentials section.
2. On the Select Database and Table page, select the required database and one or more tables from the grid.
3. On the Save Data Connection File and Finish page, simply leave everything as it is and click Finish.
4. The final dialog is Import Data, where you select Table as the view for your data and choose whether to import it into an Existing worksheet or a New worksheet. In the former case, you can specify the starting cell. Now simply click OK, and the data will be imported into your Excel file.

How to export SQL Server data to an Excel file using a T-SQL statement

Our next export method brings us back to SSMS. You can use the [T-SQL OPENROWSET](https://docs.microsoft.com/en-us/sql/t-sql/functions/openrowset-transact-sql?view=sql-server-ver15) function to export SQL Server data to Excel.
Open the query editor via the New Query button and execute a query similar to the following against the required database:

INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Excel 12.0; Database=C:\Documents\SQL.xlsx;','SELECT * FROM [Sheet1$]')
SELECT * FROM HumanResources.Department

Please make sure you indicate a valid OLE DB provider name (Microsoft.ACE.OLEDB.12.0 in the example), Excel file name (SQL.xlsx in the example), and the table you want to export data from (HumanResources.Department in the example). Please note that the query might run into a few issues before it succeeds. If it does, check [this guide](https://www.mssqltips.com/sqlservertip/6178/read-excel-file-in-sql-server-with-openrowset-or-opendatasource/) for a list of possible errors with detailed solutions.

How to export data from SQL Server to Excel using dbForge Studio for SQL Server

Finally, you can perform versatile export operations via [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), a multifunctional IDE that covers nearly every aspect of SQL Server development, management, and administration. Naturally, [data import and export tools](https://www.devart.com/dbforge/sql/studio/data-export-import.html) are part of its core functionality. Export to Excel becomes far more customizable than in the previous cases, yet just as easy.
1. In Object Explorer, right-click the required database/table/view and click Export Data on the shortcut menu to invoke the export wizard. On its first page, Export format, select the preferred format: MS Excel (.xls) or MS Excel 2007 (.xlsx).
2. On the Source page, check your connection, database, schema, and the tables and/or views selected for export.
3. On the Output settings page, choose to export data to a single file.
Additionally, you can append a timestamp to the file name, auto-delete old files, create an archive with the exported files, and preview them.
4. On the Options page, you can configure the table grid options for the exported data. The results are conveniently shown in the Preview section.
5. On the Data formats page, you have two tabs. The first one is Columns, where you can select columns for export and check their aliases and data types. The second one is Formats, where you can change the default format settings as well as select the required binary encoding from the drop-down list.
6. On the Page print settings page, you can configure the page size, orientation, margins, and header and footer text (including the option to repeat the table header).
7. On the Exported rows page, you can choose to export all rows, the rows selected on the Data formats page, or a specified range of rows.
8. On the Errors handling page, you can specify the error processing behavior and opt to write reports to a log file at a specified path.
9. That's it! Now you only have to click Export, and the operation will be completed in a matter of moments. You will find the exported Excel file in the specified destination folder. It is also worth noting that you can save export settings to a template using the Save button in the lower left corner of the screen.
For avid users of SQL Server Management Studio, we can suggest an alternative – [dbForge Data Pump](https://www.devart.com/dbforge/sql/data-pump/), a powerful SSMS add-in that allows exporting data to 14 formats and importing data from 10 formats, including [Google Sheets](https://www.devart.com/dbforge/sql/data-pump/data-export-import-google-sheets.html). [Download a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) of dbForge Studio for SQL Server to gain some firsthand experience and see it in action.
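If you prefer a scripted route over the wizards above, query results can also be dumped with column headers from any client language. Below is a minimal Python sketch: the actual SQL Server connection (e.g., via pyodbc, where `columns` would come from `cursor.description` and `rows` from `cursor.fetchall()`) is deliberately omitted, so the function just formats already-fetched results.

```python
import csv
import io

def query_results_to_csv(columns, rows):
    """Write query results to CSV text, emitting the column headers first --
    the same outcome as selecting 'Write a query to specify the data to
    transfer' in the Import and Export Wizard. With pyodbc, `columns`
    would be [d[0] for d in cursor.description] and `rows` cursor.fetchall()."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(columns)   # header row
    writer.writerows(rows)     # data rows
    return buf.getvalue()

# Hypothetical result set from SELECT DepartmentID, Name FROM ...
text = query_results_to_csv(["DepartmentID", "Name"],
                            [(1, "Engineering"), (2, "Sales")])
```

The resulting CSV file opens directly in Excel, which makes this a lightweight alternative when installing extra tooling is not an option.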
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Export SQL Stored Procedure to a File and Generate Its Script By [dbForge Team](https://blog.devart.com/author/dbforge) May 7, 2020

In previous articles, we reviewed a general algorithm for [finding and deleting incomplete open transactions in SQL Server databases](https://blog.devart.com/how-to-delete-lost-transactions-with-sql-complete-part-1.html), created a table for tracking incomplete transactions and a CRUD stored procedure, and implemented numerous settings to make our document workflow productive and handy. Let's now export and test a stored procedure, generate its code, and turn it into a script that can be executed on any host or group of hosts by specifying the proper input parameters.

Export Stored Procedure in SQL Server

It can be quite useful to generate a script for creating the necessary database objects, for example, to export a stored procedure to a file or copy it to other solutions. To do this, follow these steps:
In Object Explorer, right-click your database.
Select Tasks from the context menu that appears.
Select the Generate Scripts command. (Fig. 1 Selecting the Generate Scripts command)
Select the objects to script. (Fig. 2 Selecting the objects you wish to script)
In the Set Scripting Options window, select Script to File. (Fig. 3 Selecting Script To File)
Run the generated script against the target database.
When all these steps are performed, we get a generated script of the exported stored procedure.

Testing a stored procedure

Let's return to the stored procedure that we created in [part 2](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html).
If we hover the cursor over a stored procedure, a drop-down menu will pop up (Fig. 4 The stored procedure drop-down menu). To work properly, your stored procedure should have a description. If one hasn't been added yet, please run the following script:

EXEC sys.sp_addextendedproperty
@name=N'MS_Description',
@value=N'Identification of frozen transactions (forgotten ones that do not have active requests) with their subsequent removal',
@level0type=N'SCHEMA', @level0name=N'srv',
@level1type=N'PROCEDURE', @level1name=N'AutoKillSessionTranBegin'
GO

Or use the object's extended properties (Fig. 5 Adding the stored procedure's description). You can also add a description using [this method](https://www.codeproject.com/Articles/5161784/Documenting-MS-SQL-Server-Databases).

Stored procedure code generation

You can generate the code by right-clicking a stored procedure and selecting Script Object as CREATE/ALTER in the drop-down menu (Fig. 6 Selecting the "Script Object as CREATE/ALTER" command in the context menu). Once you do this, a script for changing the object will open; in our case, it is the script that alters the stored procedure. If the object does not exist, a script for creating it is generated instead. This script can easily be moved to the proper MS SQL Server instances.

Turning stored procedure code into a script

You can select Convert EXEC to Script in the drop-down menu (Fig. 7 Selecting the "Convert EXEC to Script" command in the SQL Complete main menu). Instead of a stored procedure call, a script will be created: the input parameters become variables, and the stored procedure body becomes the script. It's a handy feature for testing your code (Fig. 8 The result of creating a script from a stored procedure). Now you can run the derived script on any host or group of hosts by setting proper values for the input parameters.
Query execution history

Let's now review another important feature, [SQL Complete: Execution History](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html), which displays previously executed queries. To open it, click the SQL Complete: Execution History button (Fig. 9 Choosing the "Execution History" command in the SQL Complete main menu). You will see the following window (Fig. 10 The display of the executed scripts). The top left corner of the window contains filters for words and time range, while the right corner shows the number of queries found based on the filter criteria. The bottom of the window shows the query selected in the table. The table consists of the following columns:
Status – shows whether a query completed successfully (a white checkmark in a green circle) or not (a white cross in a red circle)
QueryText – the query text
Size (Bytes) – the query size in bytes
Execution On – the date and time when the query was executed
Duration – how long the query took to execute
File – the file name and its full path
Server – the server name
User – the user who executed the script
Database – the database in which the script was executed
The Search and From/To filters search the QueryText and Execution On columns, respectively. You can also sort columns in ascending or descending order by clicking a column header; by default, the Execution On column is sorted in descending order. You can sort multiple columns at once by holding the SHIFT key. You can also set up more complex filters by clicking the filter icon (Fig. 11 Setting up column filtering). Let's now enter our stored procedure name, AutoKillSessionTranBegin, in the Search filter (Fig. 12 Stored procedure search in the "Execution History" tab). As a result, we get the full execution history of the srv.AutoKillSessionTranBegin procedure described earlier.
If you right-click the relevant table row, you can open the script in a new window or copy it to the clipboard for pasting wherever needed (Fig. 13 Copying the selected script from the "Execution History" tab). You can also open a script in a new window by double-clicking the desired row. In addition, you can specify how long the execution history is stored, the maximum query size, and the history storage path (Fig. 14 The "Execution History" command settings in SQL Complete). You can read more details about the Execution History functionality [here](https://docs.devart.com/studio-for-sql-server/writing-and-executing-sql-statements/sql-query-execution-history.html). That's all, folks. The new functionality has been developed, tested, and moved to the proper MS SQL Server instances.

[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Exporting and Importing SQL Server Database Data to a SQL Script By [dbForge Team](https://blog.devart.com/author/dbforge) September 8, 2022

We quite often face the task of migrating database dictionary data. The database for a recruitment service that we built in the previous article series is no exception. In this article, we will show you how to export data to a .sql file and then import it into the target database. You can migrate either the entire database or specific objects, depending on your goals. First of all, you should know that there are many ways to solve this problem:
With [dbForge Data Compare for SQL Server](https://dzone.com/articles/synchronizing-data-in-sql-server-databases).
Using the data export tools built into [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15). Implementing SSIS packages. However, most of the solutions above are too cumbersome when it comes to migrating just one or a few tables. In this article, we will take a look at the [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) solution, which is also part of [dbForge Studio for SQL Server](https://www.devart.com/en/dbforge/sql/studio/). This tool simplifies data import and export by offering advanced options, templates, and a range of widely used data formats for export and import. A step-by-step guide to [creating a database for a recruitment service](https://blog.devart.com/sql-database-design-basics-with-example.html) can be found [here](https://blog.devart.com/sql-database-design-basics-with-example.html). Fig.1. Database schema for a recruitment service.

How do you export SQL Server data to a SQL script?

Select data export at the database level. Let’s get started! Select the desired database, right-click the desired table (if you want to export a specific table) or the database (if you want to export several tables), and select Export Data: Click Export Data. Choose the export format. On the Export format tab, you can choose between several formats for the data export. We select SQL scripts and press Next. Choose the data to export. Now, on the Source tab, select the tables from which data should be exported, then click Next. In this case, we select three lookup tables:

1. “dbo.Company” – the list of companies.
2. “dbo.Position” – the list of positions.
3. “dbo.Skill” – the list of skills.

Note that you can change the connection and the database at any time. Choose the script generation type. Then, on the Options tab, select the script generation type for the data export and specify whether you want to include the database name in the script. Then click Next. Note that the window offers 4 script generation types for data export:

1. INSERT. A script for inserting data is generated.
2. UPDATE. A script for updating data is generated, i.e., matching key fields are found and the update is performed.
3. DELETE. A script for deleting data is generated, i.e., all data on the target database side that matches the exported data by key fields is deleted.
4. MERGE. A script for merging data is generated, which combines the first two types: INSERT and UPDATE.

Select the columns and key fields for the export. On the Table columns tab, select the required columns and key fields for the export (by default, all columns are selected, and the key fields correspond to the primary key definitions of the respective tables). Then click Next. Select the rows to export. Next, on the Exported rows tab, select the data to export and click Next. Note that you can export either all rows or an exact range of rows. Configure the Errors handling tab. Additionally, on the Errors handling tab, you can configure error-handling parameters.
Note that users often choose the Write a report to a log file option when they need to analyze the report results. For simplicity, keep the default options and click Export to start the data export. Complete the data export. When the export has finished, you can either click Finish or open the folder with the generated scripts by clicking the Open result folder button: View the scripts. As a result, 3 scripts are generated, one for each lookup table. The T-SQL script looks as follows:

```sql
SET DATEFORMAT ymd
SET ARITHABORT, ANSI_PADDING, ANSI_WARNINGS, CONCAT_NULL_YIELDS_NULL, QUOTED_IDENTIFIER, ANSI_NULLS, NOCOUNT ON
SET NUMERIC_ROUNDABORT, IMPLICIT_TRANSACTIONS, XACT_ABORT OFF
GO

SET IDENTITY_INSERT JobEmplDB.dbo.Skill ON
GO
INSERT JobEmplDB.dbo.Skill(SkillID, SkillName) VALUES (689, N'C#')
...
INSERT JobEmplDB.dbo.Skill(SkillID, SkillName) VALUES (14, N'SQL')
GO
SET IDENTITY_INSERT JobEmplDB.dbo.Skill OFF
GO
```

You then need to apply the generated scripts to the target database. But what if the data was saved in a different format? For that, there is data import, which you can open by right-clicking the database or the desired table: Fig.12. Selecting data import at the database level. Fig.13. Selecting data import at the table level. Proceed the same way as we did for the data export. Also visit the documentation center to learn more about [how to import data from a CSV file](https://docs.devart.com/studio-for-sql-server/exporting-and-importing-data/csv-import.html). CSV is a compact text format used for storing tabular data.
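As an aside, the per-row INSERT pattern in the generated script above is easy to reproduce. Here is a rough Python sketch (purely illustrative – the function name and logic are ours, not how Data Pump actually works internally) of building such a script from in-memory rows:

```python
def insert_script(table, columns, rows):
    """Build one T-SQL INSERT statement per row, quoting strings as N'...'."""
    statements = []
    for row in rows:
        values = ", ".join(
            "N'{}'".format(str(v).replace("'", "''")) if isinstance(v, str) else str(v)
            for v in row
        )
        statements.append(
            "INSERT {}({}) VALUES ({})".format(table, ", ".join(columns), values)
        )
    return "\n".join(statements)

script = insert_script("JobEmplDB.dbo.Skill", ["SkillID", "SkillName"],
                       [(689, "C#"), (14, "SQL")])
print(script)
```

Note the single quotes are doubled inside string values, the standard T-SQL escape, so that names like O'Brien survive the round trip.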
Moreover, it is a very widespread format, since most modern spreadsheet applications (such as Excel) can work with CSV files (for data export/import). That’s it for now. This time we looked at importing and exporting data to a SQL file using a highly customizable solution from dbForge Studio for SQL Server. Tags [data export](https://blog.devart.com/tag/data-export) [data import](https://blog.devart.com/tag/data-import) [data pump](https://blog.devart.com/tag/data-pump) [sql file](https://blog.devart.com/tag/sql-file) [sql script](https://blog.devart.com/tag/sql-script) [SQL Server](https://blog.devart.com/tag/sql-server) [SSIS](https://blog.devart.com/tag/ssis) [ssms](https://blog.devart.com/tag/ssms) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/fast-and-simple-database-application-development-and-deployment-to-linux-in-rad-studio.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) Fast and Simple Database App Development and Deployment to Linux in RAD Studio By [DAC Team](https://blog.devart.com/author/dac) April 20, 2017. Devart has released [UniDAC](https://www.devart.com/unidac/) with support for the Linux 64-bit platform. UniDAC makes application development and maintenance easier because Direct mode does not require the installation of client libraries, additional drivers, etc. in a custom application. This avoids overhead when accessing a DBMS and thus increases performance. So, in this article, we will demonstrate UniDAC’s ability to establish a connection to [various DBMSs](https://www.devart.com/unidac/compatibility.html) in Direct mode: ASE, DBF, MongoDB, MySQL, Oracle, PostgreSQL, SQLite, SQL Azure, and SQL Server.

Creating a console application

Let’s create a new UniDAC application for Linux 64-bit in RAD Studio 10.2 Tokyo. Go to File on the main menu, click New, then click Other. In the dialog box that appears, click Console Application.
Configuring UniDAC to connect in Direct mode

To use UniDAC in a console application, you should add the Uni unit to the uses section, as well as the unit with the UniDAC provider for each DBMS. Let’s add providers for all the DBMSs mentioned at the beginning of the article:

```delphi
program UniDAC_Linux;
{$APPTYPE CONSOLE}
{$R *.res}
uses
  SysUtils,
  Uni,
  ASEUniProvider,        // add this unit for ASE
  DBFUniProvider,        // add this unit for DBF
  MongoDBUniProvider,    // add this unit for MongoDB
  MySQLUniProvider,      // add this unit for MySQL
  OracleUniProvider,     // add this unit for Oracle
  PostgreSQLUniProvider, // add this unit for PostgreSQL
  SQLiteUniProvider,     // add this unit for SQLite
  SQLServerUniProvider;  // add this unit for SQL Server & SQL Azure
```

Creating connection and dataset instances, as well as executing SQL queries and fetching data, is similar for all UniDAC providers:

```delphi
var
  UniConnection: TUniConnection;
  UniQuery: TUniQuery;
begin
  UniConnection := TUniConnection.Create(nil);
  UniQuery := TUniQuery.Create(nil);
  UniQuery.Connection := UniConnection;
```

Establishing a connection and fetching data in Direct mode

The Deployment tab contains only the application file – there are no additional libraries or files. Delphi code for ASE:

```delphi
  UniConnection.ProviderName := 'ASE';
  UniConnection.SpecificOptions.Values['Direct'] := 'True';
  UniConnection.Server := 'ASE_DB';
  UniConnection.Port := 5000;
  UniConnection.Username := 'sa';
  UniConnection.Password := '*****';
  UniConnection.Database := 'DEMO';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC ASE Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select @@version');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

ASE execution result. Delphi code for DBF:

```delphi
  UniConnection.ProviderName := 'DBF';
  UniConnection.SpecificOptions.Values['DBFFormat'] := 'dfdBaseVII';
  UniConnection.SpecificOptions.Values['Direct'] := 'True';
  UniConnection.Database := '/home/test/Documents';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC DBF Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

DBF execution result. Delphi code for MongoDB:

```delphi
  UniConnection.ProviderName := 'MongoDB';
  UniConnection.Server := 'MongoDBServer';
  UniConnection.Port := 27017;
  UniConnection.SpecificOptions.Values['BSONLibrary'] := '/usr/lib64/libbson-1.0.so';
  UniConnection.SpecificOptions.Values['ClientLibrary'] := '/usr/lib64/libmongoc-1.0.so';
  UniConnection.Database := 'demo';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC MongoDB Provider ==--'+#13#10);
  Writeln(UniConnection.ServerVersionFull);
  MongoDB_Insert_EMP(UniQuery);
  WritelnQuery(UniQuery, '{"find":"emp", projection:{_id:0, empno:1, ename:1, job:1, hiredate:1}}');
```

MongoDB execution result. Delphi code for MySQL:

```delphi
  UniConnection.ProviderName := 'MySQL';
  UniConnection.Server := 'MySQL_db';
  UniConnection.Port := 3312;
  UniConnection.Username := 'root';
  UniConnection.Password := '*****';
  UniConnection.Database := 'demo';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC MySQL Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select @@version');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

MySQL execution result. Delphi code for Oracle:

```delphi
  UniConnection.ProviderName := 'Oracle';
  UniConnection.SpecificOptions.Values['Direct'] := 'True';
  UniConnection.Server := 'ORCL12C:1521/pdborcl';
  UniConnection.Username := 'scott';
  UniConnection.Password := '******';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC Oracle Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select * from v$version');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

Oracle execution result. Delphi code for PostgreSQL:

```delphi
  UniConnection.ProviderName := 'PostgreSQL';
  UniConnection.Server := 'pg_db';
  UniConnection.Database := 'demo';
  UniConnection.Username := 'postgres';
  UniConnection.Password := '******';
  UniConnection.Port := 5432;
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC PostgreSQL Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select version()');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

PostgreSQL execution result. Delphi code for SQLite:

```delphi
  UniConnection.ProviderName := 'SQLite';
  UniConnection.SpecificOptions.Values['Direct'] := 'True';
  UniConnection.SpecificOptions.Values['ForceCreateDatabase'] := 'True';
  UniConnection.Database := ':memory:';
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC SQLite Provider ==--'+#13#10);
  Writeln(UniConnection.ServerVersionFull);
  SQLite_Create_EMP(UniQuery);
  SQLite_Insert_EMP(UniQuery);
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

SQLite execution result. Delphi code for SQL Azure:

```delphi
  UniConnection.Disconnect;
  UniConnection.ProviderName := 'SQL Server';
  UniConnection.SpecificOptions.Values['Provider'] := 'prDirect';
  UniConnection.Server := 'qps1hrvdke.database.windows.net';
  UniConnection.Database := 'DEMO';
  UniConnection.Username := '****@qps1hrvdke';
  UniConnection.Password := '*******';
  UniConnection.Port := 1433;
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC SQL Server(Azure) Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select @@version');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

SQL Azure execution result. Delphi code for SQL Server:

```delphi
  UniConnection.Disconnect;
  UniConnection.ProviderName := 'SQL Server';
  UniConnection.SpecificOptions.Values['Provider'] := 'prDirect';
  UniConnection.Server := '192.168.0.15\MSSQL2016';
  UniConnection.Database := 'DEMO';
  UniConnection.Username := 'sa';
  UniConnection.Password := '*****';
  UniConnection.Port := 1433;
  UniConnection.Connect;
  Writeln(#13#10+'--== UniDAC SQL Server Provider ==--'+#13#10);
  WritelnQuery(UniQuery, 'select @@version');
  WritelnQuery(UniQuery, 'select empno, ename, job, hiredate from emp');
```

SQL Server execution result. (WritelnQuery, MongoDB_Insert_EMP, SQLite_Create_EMP, and SQLite_Insert_EMP are helper routines from the sample project linked below.) Here is the complete project source code: [UniDAC_Linux](https://blog.devart.com/wp-content/uploads/2017/04/UniDAC_Linux.zip)

Conclusion

In this article, we showed how easily you can create applications for Linux without deploying additional files, client libraries, or drivers. You only need to install UniDAC and write a few lines of code, and your application for working with databases on Linux is complete. Tags [delphi](https://blog.devart.com/tag/delphi) [direct mode](https://blog.devart.com/tag/direct-mode) [linux](https://blog.devart.com/tag/linux) [performance](https://blog.devart.com/tag/performance) [rad studio](https://blog.devart.com/tag/rad-studio) [unidac](https://blog.devart.com/tag/unidac) [DAC Team](https://blog.devart.com/author/dac)

6 COMMENTS

Peter Edwards, April 27, 2017, at 3:08 pm: When you refer to DBF – does that mean dBase-compatible files? Thanks.

DAC Team, April 27, 2017, at 3:37 pm: Hello, Peter! Thank you for the comment. You can find the list of supported DBF formats at this link: [https://www.devart.com/unidac/compatibility.html](https://www.devart.com/unidac/compatibility.html)

Kerbadou Ghazali, May 27, 2018, at 4:02 pm: Hi, thanks for the great products! Where can we find our PostgreSQL database located on Windows 7?

DAC Team, May 29, 2018, at 10:06 am: Hello, Kerbadou! The question about DB administration is not related to UniDAC functionality. Please contact your DB administrator or PostgreSQL provider to get an answer to this question.

Hur AKDULGER, December 31, 2018, at 4:22 pm: Will you support Informix in the near future? Regards, Hur.

DAC Team, January 2, 2019, at 2:32 pm: Hello, Hur! If you have an ODBC driver for Informix, you can work with Informix using UniDAC and this driver. Working with Informix without ODBC is not on our roadmap.
"} {"url": "https://blog.devart.com/fastest_way_to_calculate_the_record_count.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) What is the fastest way to calculate the record COUNT? By [Sergey Syrovatchenko](https://blog.devart.com/author/sergeys) December 15, 2015. I have always liked simple questions with lots of pitfalls. Here is one: how do you count the total number of records in a table? At first sight, it’s a snap, but if you dig a little deeper, you can reveal lots of peculiar nuances. So, let’s start with a simple thing. Do the following queries differ in terms of the end result?

```sql
SELECT COUNT(*) FROM Sales.SalesOrderDetail
SELECT COUNT_BIG(*) FROM Sales.SalesOrderDetail
```

Most of you will say there is no difference. The queries return identical results, but COUNT returns a value of the INT type, while COUNT_BIG returns a value of the BIGINT type. If we analyze the execution plan, we will notice differences that are often overlooked. When using COUNT, the plan shows a Compute Scalar operation. If we look at the operator properties, we will see the following:

```
[Expr1003] = Scalar Operator(CONVERT_IMPLICIT(int,[Expr1004],0))
```

This happens because COUNT_BIG is used implicitly when calling COUNT, and the result is then converted to INT. Remember that data type conversion increases processor load. Many of you may say that this operator is not a big deal in terms of execution. However, there is a thing worth mentioning: SQL Server tends to underestimate Compute Scalar operators. Nevertheless, the example above does not require worrying about performance – the truncation of Int64 to Int32 does not require many resources.
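To put numbers on why the return type matters: INT tops out at 2^31−1, so once a table grows past roughly 2.15 billion rows, plain COUNT can no longer represent the result and COUNT_BIG becomes mandatory. A quick Python check of the bounds (the row count here is a made-up example):

```python
INT32_MAX = 2**31 - 1  # upper bound of SQL Server's INT type
INT64_MAX = 2**63 - 1  # upper bound of SQL Server's BIGINT type

rows = 3_000_000_000   # hypothetical row count of a very large table

print(rows > INT32_MAX)  # True  -> COUNT's INT result cannot hold this
print(rows > INT64_MAX)  # False -> COUNT_BIG's BIGINT result can
```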
I also know people who like using SUM instead of COUNT:

```sql
SELECT SUM(1) FROM Sales.SalesOrderDetail
```

This variant is approximately identical to COUNT – we also get an excessive Compute Scalar in the execution plan:

```
[Expr1003] = Scalar Operator(CASE WHEN [Expr1004]=(0) THEN NULL ELSE [Expr1005] END)
```

Let’s explode one myth. If you indicate a constant value in COUNT, the query won’t become faster, since the optimizer creates identical execution plans for these queries:

```sql
SELECT COUNT_BIG(*) FROM Sales.SalesOrderDetail
SELECT COUNT_BIG(1) FROM Sales.SalesOrderDetail
```

Now, let’s dwell on performance issues. The queries above require a Full Index Scan (or a Full Table Scan if it is a heap table) to count the records, and these operations are far from fast. The best way to get the record count is to use the [sys.dm_db_partition_stats](https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2012/ms187737(v=sql.110)) or [sys.partitions](https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2012/ms175012(v=sql.110)) system views (there is also [sysindexes](https://docs.microsoft.com/en-us/sql/relational-databases/system-compatibility-views/sys-sysindexes-transact-sql?view=sql-server-2017), but it is kept only for backward compatibility with SQL Server 2000).

```sql
USE AdventureWorks2012
GO

SET STATISTICS IO ON
SET STATISTICS TIME ON
GO

SELECT COUNT_BIG(*)
FROM Sales.SalesOrderDetail

SELECT SUM(p.[rows])
FROM sys.partitions p
WHERE p.[object_id] = OBJECT_ID('Sales.SalesOrderDetail')
    AND p.index_id < 2

SELECT SUM(s.row_count)
FROM sys.dm_db_partition_stats s
WHERE s.[object_id] = OBJECT_ID('Sales.SalesOrderDetail')
    AND s.index_id < 2
```

If we compare execution plans, access to the system views is less costly. If we test on AdventureWorks, the advantage of the system views is not that obvious:

```
Table 'SalesOrderDetail'. Scan count 1, logical reads 276, ...
  SQL Server Execution Times:
    CPU time = 12 ms, elapsed time = 26 ms.

Table 'sysrowsets'. Scan count 1, logical reads 5, ...
  SQL Server Execution Times:
    CPU time = 4 ms, elapsed time = 4 ms.

Table 'sysidxstats'. Scan count 1, logical reads 2, ...
  SQL Server Execution Times:
    CPU time = 2 ms, elapsed time = 1 ms.
```

Execution times for a partitioned table with 30 million records:

```
Table 'big_test'. Scan count 6, logical reads 114911, ...
  SQL Server Execution Times:
    CPU time = 4859 ms, elapsed time = 5079 ms.

Table 'sysrowsets'. Scan count 1, logical reads 25, ...
  SQL Server Execution Times:
    CPU time = 0 ms, elapsed time = 2 ms.

Table 'sysidxstats'. Scan count 1, logical reads 2, ...
  SQL Server Execution Times:
    CPU time = 0 ms, elapsed time = 2 ms.
```

In case you only need to check whether any records exist in a table, using metadata does not deliver any particular advantage:

```sql
IF EXISTS(SELECT * FROM Sales.SalesOrderDetail)
    PRINT 1

IF EXISTS(
        SELECT * FROM sys.dm_db_partition_stats
        WHERE [object_id] = OBJECT_ID('Sales.SalesOrderDetail')
            AND row_count > 0
    ) PRINT 1
```

```
Table 'SalesOrderDetail'. Scan count 1, logical reads 2,...
  SQL Server Execution Times:
    CPU time = 1 ms, elapsed time = 3 ms.

Table 'sysidxstats'. Scan count 1, logical reads 2,...
  SQL Server Execution Times:
    CPU time = 4 ms, elapsed time = 5 ms.
```

In practical terms, it will be a bit slower, since SQL Server generates a more complicated execution plan for selecting from metadata. Here is one more case I encountered:

```sql
IF (SELECT COUNT(*) FROM Sales.SalesOrderHeader) > 0
    PRINT 1
```

The optimizer simplifies this case to the plan we got with EXISTS. It gets more interesting when we need to count the number of records for all tables at once. In my practice, I encountered several variants.
Variant #1 uses an undocumented procedure that iterates over all user tables with a cursor:

```sql
IF OBJECT_ID('tempdb.dbo.#temp') IS NOT NULL
    DROP TABLE #temp
GO
CREATE TABLE #temp (obj SYSNAME, row_count BIGINT)
GO

EXEC sys.sp_MSForEachTable @command1 = 'INSERT #temp SELECT ''?'', COUNT_BIG(*) FROM ?'

SELECT *
FROM #temp
ORDER BY row_count DESC
```

Variant #2 – dynamic SQL that generates the SELECT COUNT(*) queries:

```sql
DECLARE @SQL NVARCHAR(MAX)

SELECT @SQL = STUFF((
    SELECT 'UNION ALL SELECT ''' + SCHEMA_NAME(o.[schema_id]) + '.' + o.name + ''', COUNT_BIG(*)
    FROM [' + SCHEMA_NAME(o.[schema_id]) + '].[' + o.name + ']'
    FROM sys.objects o
    WHERE o.[type] = 'U'
        AND o.is_ms_shipped = 0
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 10, '') + ' ORDER BY 2 DESC'

PRINT @SQL
EXEC sys.sp_executesql @SQL
```

Variant #3 – a fast variant for everyday use:

```sql
SELECT SCHEMA_NAME(o.[schema_id]), o.name, t.row_count
FROM sys.objects o
JOIN (
    SELECT p.[object_id], row_count = SUM(p.row_count)
    FROM sys.dm_db_partition_stats p
    WHERE p.index_id < 2
    GROUP BY p.[object_id]
) t ON t.[object_id] = o.[object_id]
WHERE o.[type] = 'U'
    AND o.is_ms_shipped = 0
ORDER BY t.row_count DESC
```

In spite of all the praise I heaped upon system views, there are still some unexpected “pleasures” you may experience when working with them. I remember an amusing bug: system views were updated incorrectly during migration from SQL Server 2000 to 2005. The most “lucky” ones received incorrect row-count values from metadata. The fix was [DBCC UPDATEUSAGE](https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2012/ms188414(v=sql.110)). With SQL Server 2005 SP1, this bug was fixed and everything seemed to be fine. However, I faced the same problem once more when restoring a backup from SQL Server 2005 SP4 to SQL Server 2012 SP2.
I couldn’t reproduce it in a real environment, so I tricked the optimizer a bit:

```sql
UPDATE STATISTICS Person.Person WITH ROWCOUNT = 1000000000000000000
```

Let’s consider a simple example. Execution of the simplest query began to take longer than usual:

```sql
SELECT FirstName, COUNT(*)
FROM Person.Person
GROUP BY FirstName
```

Viewing the query plan revealed a completely inadequate value of EstimatedNumberOfRows. Then I viewed the clustered index statistics:

```sql
DECLARE @SQL NVARCHAR(MAX)
DECLARE @obj SYSNAME = 'Person.Person'
SELECT @SQL = 'DBCC SHOW_STATISTICS(''' + @obj + ''', ' + name + ') WITH STAT_HEADER'
FROM sys.stats
WHERE [object_id] = OBJECT_ID(@obj)
    AND stats_id < 2

EXEC sys.sp_executesql @SQL
```

Everything was all right there. The aforementioned system views, however, were far from okay:

```sql
SELECT rowcnt
FROM sys.sysindexes
WHERE id = OBJECT_ID('Person.Person')
    AND indid < 2

SELECT SUM([rows])
FROM sys.partitions p
WHERE p.[object_id] = OBJECT_ID('Person.Person')
    AND p.index_id < 2
```

The query did not contain filtering predicates, and the optimizer chose a Full Index Scan. During a Full Index/Table Scan, the optimizer takes the expected number of rows from metadata instead of statistics (I’m not quite sure whether this always happens). It’s no secret that SQL Server generates the execution plan on the basis of the estimated number of rows and calculates the memory required for its execution. An incorrect estimate may lead to the allocation of excessive memory.
Here is the result of the incorrect row-count estimate:

```
session_id query_cost     requested_memory_kb granted_memory_kb required_memory_kb used_memory_kb
---------- -------------- ------------------- ----------------- ------------------ --------------
56         11331568390567 769552              769552            6504               6026
```

The problem was resolved in a fairly straightforward way:

```sql
DBCC UPDATEUSAGE(AdventureWorks2012, 'Person.Person') WITH COUNT_ROWS
DBCC FREEPROCCACHE
```

After query recompilation, things settled into shape:

```
session_id query_cost        requested_memory_kb granted_memory_kb required_memory_kb used_memory_kb
---------- ----------------- ------------------- ----------------- ------------------ --------------
52         0,291925808638711 1168                1168              1024               952
```

If system views no longer serve as a magic wand, what else can we do? Well, we can fall back on the old-school practice:

```sql
SELECT COUNT_BIG(*) FROM ...
```

But I would not rely much on the result during intensive insertion into the table. Moreover, the “magic” NOLOCK hint still does not guarantee a correct value:

```sql
SELECT COUNT_BIG(*) FROM ... WITH(NOLOCK)
```

As a matter of fact, to get the correct number of records in the table, we need to execute the query under the SERIALIZABLE isolation level. Alternatively, we can use the TABLOCKX hint:

```sql
SELECT COUNT_BIG(*) FROM ... WITH(TABLOCKX)
```

As a result, we get an exclusive lock on the table for the duration of the query. Which is better? Decide for yourself. My choice is metadata.
It gets even more interesting when you need to count the number of rows by condition:

```sql
SELECT City, COUNT_BIG(*)
FROM Person.[Address]
--WHERE City = N'London'
GROUP BY City
```

If there are no frequent insert-delete operations on the table, we can create an indexed view:

```sql
IF OBJECT_ID('dbo.CityAddress', 'V') IS NOT NULL
    DROP VIEW dbo.CityAddress
GO

CREATE VIEW dbo.CityAddress
WITH SCHEMABINDING
AS
    SELECT City, [Rows] = COUNT_BIG(*)
    FROM Person.[Address]
    GROUP BY City
GO

CREATE UNIQUE CLUSTERED INDEX IX ON dbo.CityAddress (City)
```

For these queries, the optimizer will generate an identical plan based on the clustered index of the view:

```sql
SELECT City, COUNT_BIG(*)
FROM Person.[Address]
WHERE City = N'London'
GROUP BY City

SELECT *
FROM dbo.CityAddress
WHERE City = N'London'
```

Here are the execution plans with and without an indexed view. In this post, I wanted to show that there are no one-size-fits-all solutions; you should act as the occasion requires in each particular case. All tests were performed on SQL Server 2012 SP3 (11.00.6020). Execution plans were taken from SSMS 2014 and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/sql-query-profiler.html#header).

Conclusion

When I need to count the number of table rows, I use metadata – it is the fastest way. Do not be afraid of the old bug case I described. If I need to quickly count the number of rows by a particular column or by condition, I try using indexed views or filtered indexes. It all depends on the situation. When a table is small or performance is not at stake, the old-school SELECT COUNT(*) will be the best option.
3 COMMENTS

Anil, December 18, 2015 at 5:21 am:
Nice information. Thanks a lot for explaining.

taskeen, May 4, 2016 at 11:05 am:
Thanks, but what if we want to search on a certain column in the Sales.SalesOrderDetail table, i.e.
where SalesOrderDetail.Date = '01/01/2016'. Thanks!

Sergey Syrovatchenko, May 4, 2016 at 1:59 pm:
In this case, you need to create an indexed view:

```sql
CREATE VIEW Sales.vw_SalesOrderDetail
WITH SCHEMABINDING
AS
    SELECT [Date], [Rows] = COUNT_BIG(*)
    FROM Sales.SalesOrderDetail
    GROUP BY [Date]
GO

CREATE UNIQUE CLUSTERED INDEX IX ON Sales.vw_SalesOrderDetail ([Date])
GO

SELECT [Rows]
FROM Sales.vw_SalesOrderDetail
WHERE [Date] = '20160101'
```

Comments are closed.

[ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [How To](https://blog.devart.com/category/how-to) Fetch Zoho Books Data in a .NET Application By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), August 28, 2024

Zoho Books is online accounting software that helps you manage finances, automate business workflows, and streamline collaboration across departments. Integrating Zoho Books data into a .NET application can be extremely beneficial when handling large amounts of data, as it provides a simplified reporting method and saves your team's time. Fetching data from Zoho Books into an external application can enhance data analysis, reporting, and integration with other business processes. It allows users to leverage Zoho Books data in custom applications for tailored workflows and improved decision-making. dotConnect for Zoho Books offers integration with .NET applications, high performance, and comprehensive security. With dotConnect, users can execute SQL queries directly against Zoho Books data, eliminating the need to learn complex APIs. This tutorial will show you how to connect Zoho Books to your .NET application to fetch data (invoices).
Table of Contents

- Setting Up Your .NET Environment
- How to Get Zoho Books Refresh Token
- How to Connect to Zoho Books API From .NET Application
- How to Fetch Invoices From Zoho Books
- Create a UI and Manage Zoho Books Invoices in Your App
- Conclusion

Setting Up Your .NET Environment

Before we start, let's briefly overview the prerequisites you need to set up the connectivity successfully. If you don't have access to some of the tools and accounts listed below, take the time to get them before you proceed.

- [Visual Studio 2022](https://visualstudio.microsoft.com/downloads/). If you don't have Visual Studio on your machine, our IDE of choice will be Visual Studio 2022. We will be using the Community edition, which you can download and install.
- [dotConnect for Zoho Books](https://www.devart.com/dotconnect/zohobooks/) is a feature-rich tool for incorporating Zoho Books data into our .NET applications.
- A [Zoho Books](https://www.zoho.com/books/) account with some invoices already created. To avoid any issues along the way, please note that your user should have a [specific role](https://www.zoho.com/books/help/settings/users.html).
- [Zoho Developer](https://www.zoho.com/developer/) console configurations for the Client ID, Client Secret, and Redirect URL.

Now, let's move on to the preparation steps. Follow the guide below to set up a new .NET application project.

1. Open the Visual Studio that you have installed and click the Create a new project option.
2. In the Search Template field, search for ASP.NET MVC and select it.
3. Give your project a name. For example, I have called mine Zoho_Books_.NET. Then, select .NET 8 as the target framework and click Create.

Next, we are going to install dotConnect for Zoho Books in our project via the NuGet Package Manager in Visual Studio. In the taskbar, click Tools, navigate to NuGet Package Manager, and then proceed to Manage NuGet Packages for Solution. The NuGet Package Manager page will open up.
On this page, click Browse, search for Devart.Data.ZohoBooks, and click Devart.Data.ZohoBooks. Then select your project and click Install. After installing the NuGet package, activate the trial version or enter your license key by navigating to the official website and downloading the dczohobooks installer. Execute the downloaded file and follow the instructions to install dotConnect for Zoho Books on your machine. When you run the application by pressing F5, the scaffolded web page should appear. This page serves as a starting point for your application, allowing you to build further functionality that interacts with Zoho Books data, such as displaying, updating, or managing financial records directly from your web application.

How to Get Zoho Books Refresh Token

Before we can interact with the Zoho API, we need to obtain a token that grants access to the Client service specified in the Zoho Developer Console. To get the Client ID and Client Secret, follow the steps described in this [tutorial](https://www.zoho.com/accounts/protocol/oauth-setup.html). To get the refresh token, we'll have to get an authorization code first. Let's create a class called AccessCodeResponse and copy the code below into it:

```csharp
public class AccessCodeResponse
{
    public string access_token { get; set; }
    public string refresh_token { get; set; }
    public string scope { get; set; }
    public string api_domain { get; set; }
    public string token_type { get; set; }
    public int expires_in { get; set; }
}
```

Next, create a class called AuthorizationClass in your solution and paste the code you see below into it.
```csharp
public class AuthorizationClass
{
    private const string AuthorizationEndpoint = "https://accounts.zoho.com/oauth/v2/auth";
    private const string ClientId = "1000.LR49IV7TR1EM9PJX3H05YY7CAO4QCY";
    private const string ClientSecret = "2fdc969392bc3b60c34855fa5a3aab217e7cb696f2";
    private const string RedirectUri = "https://localhost:3000/oauth/callback/";
    private const string Scope = "ZohoBooks.fullaccess.all";
    private const string License = "********";

    public async Task<string> GetAuthorizationCodeAsync()
    {
        string accessToken = "";
        string AccessCodeEndpoint;
        string authorizationUrl = $"{AuthorizationEndpoint}?scope={Scope}&client_id={ClientId}&redirect_uri={Uri.EscapeDataString(RedirectUri)}&response_type=code&access_type=Offline&prompt=consent&licensekey={License}";

        // Open the default browser with the authorization URL
        Process browser = Process.Start(new ProcessStartInfo(authorizationUrl) { UseShellExecute = true });

        Console.WriteLine("Please log in and authorize the application.");
        Console.WriteLine("After authorization, you will be redirected to a page that may show an error.");
        Console.WriteLine("Copy the entire URL from your browser's address bar and paste it here:");

        string redirectedUrl = Console.ReadLine();

        // Close the browser
        browser?.Close();

        // Parse the authorization code from the URL
        Uri uri = new Uri(redirectedUrl);
        string authCode = System.Web.HttpUtility.ParseQueryString(uri.Query).Get("code");
        AccessCodeEndpoint = System.Web.HttpUtility.ParseQueryString(uri.Query).Get("accounts-server");

        if (string.IsNullOrEmpty(authCode))
        {
            throw new Exception("Authorization code not found in the redirected URL.");
        }

        var parameters = new Dictionary<string, string>
        {
            { "client_id", ClientId },
            { "grant_type", "authorization_code" },
            { "client_secret", ClientSecret },
            { "redirect_uri", RedirectUri },
            { "code", authCode }
        };

        using (var client = new HttpClient())
        {
            var content = new FormUrlEncodedContent(parameters);
            var response = await client.PostAsync($"{AccessCodeEndpoint}/oauth/v2/token", content);

            if (response.IsSuccessStatusCode)
            {
                var responseContent = await response.Content.ReadAsStringAsync();
                var root = JsonConvert.DeserializeObject<AccessCodeResponse>(responseContent);
                accessToken = root.refresh_token;
            }
            else
            {
                Console.WriteLine($"Error: {response.StatusCode}");
            }
        }
        return accessToken;
    }
}
```

That's a lot of code, so let's briefly go through it. In this example, we have created a method called GetAuthorizationCodeAsync. The method starts by opening the authorizationUrl in the default browser. After authorization, the response appears as a URL in the browser's address bar; we copy that URL and paste it into the console that opens when the application runs. The authorization code and accounts-server values parsed from the pasted URL are then used in another API call to obtain the access and refresh tokens. Remember to replace ClientId, ClientSecret, and RedirectUri with the correct values configured in your Zoho API Console. We will walk through all these steps later.

Now, let's create a new class called ConnectionClass to simulate a connection to the Zoho Books database instance. Go ahead and add the piece of code showcased below.
```csharp
public class ConnectionClass
{
    private const string ClientId = "1000.LR49IV7TR1EM9PJX3H05YY7CAO4QCY";
    private const string ClientSecret = "2fdc969392bc3b60c34855fa5a3aab217e7cb696f2";
    private const string License = "********";

    public void CreateZohoBooksConnection(string token)
    {
        Devart.Data.ZohoBooks.ZohoBooksConnection myConn = new Devart.Data.ZohoBooks.ZohoBooksConnection();
        myConn.ConnectionString = $"ClientId={ClientId};ClientSecret={ClientSecret};RefreshToken={token};LicenseKey={License}";
        myConn.Open();
        string connMessage = "Connection is " + myConn.State.ToString();
        Console.WriteLine(connMessage);
        myConn.Close();
    }
}
```

This class uses the refresh token obtained from GetAuthorizationCodeAsync, along with our ClientId and ClientSecret, to open a connection to the Zoho Books API. Next, under the Controllers folder, navigate to HomeController and replace its Index method with the following code:

```csharp
public IActionResult Index()
{
    AuthorizationClass authorizationClass = new AuthorizationClass();
    string code = authorizationClass.GetAuthorizationCodeAsync().Result;
    Console.WriteLine($"Authorization code: {code}");
    ConnectionClass connectionClass = new ConnectionClass();
    connectionClass.CreateZohoBooksConnection(code);
    return View();
}
```

This piece of code will open a connection to the Zoho Books database when the application starts up.

How to Connect to Zoho Books API From .NET Application

Now that we have completed all the preparations, let's finally connect to the Zoho Books API. To showcase the process, we'll run the application that pulls the data. When you run the application, an authorization page comes up asking whether the application should be granted access to the Zoho client. Click Accept. After that, you'll be redirected to the Zoho page with the access code embedded in the URL. Simultaneously, a console will open up.
Go ahead, copy that URL into the console, and press Enter. If all goes well, you will get a message in the console that the connection has been opened. Please note that the code embedded in the URL is time-bound, so you will need to copy and paste it into the console quickly, before it expires.

How to Fetch Invoices From Zoho Books

Okay, so we have been able to open a connection to our Zoho Books database. Let's try to retrieve some data; specifically, we'll be pulling the invoices from our Zoho Books database. Modify your ConnectionClass so it looks like this:

```csharp
public class ConnectionClass
{
    private const string ClientId = "1000.LR49IV7TR1EM9PJX3H05YY7CAO4QCY";
    private const string ClientSecret = "2fdc969392bc3b60c34855fa5a3aab217e7cb696f2";
    private const string License = "********";

    public DataTable GetInvoices(string token)
    {
        Devart.Data.ZohoBooks.ZohoBooksConnection myConn = new Devart.Data.ZohoBooks.ZohoBooksConnection();
        myConn.ConnectionString = $"ClientId={ClientId};ClientSecret={ClientSecret};RefreshToken={token};LicenseKey={License}";
        DataTable dt;
        using (ZohoBooksConnection connection = new ZohoBooksConnection(myConn.ConnectionString))
        {
            connection.Open();
            string query = "SELECT InvoiceNumber, ReferenceNumber, CustomerName, CompanyName, Date, DueDate, CurrencyCode, Total FROM Invoices";
            Devart.Data.ZohoBooks.ZohoBooksDataAdapter dataAdapter = new Devart.Data.ZohoBooks.ZohoBooksDataAdapter(query, connection);
            DataSet data = new DataSet();
            dataAdapter.Fill(data, "Invoices");
            dt = data.Tables["Invoices"];
        }
        myConn.Close();
        return dt;
    }
}
```

Note that we are retrieving only a few columns for the sake of brevity.

Create a UI and Manage Zoho Books Invoices in Your App

Let's create a simple UI to view the data we have just retrieved.
Add this method to the HomeController class:

```csharp
static string GenerateHtmlTable(DataTable dataTable)
{
    StringBuilder htmlBuilder = new StringBuilder();

    // Add border and spacing to the table
    htmlBuilder.Append("<table border='1' cellpadding='5' cellspacing='0'>");

    // Add table header
    htmlBuilder.Append("<tr>");
    htmlBuilder.Append("<th>InvoiceNumber</th>");
    htmlBuilder.Append("<th>ReferenceNumber</th>");
    htmlBuilder.Append("<th>CustomerName</th>");
    htmlBuilder.Append("<th>CompanyName</th>");
    htmlBuilder.Append("<th>Date</th>");
    htmlBuilder.Append("<th>DueDate</th>");
    htmlBuilder.Append("<th>CurrencyCode</th>");
    htmlBuilder.Append("<th>Total</th>");
    htmlBuilder.Append("</tr>");

    // Add table rows
    foreach (DataRow row in dataTable.Rows)
    {
        htmlBuilder.Append("<tr>");
        htmlBuilder.Append($"<td>{row["InvoiceNumber"]}</td>");
        htmlBuilder.Append($"<td>{row["ReferenceNumber"]}</td>");
        htmlBuilder.Append($"<td>{row["CustomerName"]}</td>");
        htmlBuilder.Append($"<td>{row["CompanyName"]}</td>");
        htmlBuilder.Append($"<td>{row["Date"]}</td>");
        htmlBuilder.Append($"<td>{row["DueDate"]}</td>");
        htmlBuilder.Append($"<td>{row["CurrencyCode"]}</td>");
        htmlBuilder.Append($"<td>{row["Total"]}</td>");
        htmlBuilder.Append("</tr>");
    }

    htmlBuilder.Append("</table>");

    return htmlBuilder.ToString();
}
```

Then modify the Index method so it looks like this:

```csharp
public IActionResult Index()
{
    AuthorizationClass authorizationClass = new AuthorizationClass();
    string code = authorizationClass.GetAuthorizationCodeAsync().Result;
    Console.WriteLine($"Authorization code: {code}");
    ConnectionClass connectionClass = new ConnectionClass();
    DataTable dataTable = connectionClass.GetInvoices(code);
    string htmlTable = GenerateHtmlTable(dataTable);
    ViewBag.HtmlTable = htmlTable;
    return View();
}
```

Now, locate the Index.cshtml view and replace its contents with this version:

```cshtml
@{
    ViewData["Title"] = "Home Page";
}

<div class="text-center">
    <h2>ZohoBooks Invoices</h2>
    <div>
        @Html.Raw(ViewBag.HtmlTable)
    </div>
</div>
```
Now you can run the application and check the result.

Conclusion

In this article, we have explored the seamless integration of Zoho Books data into a .NET application. We successfully retrieved an authorization code from the Zoho Books API using dotConnect for Zoho Books. With this auth code, we obtained both the access token and the refresh token necessary for connecting to Zoho Books. We also demonstrated how effortlessly data can be retrieved from the Invoices table using the ZohoBooksDataAdapter from dotConnect for Zoho Books. Try it yourself by [downloading a free trial](https://www.devart.com/dotconnect/zohobooks/download.html) and experience the seamless integration and efficient handling of Zoho Books data in your .NET applications.
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Find and Delete Incomplete Open Transactions in SQL Server – Part 3 By [dbForge Team](https://blog.devart.com/author/dbforge), May 4, 2020

In two previous articles, we have reviewed a solution for deleting incomplete open
transactions in SQL Server. By an incomplete transaction, we basically mean an active (running) transaction that has had no active (running) queries for some long period of time T.

The general algorithm for deleting incomplete transactions:

1. Create two tables: one to store and analyze information about current incomplete transactions, and a second one to archive the transactions selected from the first table according to the delete actions, for subsequent analysis.
2. Gather information about transactions and their sessions that have no queries, i.e., transactions launched and left forgotten within a certain timespan T.
3. Update the table containing current incomplete transactions from step 1 (if an incomplete transaction acquires an active request, it is no longer considered incomplete and is deleted from the table).
4. Identify the sessions to be killed (a session has at least one incomplete transaction listed in the table from step 1, and no queries are running for that session).
5. Archive the information you are going to delete (details about the sessions, connections, and transactions that are to be killed).
6. Kill the selected sessions.
7. Delete the processed entries, along with those that cannot be deleted and have been in the table from step 1 for too long.

We also presented the process of [creating a CRUD stored procedure](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html). Let's now review a number of settings that help you work more efficiently:

- Tabs Coloring functionality
- Import and Export
- Reset Settings
- Restoring Documents Sessions

Marking Execution Environment Using Colors

Let's create two windows: one for the testing stand and one for the production environment. Now let's paint every window in the proper colors (green for the testing environment, red for the production environment).
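Step 2 of the algorithm above can be sketched with the standard dynamic management views (a minimal illustration, not the article's full stored procedure; the @T idle threshold is an assumed parameter):

```sql
-- Sessions that own an active transaction but currently run no request,
-- and whose transaction started more than @T minutes ago.
DECLARE @T INT = 60; -- illustrative idle threshold, in minutes

SELECT s.session_id,
       st.transaction_id,
       at.transaction_begin_time
FROM sys.dm_exec_sessions s
JOIN sys.dm_tran_session_transactions st ON st.session_id = s.session_id
JOIN sys.dm_tran_active_transactions at ON at.transaction_id = st.transaction_id
LEFT JOIN sys.dm_exec_requests r ON r.session_id = s.session_id
WHERE r.session_id IS NULL
  AND at.transaction_begin_time < DATEADD(MINUTE, -@T, GETDATE());
```

The result of a query like this is what gets written into the tracking table from step 1 on each run.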
To do this, right-click the proper window, select Tabs Color in the drop-down menu, and pick a color.

Pic. 1. Changing the document window colors

Now we get tabs colored according to the colors we selected:

Pic. 2. The result of changing the colors of the script windows

This functionality can also be configured in SQL Complete\Options\Tabs Color:

Pic. 3. Tabs and instances color settings

Here you can change, add, or delete colors for specific hosts. The [Tabs Coloring](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html#tabs_coloring) functionality not only allows us to distinguish environment types (testing, production, etc.) but does the same job for important MS SQL instances.

Importing, Exporting, and Resetting to Default Settings in SQL Complete

Among other settings, SQL Complete lets you import and export its configuration:

Pic. 4. SQL Complete settings import and export

After you click "Import and Export Settings…", the following window pops up:

Pic. 5. SQL Complete settings export

This window allows you to select import, export, or reset to default settings. In our case, we go for settings export and press the "Next" button. Then we check the proper sections and press the "Next" button:

Pic. 6. Selecting the exported data

Then we select the destination folder, enter the file name, and press the "Execute" button:

Pic. 7. Setting up the destination file for the exported settings and launching the export process

Once the export is done, we receive a message saying that it is complete and press the "Finish" button.

Pic. 8. Finishing the settings export process

The import process goes the same way. Let's now return to our new stored procedure in the two windows created above. You can read more about importing, exporting, and resetting your SQL Complete settings to defaults [here](https://docs.devart.com/sqlcomplete/setting-up-sql-complete/importing-and-exporting-settings.html).
Restoring Documents

To restore your sessions based on windows with scripts, use the SQL Complete\Document Sessions command:

Pic. 9. Selecting the "Document Sessions" command in the SQL Complete main menu

Once you select it, the window with restorable sessions pops up:

Pic. 10. Selecting the sessions to restore

Right-click the necessary document to restore it:

Pic. 11. Restoring the selected document

You can find more information about the Document Sessions functionality [here](https://docs.devart.com/sqlcomplete/tab-and-document-management/restoring-documents.html). If we accidentally close a window with a script, this won't be a problem. Using the SQL Complete\Restore Last Closed Document command, you can re-open that window and not lose any of your important scripts:

Pic. 12. Selecting the "Restore Last Closed Document" command in the SQL Complete main menu

Using the SQL Complete\Recently Closed Documents command, we can do the same trick for any recently closed file:

Pic. 13. Selecting the "Recently Closed Documents" command in the SQL Complete main menu

In this article, we reviewed a number of [productivity features](https://www.devart.com/dbforge/sql/sqlcomplete/productivity-extension.html) that help you work more efficiently while implementing algorithms for deleting incomplete transactions in SQL Server with the help of SQL Complete.
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Managing Open Transactions in SQL Server By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), April 26, 2024

Open transactions in SQL Server occur frequently and demand close attention and proper handling to prevent issues. Locking data and blocking access to it, causing transaction log growth, affecting data consistency, and increasing resource consumption are just a few potential problems stemming from open transactions. Monitoring open transactions and ensuring their timely completion (either committing or rolling back) is a best practice, and SQL Server offers various tools and commands to identify such open transactions. This article will explore these tools and methods and provide a guide on automatically finding and resolving open transactions.

Contents

- Understanding open transactions
- Prerequisites and preparations
- Finding SQL Server incomplete transactions
- Addressing lost transactions
- Automating cleanup and maintenance tasks
- Best practices for managing open transactions
- Conclusion

Understanding open transactions

In SQL Server, transactions are either committed upon success or rolled back if they encounter errors or are canceled. However, there are instances where transactions are left idle, which should be avoided. For example, when script execution is interrupted without issuing a COMMIT or ROLLBACK statement, the transaction is left in an "open" idle state. These open transactions consume resources and can lead to issues like running out of disk space and overall system damage. Detecting and resolving such open transactions is crucial to prevent these problems.
Prerequisites and preparations

To illustrate various methods to identify and delete open transactions, as well as a way to automate this process, we'll use SQL Server Management Studio (SSMS), the standard tool for managing SQL Server databases. We'll be utilizing an enhanced version of SSMS that includes the [dbForge SQL Complete add-in](https://www.devart.com/dbforge/sql/sqlcomplete/). This powerful coding assistant offers a host of additional features, including notifications about open transactions. SQL Complete integrates smoothly with SSMS, enriching the familiar interface with numerous extra functionalities for all SQL Server professionals. Also, we will use the test AdventureWorks2022 SQL Server database to illustrate the methods of finding idle transactions. To demonstrate the automation of the process, we have prepared the DemoDatabase test database. However, you can use the scripts presented in this article on your own databases.

Finding SQL Server incomplete transactions

SQL Server provides a built-in method of determining whether there are any open transactions: the DBCC OPENTRAN command.

```sql
DBCC OPENTRAN;
```

If any open transactions have been detected, SSMS will return information about them as the "oldest active transaction," as you can see in the screenshot below.

Note: The DBCC OPENTRAN command shows open transactions for the database against which it is executed.

Another method is querying the sys.sysprocesses system view:

```sql
SELECT * FROM sys.sysprocesses WHERE open_tran = 1
```

This command's output is presented in a standard query window and provides detailed information about the transactions. It is also possible to present the information retrieved by the DBCC OPENTRAN command as a temporary table. This format is preferred by many specialists.
```sql
-- Create a temporary table to accept the results.
CREATE TABLE #OpenTranStatus (
    ActiveTransaction VARCHAR(25),
    Details sql_variant
);

-- Execute the command, putting the results in the table.
INSERT INTO #OpenTranStatus
    EXEC ('DBCC OPENTRAN WITH TABLERESULTS, NO_INFOMSGS');

-- Display the results.
SELECT * FROM #OpenTranStatus;
```

Finally, if you use SQL Complete, the add-in will notify you about any open transactions visually. The option to warn the user about open transactions is enabled by default in SQL Complete. If not, you can activate it manually in the Notifications section of the SQL Complete Options menu.

Once you have identified the open transaction that you need to delete, you can use the KILL command with the session ID to get rid of it:

```sql
KILL 58;
```

Note: Executing the KILL command requires admin privileges.

Addressing lost transactions

We have examined different ways of finding open transactions in SQL Server. However, when you are a database administrator dealing with multiple databases and many hundreds or thousands of transactions, checking them one by one is impractical. Instead, we can collect information about SQL Server active transactions and their sessions that have no requests (transactions that were launched and left forgotten) and present that information in a table format.
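When more than one idle session must be terminated, the KILL statements can be generated rather than typed by hand (a minimal sketch based on the sys.sysprocesses query shown earlier; always review the generated list before executing it):

```sql
-- Build one KILL command per session that holds an open transaction.
-- Inspect the generated script manually before running it.
SELECT 'KILL ' + CAST(spid AS VARCHAR(10)) + ';' AS KillCommand
FROM sys.sysprocesses
WHERE open_tran >= 1
  AND spid <> @@SPID; -- never kill the current session
```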
We'll use the code below to create a table in DemoDatabase that stores information about current lost transactions. Note that the scripts in this section create objects in the srv schema; if it does not exist yet, create it first with CREATE SCHEMA srv;.

```sql
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [srv].[SessionTran] (
    [SessionID] INT NOT NULL
   ,[TransactionID] BIGINT NOT NULL
   ,[CountTranNotRequest] TINYINT NOT NULL
   ,[CountSessionNotRequest] TINYINT NOT NULL
   ,[TransactionBeginTime] DATETIME NOT NULL
   ,[InsertUTCDate] DATETIME NOT NULL
   ,[UpdateUTCDate] DATETIME NOT NULL
   ,CONSTRAINT [PK_SessionTran] PRIMARY KEY CLUSTERED
    (
        [SessionID] ASC,
        [TransactionID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) ON [PRIMARY]
) ON [PRIMARY]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_Count] DEFAULT ((0)) FOR [CountTranNotRequest]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_CountSessionNotRequest] DEFAULT ((0)) FOR [CountSessionNotRequest]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_InsertUTCDate] DEFAULT (GETUTCDATE()) FOR [InsertUTCDate]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_UpdateUTCDate] DEFAULT (GETUTCDATE()) FOR [UpdateUTCDate]
GO
```

In this script:

- SessionID identifies the session
- TransactionID identifies the lost transaction
- CountTranNotRequest is the number of times the transaction was recorded as lost
- CountSessionNotRequest is the number of times the session was recorded as having no active requests while containing a lost transaction
- TransactionBeginTime is the start date and time of the lost transaction
- InsertUTCDate is the date and time (UTC) when the record was created
- UpdateUTCDate is the date and time (UTC) when the record was last updated

Similarly, we create a table in the same DemoDatabase to archive the open transactions that are selected from the first table for deletion.
```sql
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [srv].[KillSession](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [session_id] [smallint] NOT NULL,
    [transaction_id] [bigint] NOT NULL,
    [login_time] [datetime] NOT NULL,
    [host_name] [nvarchar](128) NULL,
    [program_name] [nvarchar](128) NULL,
    [host_process_id] [int] NULL,
    [client_version] [int] NULL,
    [client_interface_name] [nvarchar](32) NULL,
    [security_id] [varbinary](85) NOT NULL,
    [login_name] [nvarchar](128) NOT NULL,
    [nt_domain] [nvarchar](128) NULL,
    [nt_user_name] [nvarchar](128) NULL,
    [status] [nvarchar](30) NOT NULL,
    [context_info] [varbinary](128) NULL,
    [cpu_time] [int] NOT NULL,
    [memory_usage] [int] NOT NULL,
    [total_scheduled_time] [int] NOT NULL,
    [total_elapsed_time] [int] NOT NULL,
    [endpoint_id] [int] NOT NULL,
    [last_request_start_time] [datetime] NOT NULL,
    [last_request_end_time] [datetime] NULL,
    [reads] [bigint] NOT NULL,
    [writes] [bigint] NOT NULL,
    [logical_reads] [bigint] NOT NULL,
    [is_user_process] [bit] NOT NULL,
    [text_size] [int] NOT NULL,
    [language] [nvarchar](128) NULL,
    [date_format] [nvarchar](3) NULL,
    [date_first] [smallint] NOT NULL,
    [quoted_identifier] [bit] NOT NULL,
    [arithabort] [bit] NOT NULL,
    [ansi_null_dflt_on] [bit] NOT NULL,
    [ansi_defaults] [bit] NOT NULL,
    [ansi_warnings] [bit] NOT NULL,
    [ansi_padding] [bit] NOT NULL,
    [ansi_nulls] [bit] NOT NULL,
    [concat_null_yields_null] [bit] NOT NULL,
    [transaction_isolation_level] [smallint] NOT NULL,
    [lock_timeout] [int] NOT NULL,
    [deadlock_priority] [int] NOT NULL,
    [row_count] [bigint] NOT NULL,
    [prev_error] [int] NOT NULL,
    [original_security_id] [varbinary](85) NOT NULL,
    [original_login_name] [nvarchar](128) NOT NULL,
    [last_successful_logon] [datetime] NULL,
    [last_unsuccessful_logon] [datetime] NULL,
    [unsuccessful_logons] [bigint] NULL,
    [group_id] [int] NOT NULL,
    [database_id] [smallint] NOT NULL,
    [authenticating_database_id] [int] NULL,
    [open_transaction_count] [int] NOT NULL,
    [most_recent_session_id] [int] NULL,
    [connect_time] [datetime] NULL,
    [net_transport] [nvarchar](40) NULL,
    [protocol_type] [nvarchar](40) NULL,
    [protocol_version] [int] NULL,
    [encrypt_option] [nvarchar](40) NULL,
    [auth_scheme] [nvarchar](40) NULL,
    [node_affinity] [smallint] NULL,
    [num_reads] [int] NULL,
    [num_writes] [int] NULL,
    [last_read] [datetime] NULL,
    [last_write] [datetime] NULL,
    [net_packet_size] [int] NULL,
    [client_net_address] [nvarchar](48) NULL,
    [client_tcp_port] [int] NULL,
    [local_net_address] [nvarchar](48) NULL,
    [local_tcp_port] [int] NULL,
    [connection_id] [uniqueidentifier] NULL,
    [parent_connection_id] [uniqueidentifier] NULL,
    [most_recent_sql_handle] [varbinary](64) NULL,
    [LastTSQL] [nvarchar](max) NULL,
    [transaction_begin_time] [datetime] NOT NULL,
    [CountTranNotRequest] [tinyint] NOT NULL,
    [CountSessionNotRequest] [tinyint] NOT NULL,
    [InsertUTCDate] [datetime] NOT NULL,
    CONSTRAINT [PK_KillSession] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [srv].[KillSession] ADD CONSTRAINT [DF_KillSession_InsertUTCDate] DEFAULT (GETUTCDATE()) FOR [InsertUTCDate]
GO
```

This table will list all "killed" transactions; the InsertUTCDate column records the exact time when each transaction was terminated and logged.

Automating cleanup and maintenance tasks

Searching for open transactions and killing them manually takes too much time and effort, so this tedious job can, and should, be automated. Having analyzed the requests, their statuses, and sessions, open a new query window and run the following script, which finds and removes open transactions that were left incomplete.
The full code of the srv.AutoKillSessionTranBegin stored procedure is shown below. Two details differ from a plain copy-paste: CREATE OR ALTER (available since SQL Server 2016 SP1) is used so the script works whether or not the procedure already exists, and the KILL command is built as dynamic SQL, because KILL accepts only a constant session ID, not a variable.

```sql
SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO
CREATE OR ALTER PROCEDURE [srv].[AutoKillSessionTranBegin]
    @minuteOld2 INT = 30,          -- minimum age of a running transaction, in minutes
    @countIsNotRequests2 INT = 5   -- number of hits in the table
AS
BEGIN
    /*
    Detects frozen transactions (forgotten ones that have no active
    requests) and subsequently removes them.
    */

    SET NOCOUNT ON;
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

    DECLARE @tbl TABLE (
        SessionID INT
       ,TransactionID BIGINT
       ,IsSessionNotRequest BIT
       ,TransactionBeginTime DATETIME
    );

    -- Collect transactions and their sessions that have no requests,
    -- i.e., transactions that were started and forgotten.
    INSERT INTO @tbl (SessionID, TransactionID, IsSessionNotRequest, TransactionBeginTime)
    SELECT
        t.[session_id] AS SessionID
       ,t.[transaction_id] AS TransactionID
       ,CASE
            WHEN EXISTS (SELECT TOP (1) 1
                         FROM sys.dm_exec_requests AS r
                         WHERE r.[session_id] = t.[session_id]) THEN 0
            ELSE 1
        END AS IsSessionNotRequest
       ,(SELECT TOP (1) dtat.[transaction_begin_time]
         FROM sys.dm_tran_active_transactions AS dtat
         WHERE dtat.[transaction_id] = t.[transaction_id]) AS TransactionBeginTime
    FROM sys.dm_tran_session_transactions AS t
    WHERE t.[is_user_transaction] = 1
    AND NOT EXISTS (SELECT TOP (1) 1
                    FROM sys.dm_exec_requests AS r
                    WHERE r.[transaction_id] = t.[transaction_id]);

    -- Update the table of running transactions that have no active queries.
    MERGE srv.SessionTran AS st
    USING @tbl AS t
        ON st.[SessionID] = t.[SessionID]
       AND st.[TransactionID] = t.[TransactionID]
    WHEN MATCHED
        THEN UPDATE
             SET [UpdateUTCDate] = GETUTCDATE()
                ,[CountTranNotRequest] = st.[CountTranNotRequest] + 1
                ,[CountSessionNotRequest] =
                     CASE
                         WHEN (t.[IsSessionNotRequest] = 1) THEN (st.[CountSessionNotRequest] + 1)
                         ELSE 0
                     END
                ,[TransactionBeginTime] = COALESCE(t.[TransactionBeginTime], st.[TransactionBeginTime])
    WHEN NOT MATCHED BY TARGET
        AND (t.[TransactionBeginTime] IS NOT NULL)
        THEN INSERT ([SessionID], [TransactionID], [TransactionBeginTime])
             VALUES (t.[SessionID], t.[TransactionID], t.[TransactionBeginTime])
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE;

    -- List of sessions to delete (those containing frozen transactions).
    DECLARE @kills TABLE (
        SessionID INT
    );

    -- Detailed information for the archive.
    DECLARE @kills_copy TABLE (
        SessionID INT
       ,TransactionID BIGINT
       ,CountTranNotRequest TINYINT
       ,CountSessionNotRequest TINYINT
       ,TransactionBeginTime DATETIME
    );

    -- Collect the sessions that need to be killed.
    INSERT INTO @kills_copy (SessionID, TransactionID, CountTranNotRequest, CountSessionNotRequest, TransactionBeginTime)
    SELECT
        SessionID
       ,TransactionID
       ,CountTranNotRequest
       ,CountSessionNotRequest
       ,TransactionBeginTime
    FROM srv.SessionTran
    WHERE [CountTranNotRequest] >= @countIsNotRequests2
    AND [CountSessionNotRequest] >= @countIsNotRequests2
    AND [TransactionBeginTime] <= DATEADD(MINUTE, -@minuteOld2, GETDATE());

    -- Archive what we are going to delete (detailed information about
    -- the deleted sessions, connections, and transactions).
    INSERT INTO [srv].[KillSession] ([session_id], [transaction_id], [login_time],
        [host_name], [program_name], [host_process_id], [client_version],
        [client_interface_name], [security_id], [login_name], [nt_domain],
        [nt_user_name], [status], [context_info], [cpu_time], [memory_usage],
        [total_scheduled_time], [total_elapsed_time], [endpoint_id],
        [last_request_start_time], [last_request_end_time], [reads], [writes],
        [logical_reads], [is_user_process], [text_size], [language], [date_format],
        [date_first], [quoted_identifier], [arithabort], [ansi_null_dflt_on],
        [ansi_defaults], [ansi_warnings], [ansi_padding], [ansi_nulls],
        [concat_null_yields_null], [transaction_isolation_level], [lock_timeout],
        [deadlock_priority], [row_count], [prev_error], [original_security_id],
        [original_login_name], [last_successful_logon], [last_unsuccessful_logon],
        [unsuccessful_logons], [group_id], [database_id], [authenticating_database_id],
        [open_transaction_count], [most_recent_session_id], [connect_time],
        [net_transport], [protocol_type], [protocol_version], [encrypt_option],
        [auth_scheme], [node_affinity], [num_reads], [num_writes], [last_read],
        [last_write], [net_packet_size], [client_net_address], [client_tcp_port],
        [local_net_address], [local_tcp_port], [connection_id], [parent_connection_id],
        [most_recent_sql_handle], [LastTSQL], [transaction_begin_time],
        [CountTranNotRequest], [CountSessionNotRequest])
    SELECT
        ES.[session_id]
       ,kc.[TransactionID]
       ,ES.[login_time]
       ,ES.[host_name]
       ,ES.[program_name]
       ,ES.[host_process_id]
       ,ES.[client_version]
       ,ES.[client_interface_name]
       ,ES.[security_id]
       ,ES.[login_name]
       ,ES.[nt_domain]
       ,ES.[nt_user_name]
       ,ES.[status]
       ,ES.[context_info]
       ,ES.[cpu_time]
       ,ES.[memory_usage]
       ,ES.[total_scheduled_time]
       ,ES.[total_elapsed_time]
       ,ES.[endpoint_id]
       ,ES.[last_request_start_time]
       ,ES.[last_request_end_time]
       ,ES.[reads]
       ,ES.[writes]
       ,ES.[logical_reads]
       ,ES.[is_user_process]
       ,ES.[text_size]
       ,ES.[language]
       ,ES.[date_format]
       ,ES.[date_first]
       ,ES.[quoted_identifier]
       ,ES.[arithabort]
       ,ES.[ansi_null_dflt_on]
       ,ES.[ansi_defaults]
       ,ES.[ansi_warnings]
       ,ES.[ansi_padding]
       ,ES.[ansi_nulls]
       ,ES.[concat_null_yields_null]
       ,ES.[transaction_isolation_level]
       ,ES.[lock_timeout]
       ,ES.[deadlock_priority]
       ,ES.[row_count]
       ,ES.[prev_error]
       ,ES.[original_security_id]
       ,ES.[original_login_name]
       ,ES.[last_successful_logon]
       ,ES.[last_unsuccessful_logon]
       ,ES.[unsuccessful_logons]
       ,ES.[group_id]
       ,ES.[database_id]
       ,ES.[authenticating_database_id]
       ,ES.[open_transaction_count]
       ,EC.[most_recent_session_id]
       ,EC.[connect_time]
       ,EC.[net_transport]
       ,EC.[protocol_type]
       ,EC.[protocol_version]
       ,EC.[encrypt_option]
       ,EC.[auth_scheme]
       ,EC.[node_affinity]
       ,EC.[num_reads]
       ,EC.[num_writes]
       ,EC.[last_read]
       ,EC.[last_write]
       ,EC.[net_packet_size]
       ,EC.[client_net_address]
       ,EC.[client_tcp_port]
       ,EC.[local_net_address]
       ,EC.[local_tcp_port]
       ,EC.[connection_id]
       ,EC.[parent_connection_id]
       ,EC.[most_recent_sql_handle]
       ,(SELECT TOP (1) text
         FROM sys.dm_exec_sql_text(EC.[most_recent_sql_handle])) AS [LastTSQL]
       ,kc.[TransactionBeginTime]
       ,kc.[CountTranNotRequest]
       ,kc.[CountSessionNotRequest]
    FROM @kills_copy AS kc
    INNER JOIN sys.dm_exec_sessions ES WITH (READUNCOMMITTED)
        ON kc.[SessionID] = ES.[session_id]
    INNER JOIN sys.dm_exec_connections EC WITH (READUNCOMMITTED)
        ON EC.session_id = ES.session_id;

    -- Collect the session IDs.
    INSERT INTO @kills (SessionID)
    SELECT [SessionID]
    FROM @kills_copy
    GROUP BY [SessionID];

    DECLARE @SessionID INT;
    DECLARE @cmd NVARCHAR(20);

    -- Kill the selected sessions one by one.
    WHILE (EXISTS (SELECT TOP (1) 1 FROM @kills))
    BEGIN
        SELECT TOP (1) @SessionID = [SessionID]
        FROM @kills;

        BEGIN TRY
            -- KILL accepts only a constant session ID, so build the command dynamically.
            SET @cmd = N'KILL ' + CAST(@SessionID AS NVARCHAR(10));
            EXEC (@cmd);
        END TRY
        BEGIN CATCH
            -- Ignore sessions that cannot be killed; they are cleaned up below.
        END CATCH;

        DELETE FROM @kills
        WHERE [SessionID] = @SessionID;
    END;

    SELECT
        st.[SessionID]
       ,st.[TransactionID]
    INTO #tbl
    FROM srv.SessionTran AS st
    WHERE st.[CountTranNotRequest] >= 250
    OR st.[CountSessionNotRequest] >= 250
    OR EXISTS (SELECT TOP (1) 1
               FROM @kills_copy kc
               WHERE kc.[SessionID] = st.[SessionID]);

    -- Delete the processed records, as well as records that could not be
    -- deleted and have stayed in the table for too long.
    DELETE FROM st
    FROM #tbl AS t
    INNER JOIN srv.SessionTran AS st
        ON t.[SessionID] = st.[SessionID]
       AND t.[TransactionID] = st.[TransactionID];

    DROP TABLE #tbl;
END;
```

Let us execute this script to check that it is correct. The query completes successfully, and the script selects the incomplete transactions. SQL Complete includes a robust script generator that we can use to create the EXECUTE code block in a new window and run our stored procedure; this run will emulate the open-transaction scenario.

The stored procedure executes successfully, and we can see the result of its work: an open transaction is added to the srv.SessionTran table that we created at the previous stage. The srv.KillSession table serves as the archive. When we execute the stored procedure again, it kills the open transaction automatically. We can view the record of that killed transaction by executing:

```sql
SELECT * FROM DemoDatabase.srv.KillSession;
```

Once the procedure works as expected, it can be scheduled, for example as a SQL Server Agent job that runs every few minutes, so the cleanup happens without manual intervention.

Best practices for managing open transactions

Database administrators should always be mindful of open transactions, as they can occur for various reasons. Let's explore the primary causes of idle transactions in SQL Server and the ways to prevent them.

Transactional Errors

Open transactions are often caused by transactional errors, uncommitted data reads, or long-held locks. To prevent these issues, identify long-running queries, check for blocking resources, use appropriate isolation levels, and analyze error logs.

Query Design and Indexing Efficiency

Poorly optimized queries, missing indexes, or missing COMMIT/ROLLBACK statements can result in open transactions. To mitigate this problem, focus on optimizing queries, implementing proper indexing, and ensuring transaction control statements are included.
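The transaction-control advice boils down to a defensive template that makes it hard to leave a transaction open by accident. A minimal sketch; the table, column, and key value are hypothetical placeholders:

```sql
-- SET XACT_ABORT ON makes most run-time errors roll the transaction back
-- automatically, and the CATCH block guarantees nothing is left open
-- if an error slips through.
SET XACT_ABORT ON;

BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.SomeTable        -- hypothetical table used for illustration
    SET SomeColumn = 1
    WHERE ID = 42;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;  -- re-raise the original error to the caller
END CATCH;
```

The @@TRANCOUNT check matters because XACT_ABORT may already have rolled the transaction back before the CATCH block runs.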
Insufficient Hardware Resources

An overloaded CPU, insufficient memory, slow disk I/O, or network bottlenecks can hinder query performance and contribute to open transactions. Recommendations include canceling unnecessary processes, rescheduling resource-intensive tasks, and considering hardware upgrades where necessary.

Deadlocks

Deadlocks occur when processes wait for resources held by each other, potentially leaving transactions open until the deadlock is resolved. To minimize open transactions caused by deadlocks, set transaction timeouts, script deadlock-identification procedures, and assign transaction priorities.

Transaction Exceptions

Exceptions within transaction blocks, if not handled properly, can prevent a transaction from completing. To address this issue, examine the exceptions, enhance error-handling mechanisms, and conduct thorough code reviews.

Conclusion

Open transactions can lead to significant issues in SQL Server operations, which is why it is crucial to identify and address them promptly. In this article, we outlined straightforward methods for detecting open transactions using SSMS and the dbForge SQL Complete add-in, and provided guidance on automating the processes of identifying and resolving them. SQL Complete offers robust coding assistance, code formatting, scripting, and notification features, making it a valuable tool in this context. You can try the [fully functional SQL Complete trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) free of charge for 14 days, integrate it into SSMS, and leverage its enhanced functionality in your actual workload.
Tags: [delete incomplete transactions](https://blog.devart.com/tag/delete-incomplete-transactions), [sql complete](https://blog.devart.com/tag/sql-complete), [SQL Server](https://blog.devart.com/tag/sql-server), [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial)

Published April 26, 2024. About the author: [Julia Lutsenko](https://blog.devart.com/author/jane-williams) is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
{"url": "https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html?_ga=2.195959468.1454847682.1586957622-1125767723.1586957622", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Managing Open Transactions in SQL Server By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) April 26, 2024 [0](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html#respond) 5817 Open transactions in SQL Server occur frequently and demand close attention and proper handling to prevent issues. Locking data and blocking access to it, causing transaction log growth, affecting data consistency, and increasing resource consumption are just a few potential problems stemming from open transactions. Monitoring open transactions and ensuring their timely completion (either committing or rolling back) is a best practice, and SQL Server offers various tools and commands to identify such open transactions. This article will explore these tools and methods and provide a guide on automatically finding and resolving open transactions. Contents Understanding open transactions Prerequisites and preparations Finding SQL Server incomplete transactions Addressing lost transactions Automating cleanup and maintenance tasks Best practices for managing open transactions Conclusion Understanding open transactions In SQL Server, transactions are either committed upon success or rolled back if they encounter errors or are canceled. However, there are instances where transactions are left idle, which should be avoided. For example, when script execution is interrupted without issuing COMMIT or ROLLBACK statements, it leaves the transaction in an “open” idle state. These open transactions consume resources and can lead to issues like running out of disk space and overall system damage. 
Detecting and resolving such open transactions is crucial to prevent these problems. Prerequisites and preparations To illustrate various methods to identify and delete open transactions as well as the method of automation of this process, we’ll use SQL Server Management Studio (SSMS), the standard tool for managing SQL Server databases. We’ll be utilizing an enhanced version of SSMS that includes the [dbForge SQL Complete add-in](https://www.devart.com/dbforge/sql/sqlcomplete/) . This powerful coding assistant offers a host of additional features, including notifications about open transactions. SQL Complete integrates smoothly with SSMS, enriching the familiar interface with numerous extra functionalities for all SQL Server professionals. Also, we will use the test A dventureWorks2022 SQL Server database to illustrate the methods of finding idle transactions. To demonstrate the automation of the process, we have prepared the DemoDatabase test database. However, you can use the scripts presented in this article on your databases. Finding SQL Server incomplete transactions SQL Server provides an in-built method of defining if there are any open transactions, the DBCC OPENTRAN command. DBCC OPENTRAN; If any open transactions have been detected, SSMS will return the information about them as “oldest active transaction,” which you can see in the screenshot below. Note: The DBCC OPENTRAN command shows open transactions for the database for which it is executed. Another method is querying the sys.sysprocesses Dynamic Management View (DMV). SELECT * FROM sys.sysprocesses WHERE open_tran = 1 This command output is presented in a standard query window and provides detailed information about the transaction. Besides, there is the possibility to present the information retrieved by the DBCC OPENTRAN command as a temporary table. This format is preferred by many specialists. 
-- Create a temporary table to accept the results.\nCREATE TABLE #OpenTranStatus (\n ActiveTransaction VARCHAR(25),\n Details sql_variant\n );\n-- Execute the command, putting the results in the table.\nINSERT INTO #OpenTranStatus\n EXEC ('DBCC OPENTRAN WITH TABLERESULTS, NO_INFOMSGS');\n \n-- Display the results.\nSELECT * FROM #OpenTranStatus; Finally, if you use SQL Complete , this add-in will notify the user about any open transactions visually: The option to warn the user about any open transactions is enabled by default in SQL Complete. If not, you can activate it manually in the Notifications section of the SQL Complete Options menu: Once you have defined the open transaction that you need to delete, you can use the KILL command and the session ID to get rid of it: KILL 58; Note: Executing the KILL command requires admin privileges. Addressing lost transactions We have examined different ways of finding the open transactions in SQL Server. However, when you are the database administrator who deals with multiple databases and many hundreds and thousands of transactions. We can collect information about SQL Server active transactions and their sessions that have no requests (transactions that were launched and left forgotten). Further, that information can be presented in a table format. 
We’ll use the below code to create the table to store information about current lost transactions in DemoDatabase : SET ANSI_NULLS ON\nGO\n\nSET QUOTED_IDENTIFIER ON\nGO\n\nCREATE TABLE [srv].[SessionTran] (\n\t[SessionID] INT NOT NULL\n ,[TransactionID] BIGINT NOT NULL\n ,[CountTranNotRequest] TINYINT NOT NULL\n ,[CountSessionNotRequest] TINYINT NOT NULL\n ,[TransactionBeginTime] DATETIME NOT NULL\n ,[InsertUTCDate] DATETIME NOT NULL\n ,[UpdateUTCDate] DATETIME NOT NULL\n ,CONSTRAINT [PK_SessionTran] PRIMARY KEY CLUSTERED\n\t(\n\t[SessionID] ASC,\n\t[TransactionID] ASC\n\t) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) ON [PRIMARY]\n) ON [PRIMARY]\nGO\n\nALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_Count] DEFAULT ((0)) FOR [CountTranNotRequest]\nGO\n\nALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_CountSessionNotRequest] DEFAULT ((0)) FOR [CountSessionNotRequest]\nGO\n\nALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_InsertUTCDate] DEFAULT (GETUTCDATE()) FOR [InsertUTCDate]\nGO\n\nALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_UpdateUTCDate] DEFAULT (GETUTCDATE()) FOR [UpdateUTCDate]\nGO In this script: SessionID identifies the session TransactionID identifies the lost transaction CountTranNotRequest stands for the number of times the transaction was recorded as lost CountSessionNotRequest stands for the number of times the session was recorded as one that has no active queries and contains a lost transaction TransactionBeginTime refers to the start date and time of the lost transaction InsertUTCDate identifies the date and time (UTC) when the record was made UpdateUTCDate identifies the date and time (UTC) when the record was updated. Similarly, we create a table to archive open transactions selected from the first table according to the delete actions in the same DemoDatabase . 
SET ANSI_NULLS ON\nGO\n\nSET QUOTED_IDENTIFIER ON\nGO\n\nCREATE TABLE [srv].[KillSession](\n[ID] [int] IDENTITY(1,1) NOT NULL,\n[session_id] [smallint] NOT NULL,\n[transaction_id] [bigint] NOT NULL,\n[login_time] [datetime] NOT NULL,\n[host_name] [nvarchar](128) NULL,\n[program_name] [nvarchar](128) NULL,\n[host_process_id] [int] NULL,\n[client_version] [int] NULL,\n[client_interface_name] [nvarchar](32) NULL,\n[security_id] [varbinary](85) NOT NULL,\n[login_name] [nvarchar](128) NOT NULL,\n[nt_domain] [nvarchar](128) NULL,\n[nt_user_name] [nvarchar](128) NULL,\n[status] [nvarchar](30) NOT NULL,\n[context_info] [varbinary](128) NULL,\n[cpu_time] [int] NOT NULL,\n[memory_usage] [int] NOT NULL,\n[total_scheduled_time] [int] NOT NULL,\n[total_elapsed_time] [int] NOT NULL,\n[endpoint_id] [int] NOT NULL,\n[last_request_start_time] [datetime] NOT NULL,\n[last_request_end_time] [datetime] NULL,\n[reads] [bigint] NOT NULL,\n[writes] [bigint] NOT NULL,\n[logical_reads] [bigint] NOT NULL,\n[is_user_process] [bit] NOT NULL,\n[text_size] [int] NOT NULL,\n[language] [nvarchar](128) NULL,\n[date_format] [nvarchar](3) NULL,\n[date_first] [smallint] NOT NULL,\n[quoted_identifier] [bit] NOT NULL,\n[arithabort] [bit] NOT NULL,\n[ansi_null_dflt_on] [bit] NOT NULL,\n[ansi_defaults] [bit] NOT NULL,\n[ansi_warnings] [bit] NOT NULL,\n[ansi_padding] [bit] NOT NULL,\n[ansi_nulls] [bit] NOT NULL,\n[concat_null_yields_null] [bit] NOT NULL,\n[transaction_isolation_level] [smallint] NOT NULL,\n[lock_timeout] [int] NOT NULL,\n[deadlock_priority] [int] NOT NULL,\n[row_count] [bigint] NOT NULL,\n[prev_error] [int] NOT NULL,\n[original_security_id] [varbinary](85) NOT NULL,\n[original_login_name] [nvarchar](128) NOT NULL,\n[last_successful_logon] [datetime] NULL,\n[last_unsuccessful_logon] [datetime] NULL,\n[unsuccessful_logons] [bigint] NULL,\n[group_id] [int] NOT NULL,\n[database_id] [smallint] NOT NULL,\n[authenticating_database_id] [int] NULL,\n[open_transaction_count] [int] NOT 
NULL,\n[most_recent_session_id] [int] NULL,\n[connect_time] [datetime] NULL,\n[net_transport] [nvarchar](40) NULL,\n[protocol_type] [nvarchar](40) NULL,\n[protocol_version] [int] NULL,\n[encrypt_option] [nvarchar](40) NULL,\n[auth_scheme] [nvarchar](40) NULL,\n[node_affinity] [smallint] NULL,\n[num_reads] [int] NULL,\n[num_writes] [int] NULL,\n[last_read] [datetime] NULL,\n[last_write] [datetime] NULL,\n[net_packet_size] [int] NULL,\n[client_net_address] [nvarchar](48) NULL,\n[client_tcp_port] [int] NULL,\n[local_net_address] [nvarchar](48) NULL,\n[local_tcp_port] [int] NULL,\n[connection_id] [uniqueidentifier] NULL,\n[parent_connection_id] [uniqueidentifier] NULL,\n[most_recent_sql_handle] [varbinary](64) NULL,\n[LastTSQL] [nvarchar](max) NULL,\n[transaction_begin_time] [datetime] NOT NULL,\n[CountTranNotRequest] [tinyint] NOT NULL,\n[CountSessionNotRequest] [tinyint] NOT NULL,\n[InsertUTCDate] [datetime] NOT NULL,\n CONSTRAINT [PK_KillSession] PRIMARY KEY CLUSTERED \n(\n[ID] ASC\n)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]\n) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]\nGO\n\nALTER TABLE [srv].[KillSession] ADD CONSTRAINT [DF_KillSession_InsertUTCDate] DEFAULT (getutcdate()) FOR [InsertUTCDate]\nGO This table will list all “killed” transactions. Specifically, the InsertUTCDate field will provide the exact time when each transaction was terminated and logged in the table. Automating cleanup and maintenance tasks Searching for open transactions and killing them manually would require too much time and resources. This tedious job can be automated, and it should be automated. Having analyzed the requests, their statuses, and sessions, open a new window and type the following scripts to find open transactions in SQL Server that are left incomplete. 
```sql
declare @tbl table (
    SessionID int,
    TransactionID bigint,
    IsSessionNotRequest bit,
    TransactionBeginTime datetime
);
```

Below is the full code of the srv.AutoKillSessionTranBegin stored procedure:

```sql
SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO
ALTER PROCEDURE [srv].[AutoKillSessionTranBegin] @minuteOld2 INT = 30, --old age of a running transaction
@countIsNotRequests2 INT = 5 --number of hits in the table
AS
BEGIN
    /*
    --definition of frozen transactions (forgotten ones that do not have active requests) with their subsequent removal
    */

    SET NOCOUNT ON;
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

    DECLARE @tbl TABLE (
        SessionID INT
        ,TransactionID BIGINT
        ,IsSessionNotRequest BIT
        ,TransactionBeginTime DATETIME
    );

    --collect information (transactions and their sessions that have no requests, i.e., transactions that are started and forgotten)
    INSERT INTO @tbl (SessionID,
                      TransactionID,
                      IsSessionNotRequest,
                      TransactionBeginTime)
    SELECT
        t.[session_id] AS SessionID
        ,t.[transaction_id] AS TransactionID
        ,CASE
            WHEN EXISTS (SELECT TOP (1)
                             1
                         FROM sys.dm_exec_requests AS r
                         WHERE r.[session_id] = t.[session_id]) THEN 0
            ELSE 1
         END AS IsSessionNotRequest
        ,(SELECT TOP (1)
              dtat.[transaction_begin_time]
          FROM sys.dm_tran_active_transactions AS dtat
          WHERE dtat.[transaction_id] = t.[transaction_id])
         AS TransactionBeginTime
    FROM sys.dm_tran_session_transactions AS t
    WHERE t.[is_user_transaction] = 1
    AND NOT EXISTS (SELECT TOP (1)
                        1
                    FROM sys.dm_exec_requests AS r
                    WHERE r.[transaction_id] = t.[transaction_id]);

    --update the table of running transactions that have no active queries
    ;
    MERGE srv.SessionTran AS st USING @tbl AS t
    ON st.[SessionID] = t.[SessionID]
    AND st.[TransactionID] = t.[TransactionID]
    WHEN MATCHED
        THEN UPDATE
             SET [UpdateUTCDate] = GETUTCDATE()
                ,[CountTranNotRequest] = st.[CountTranNotRequest] + 1
                ,[CountSessionNotRequest] =
                    CASE
                        WHEN (t.[IsSessionNotRequest] = 1) THEN (st.[CountSessionNotRequest] + 1)
                        ELSE 0
                    END
                ,[TransactionBeginTime] = COALESCE(t.[TransactionBeginTime], st.[TransactionBeginTime])
    WHEN NOT MATCHED BY TARGET
    AND (t.[TransactionBeginTime] IS NOT NULL)
        THEN INSERT ([SessionID]
                   , [TransactionID]
                   , [TransactionBeginTime])
             VALUES (t.[SessionID], t.[TransactionID], t.[TransactionBeginTime])
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE;

    --list of sessions to delete (containing frozen transactions)
    DECLARE @kills TABLE (
        SessionID INT
    );

    --detailed information for the archive
    DECLARE @kills_copy TABLE (
        SessionID INT
        ,TransactionID BIGINT
        ,CountTranNotRequest TINYINT
        ,CountSessionNotRequest TINYINT
        ,TransactionBeginTime DATETIME
    );

    --collect those sessions that need to be killed
    INSERT INTO @kills_copy (SessionID,
                             TransactionID,
                             CountTranNotRequest,
                             CountSessionNotRequest,
                             TransactionBeginTime)
    SELECT
        SessionID
        ,TransactionID
        ,CountTranNotRequest
        ,CountSessionNotRequest
        ,TransactionBeginTime
    FROM srv.SessionTran
    WHERE [CountTranNotRequest] >= @countIsNotRequests2
    AND [CountSessionNotRequest] >= @countIsNotRequests2
    AND [TransactionBeginTime] <= DATEADD(MINUTE, -@minuteOld2, GETDATE());

    --archive what we are going to delete (detailed information about deleted sessions, connections and transactions)
    INSERT INTO [srv].[KillSession] ([session_id]
                                   , [transaction_id]
                                   , [login_time]
                                   , [host_name]
                                   , [program_name]
                                   , [host_process_id]
                                   , [client_version]
                                   , [client_interface_name]
                                   , [security_id]
                                   , [login_name]
                                   , [nt_domain]
                                   , [nt_user_name]
                                   , [status]
                                   , [context_info]
                                   , [cpu_time]
                                   , [memory_usage]
                                   , [total_scheduled_time]
                                   , [total_elapsed_time]
                                   , [endpoint_id]
                                   , [last_request_start_time]
                                   , [last_request_end_time]
                                   , [reads]
                                   , [writes]
                                   , [logical_reads]
                                   , [is_user_process]
                                   , [text_size]
                                   , [language]
                                   , [date_format]
                                   , [date_first]
                                   , [quoted_identifier]
                                   , [arithabort]
                                   , [ansi_null_dflt_on]
                                   , [ansi_defaults]
                                   , [ansi_warnings]
                                   , [ansi_padding]
                                   , [ansi_nulls]
                                   , [concat_null_yields_null]
                                   , [transaction_isolation_level]
                                   , [lock_timeout]
                                   , [deadlock_priority]
                                   , [row_count]
                                   , [prev_error]
                                   , [original_security_id]
                                   , [original_login_name]
                                   , [last_successful_logon]
                                   , [last_unsuccessful_logon]
                                   , [unsuccessful_logons]
                                   , [group_id]
                                   , [database_id]
                                   , [authenticating_database_id]
                                   , [open_transaction_count]
                                   , [most_recent_session_id]
                                   , [connect_time]
                                   , [net_transport]
                                   , [protocol_type]
                                   , [protocol_version]
                                   , [encrypt_option]
                                   , [auth_scheme]
                                   , [node_affinity]
                                   , [num_reads]
                                   , [num_writes]
                                   , [last_read]
                                   , [last_write]
                                   , [net_packet_size]
                                   , [client_net_address]
                                   , [client_tcp_port]
                                   , [local_net_address]
                                   , [local_tcp_port]
                                   , [connection_id]
                                   , [parent_connection_id]
                                   , [most_recent_sql_handle]
                                   , [LastTSQL]
                                   , [transaction_begin_time]
                                   , [CountTranNotRequest]
                                   , [CountSessionNotRequest])
    SELECT
        ES.[session_id]
        ,kc.[TransactionID]
        ,ES.[login_time]
        ,ES.[host_name]
        ,ES.[program_name]
        ,ES.[host_process_id]
        ,ES.[client_version]
        ,ES.[client_interface_name]
        ,ES.[security_id]
        ,ES.[login_name]
        ,ES.[nt_domain]
        ,ES.[nt_user_name]
        ,ES.[status]
        ,ES.[context_info]
        ,ES.[cpu_time]
        ,ES.[memory_usage]
        ,ES.[total_scheduled_time]
        ,ES.[total_elapsed_time]
        ,ES.[endpoint_id]
        ,ES.[last_request_start_time]
        ,ES.[last_request_end_time]
        ,ES.[reads]
        ,ES.[writes]
        ,ES.[logical_reads]
        ,ES.[is_user_process]
        ,ES.[text_size]
        ,ES.[language]
        ,ES.[date_format]
        ,ES.[date_first]
        ,ES.[quoted_identifier]
        ,ES.[arithabort]
        ,ES.[ansi_null_dflt_on]
        ,ES.[ansi_defaults]
        ,ES.[ansi_warnings]
        ,ES.[ansi_padding]
        ,ES.[ansi_nulls]
        ,ES.[concat_null_yields_null]
        ,ES.[transaction_isolation_level]
        ,ES.[lock_timeout]
        ,ES.[deadlock_priority]
        ,ES.[row_count]
        ,ES.[prev_error]
        ,ES.[original_security_id]
        ,ES.[original_login_name]
        ,ES.[last_successful_logon]
        ,ES.[last_unsuccessful_logon]
        ,ES.[unsuccessful_logons]
        ,ES.[group_id]
        ,ES.[database_id]
        ,ES.[authenticating_database_id]
        ,ES.[open_transaction_count]
        ,EC.[most_recent_session_id]
        ,EC.[connect_time]
        ,EC.[net_transport]
        ,EC.[protocol_type]
        ,EC.[protocol_version]
        ,EC.[encrypt_option]
        ,EC.[auth_scheme]
        ,EC.[node_affinity]
        ,EC.[num_reads]
        ,EC.[num_writes]
        ,EC.[last_read]
        ,EC.[last_write]
        ,EC.[net_packet_size]
        ,EC.[client_net_address]
        ,EC.[client_tcp_port]
        ,EC.[local_net_address]
        ,EC.[local_tcp_port]
        ,EC.[connection_id]
        ,EC.[parent_connection_id]
        ,EC.[most_recent_sql_handle]
        ,(SELECT TOP (1)
              text
          FROM sys.dm_exec_sql_text(EC.[most_recent_sql_handle]))
         AS [LastTSQL]
        ,kc.[TransactionBeginTime]
        ,kc.[CountTranNotRequest]
        ,kc.[CountSessionNotRequest]
    FROM @kills_copy AS kc
    INNER JOIN sys.dm_exec_sessions ES WITH (READUNCOMMITTED)
        ON kc.[SessionID] = ES.[session_id]
    INNER JOIN sys.dm_exec_connections EC WITH (READUNCOMMITTED)
        ON EC.session_id = ES.session_id;

    --collecting sessions
    INSERT INTO @kills (SessionID)
    SELECT
        [SessionID]
    FROM @kills_copy
    GROUP BY [SessionID];

    DECLARE @SessionID INT;

    --direct deletion of selected sessions
    WHILE (EXISTS (SELECT TOP (1)
                       1
                   FROM @kills)
          )
    BEGIN
        SELECT TOP (1)
            @SessionID = [SessionID]
        FROM @kills;

        BEGIN TRY
            --KILL accepts only a constant session ID, not a parameter,
            --so the command has to be built as a string
            EXEC ('KILL ' + CAST(@SessionID AS NVARCHAR(10)));
        END TRY
        BEGIN CATCH
            --ignore errors, e.g., if the session has already disappeared
        END CATCH;

        DELETE FROM @kills
        WHERE [SessionID] = @SessionID;
    END;

    SELECT
        st.[SessionID]
        ,st.[TransactionID] INTO #tbl
    FROM srv.SessionTran AS st
    WHERE st.[CountTranNotRequest] >= 250
    OR st.[CountSessionNotRequest] >= 250
    OR EXISTS (SELECT TOP (1)
                   1
               FROM @kills_copy kc
               WHERE kc.[SessionID] = st.[SessionID]);

    --deletion of processed records, as well as those that cannot be deleted and have stayed too long in the table
    DELETE FROM st
    FROM #tbl AS t
    INNER JOIN srv.SessionTran AS st
        ON t.[SessionID] = st.[SessionID]
        AND t.[TransactionID] = st.[TransactionID];

    DROP TABLE #tbl;
END;
```

Let us execute this script to check that it is correct. The query runs successfully, and the script selects incomplete transactions, so we can insert this code into the stored procedure. SQL Complete includes a robust script generator that we can use to create the EXECUTE code block in a new window and call our stored procedure. This script will emulate the open-transaction scenario. The stored procedure has been executed successfully, and we can see the output of its work: an open transaction is added to the srv.SessionTran table that we created at the previous step. The srv.KillSession table is used for archiving the records. When we execute that stored procedure again, it kills the open transaction automatically. We can view the record of that killed transaction by simply executing:

```sql
SELECT * FROM DemoDatabase.srv.KillSession
```

Best practices for managing open transactions

Database administrators should always be mindful of open transactions, as they can occur due to various factors. Let’s explore the primary causes of idle transactions in SQL Server and the methods to prevent them.

Transactional Errors

Open transactions are often caused by transactional errors, uncommitted data reads, or long-held locks. To prevent these issues, identify long-running queries, check for blocking resources, use appropriate isolation levels, and analyze error logs.

Query Design and Indexing Efficiency

Poorly optimized queries, missing indexes, or omitted COMMIT/ROLLBACK statements can result in open transactions. To mitigate this problem, focus on optimizing queries, implementing proper indexing, and ensuring transaction control statements are included.
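The last point can be illustrated with a minimal sketch (the table names and key value below are invented for the example): wrapping every transaction in TRY...CATCH guarantees that it ends in either COMMIT or ROLLBACK and never stays open.

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- hypothetical work: archive an order, then remove it
    INSERT INTO dbo.OrdersArchive (OrderID)
    SELECT OrderID FROM dbo.Orders WHERE OrderID = 42;

    DELETE FROM dbo.Orders WHERE OrderID = 42;

    COMMIT TRANSACTION; -- success path: the transaction is closed
END TRY
BEGIN CATCH
    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION; -- failure path: the transaction is closed too
    THROW; -- re-raise so the caller sees the original error (SQL Server 2012+)
END CATCH;
```

With this pattern, even a failed statement cannot leave the session holding an open transaction.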
Insufficient Hardware Resources

An overloaded CPU, insufficient memory, slow disk I/O, or network bottlenecks can hinder query performance and contribute to open transactions. Recommendations include canceling unnecessary processes, rescheduling resource-intensive tasks, and considering hardware upgrades if necessary.

Deadlocks

Deadlocks occur when processes wait for resources held by each other, potentially leaving transactions open until resolved. To minimize open transactions caused by deadlocks, set transaction timeouts, script deadlock identification procedures, and assign transaction priorities.

Transaction Exceptions

Exceptions within transaction blocks, if not handled properly, can prevent transaction completion. To address this issue, examine exceptions, enhance error handling mechanisms, and conduct thorough code reviews.

Conclusion

Open transactions can lead to significant issues in SQL Server operations, which is why it’s crucial to identify and address them promptly. In this article, we’ve outlined straightforward methods for detecting open transactions using the graphical user interface of SSMS and the dbForge SQL Complete add-in. Additionally, we’ve provided guidance on automating the processes of identifying and resolving open transactions. SQL Complete offers robust coding assistance, code formatting, scripting, and notification features, making it a valuable tool in this context. You can try the [fully functional SQL Complete trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) free of charge for 14 days, integrate its capabilities into SSMS, and leverage its enhanced functionality in your actual workload.
Tags [delete incomplete transactions](https://blog.devart.com/tag/delete-incomplete-transactions) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial)

[Julia Lutsenko](https://blog.devart.com/author/jane-williams) Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Find Invalid Objects in SQL Server Databases By [dbForge Team](https://blog.devart.com/author/dbforge) July 6, 2020

Over time, we can run into situations where certain database objects are missing from one or several databases or database instances. This can happen for a number of reasons. An example would be a stored procedure that refers to a nonexistent table. In this article, we are going to take a closer look at how to find invalid objects in SQL Server databases with the help of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/). For a more complete insight into the functionality of this tool, you may first want to review the previous articles in this series. As stated in the [first part](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html), this tool allows us to find and delete incomplete SQL Server transactions, but not only that: IntelliSense-style code completion, highly customizable and shareable code formatting, and efficient code refactoring are among its advantages. This article is the final part of the Deleting Lost Transactions in MS SQL Server series.
So let’s go over the topics we have discussed and review what we have accomplished in the four previous articles: [Analyzing the algorithm of deleting lost transactions](https://blog.devart.com/how-to-delete-lost-transactions-with-sql-complete-part-1.html) [Creating a stored procedure using CRUD](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html?_ga=2.195959468.1454847682.1586957622-1125767723.1586957622) [Testing the stored procedure and turning it into a script](https://blog.devart.com/export-sql-stored-procedure-to-a-file-and-generate-its-script.html) [Reviewing a number of useful settings that vastly help in the workflow](https://blog.devart.com/find-and-delete-incomplete-open-transactions-in-ms-sql-server-part-3.html) Finding invalid objects Apart from its other functions, with the help of the SQL Complete tool, we can [find invalid objects](https://www.devart.com/dbforge/sql/sqlcomplete/code-refactoring.html#find_invalid_objects) in SQL Server. To do that, first, select the SQL Complete\\Find Invalid objects command in the main menu: Pic.1. Choosing the “Find Invalid Objects” command in the SQL Complete main menu Once it’s done, in the window that opens, we select the required databases from MS SQL Server instances and press “Analyze” at the upper right corner or at the middle of the screen: Pic.2. Setting up and running the search for references to invalid objects. Now, the process of searching for references to invalid objects starts: Pic.3. The search process After the process is completed, you will see a table containing all objects that refer to invalid objects: Pic.4. The table showing all objects that refer to invalid database objects As you can see here, the table displays the type, schema, and name of the object that refers to the invalid object, the name of which is also displayed. At the bottom, you can see the code of the object that refers to the invalid object. 
In this case, the DeleteArchive stored procedure refers to the [nonexistent srv.KillSession table](https://blog.devart.com/how-to-delete-lost-transactions-with-sql-complete-part-1.html), which is an unresolved reference to the object. After that, you can proceed to the location of the stored procedure by right-clicking it and selecting “Find in Object Explorer” in the context menu: Pic.5. Finding the object that refers to an invalid database object Finally, it is enough to analyze the code of the stored procedure, which refers to the nonexistent table, and decide whether you need to create the srv.KillSession table: Pic.6. The code analysis result Summary In this final part, we took you through the whole process of finding invalid database objects, providing a step-by-step guide on how to accomplish that using the [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) tool. The result leaves the user with a decision: either create the nonexistent object or change the reference to an existing one.
Tags [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql server transactions](https://blog.devart.com/tag/sql-server-transactions) [dbForge Team](https://blog.devart.com/author/dbforge)

[Products](https://blog.devart.com/category/products) Find Invalid Objects in Your Databases By [Sergey Syrovatchenko](https://blog.devart.com/author/sergeys) September 6, 2024
A DBA has a number of duties primarily targeted at maintaining database performance and data consistency. The administrator can use the DBCC CHECKDB command to easily verify data consistency; however, finding an invalid object in a database schema can be more difficult.

Introduction

Running smooth operations and ensuring data integrity are very important in any database system. One issue that can get in the way is invalid objects present in databases. These are database objects that no longer function properly due to changes made to the database or its components. For example, invalid objects may appear when a referenced object is changed or removed. So, it is a good practice to find invalid objects to maintain database performance and avoid unexpected issues or even downtime. In this article, we’ll explain what invalid objects are, how they can affect your database, and how to identify and fix them using scripts or dbForge Edge, an advanced toolset for searching and fixing invalid objects across your database and performing other database-related tasks.

Understanding invalid objects

By invalid, we usually mean different types of objects, such as synonyms, functions, packages, procedures, views, etc., that reference non-existing objects or objects that were changed in some way (e.g., renamed). For instance, you should be careful with synonyms created on objects, because when you delete an object, its synonym gets an invalid status. If you recreate an object, its synonym must also be recompiled. When you attempt to use an invalid object, it will throw an error. You often discover invalid objects when you run preparation scripts, perform data export or import, upgrade, or apply patches.
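As a quick illustration (all object names here are invented for the example), an invalid object can be produced in SQL Server just by dropping a table that a view references; the view itself survives, but using it fails:

```sql
CREATE TABLE dbo.Demo (ID INT);
GO
CREATE VIEW dbo.vw_Demo AS SELECT ID FROM dbo.Demo;
GO
DROP TABLE dbo.Demo;       -- dbo.vw_Demo is now invalid
GO
SELECT * FROM dbo.vw_Demo; -- fails with "Invalid object name 'dbo.Demo'"
```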
However, a really good way to work with them is to perform regular checks for mismatched objects before and after you introduce changes or upgrades. Invalid objects can be rather ambiguous, but it’s important to remember that some are quite harmless. You can fix them with a simple action or a recompilation, automatically performed by the database when the object is accessed. For instance, there is not much reason to worry about invalid materialized views, since they become valid as soon as you add data to the underlying tables. Encountering persistent issues that should be fixed by dumping data? Check the [MySQL Dump Tool](https://www.devart.com/dbforge/mysql/studio/mysql-backup.html) for easy backups and recovery. Other objects, however, indicate latent severe issues and cannot be recompiled. This is especially true when a change in the referenced object results in an error in the calling object, so the latter cannot be recompiled. In both cases, finding the reason and identifying these objects in your database is crucial before further issues can occur. Depending on the database you use, you can choose from the scripts provided below to find invalid objects.

Identifying invalid objects in different databases

Now, let us see how to search for invalid objects in Oracle, SQL Server, and MySQL databases using [dbForge Edge](https://www.devart.com/dbforge/edge/), which consists of the Studios: dbForge Studio for Oracle, dbForge Studio for SQL Server, dbForge Studio for MySQL, as well as dbForge Studio for PostgreSQL.

Oracle

Since [Oracle is an intricate and interrelated database](https://www.devart.com/dbforge/oracle/all-about-oracle-database/), some objects reliant on one another often become ‘invalid’. As a rule, they are recompiled automatically on demand. However, this can be very time-consuming, especially with complex dependencies. The solution is to find these objects and apply different methods to recompile them.
Let us show you how to get a list of invalid objects in Oracle with the following query:

```sql
SELECT owner, object_type, object_name
FROM all_objects
WHERE status = 'INVALID'
```

The information you receive will help you decide on the next step to recompile the objects. If you experience difficulties trying to search for and fix invalid objects manually, [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/find-object-oracle.html) may come to your aid, as it offers extensive functionality that allows you to reduce the number of errors and rename objects without breaking dependencies between them. Learn more about how to recompile invalid objects with this tool in the [documentation](https://docs.devart.com/studio-for-oracle/searching-for-objects-and-data/recompiling-invalid-objects.html).

SQL Server

SQL Server doesn’t allow finding mismatched objects directly. In most cases, you must execute a script to find out whether an object is invalid, which is extremely inconvenient. For that reason, let us create and execute a script that will search for invalid objects:

```sql
SELECT obj_name = QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)
     , obj_type = o.type_desc
     , d.referenced_database_name
     , d.referenced_schema_name
     , d.referenced_entity_name
FROM sys.sql_expression_dependencies d
JOIN sys.objects o ON d.referencing_id = o.[object_id]
WHERE d.is_ambiguous = 0
    AND d.referenced_id IS NULL
    AND d.referenced_server_name IS NULL -- ignore objects from Linked server
    AND CASE d.referenced_class -- if does not exist
            WHEN 1 -- object
                THEN OBJECT_ID(
                        ISNULL(QUOTENAME(d.referenced_database_name), DB_NAME()) + '.' +
                        ISNULL(QUOTENAME(d.referenced_schema_name), SCHEMA_NAME()) + '.' +
                        QUOTENAME(d.referenced_entity_name))
            WHEN 6 -- or user datatype
                THEN TYPE_ID(
                        ISNULL(d.referenced_schema_name, SCHEMA_NAME()) + '.'
                        + d.referenced_entity_name)
            WHEN 10 -- or XML schema
                THEN (
                        SELECT 1
                        FROM sys.xml_schema_collections x
                        WHERE x.name = d.referenced_entity_name
                            AND x.[schema_id] = ISNULL(SCHEMA_ID(d.referenced_schema_name), SCHEMA_ID())
                )
        END IS NULL
```

The script is useful for primary analysis. However, it has some gaps. The main problem is that it does not show objects with invalid columns or parameters:

```sql
CREATE VIEW dbo.vw_View
AS
    SELECT ID = 1
GO
CREATE PROCEDURE dbo.usp_Procedure
AS
BEGIN
    SELECT ID
    FROM dbo.vw_View
END
GO
ALTER VIEW dbo.vw_View
AS
    SELECT New_ID = 1
GO
```

We will get an error while executing the stored procedure:

```
Msg 207, Level 16, State 1, Procedure usp_Procedure, Line 6
Invalid column name 'ID'.
```

Moreover, the script will not work on SQL Server 2005, so we can’t use it as the primary one. However, SQL Server offers the sp_refreshsqlmodule system procedure. This procedure updates metadata for the specified non-schema-bound stored procedure, user-defined function, view, DML trigger, database-level DDL trigger, or server-level DDL trigger in the current database. Persistent metadata for these objects, such as data types of parameters, can become outdated because of changes made to their underlying objects. Thus, the sp_refreshsqlmodule procedure throws an error if an object contains invalid columns or properties. The procedure can be called inside a cursor for each object. If there are no invalid objects, the procedure completes without errors. However, it is important to remember that script objects may have no dependencies or may contain no invalid objects initially. It is not reasonable to verify such objects; SQL Server will take care of that.
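For a single object, the check described above boils down to one call; here is a minimal sketch (the procedure name is hypothetical) that reports the error without aborting the batch:

```sql
BEGIN TRY
    EXEC sys.sp_refreshsqlmodule @name = N'dbo.usp_Procedure', @namespace = N'OBJECT';
    PRINT 'dbo.usp_Procedure is valid';
END TRY
BEGIN CATCH
    PRINT 'Invalid object: ' + ERROR_MESSAGE();
END CATCH;
```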
The following script can be used for searching invalid objects:

```sql
SET NOCOUNT ON;
IF OBJECT_ID('tempdb.dbo.#objects') IS NOT NULL
    DROP TABLE #objects

CREATE TABLE #objects (
      obj_id INT PRIMARY KEY
    , obj_name NVARCHAR(1000)
    , err_message NVARCHAR(3000) NOT NULL
    , obj_type CHAR(2) NOT NULL
)

INSERT INTO #objects (obj_id, obj_name, err_message, obj_type)
SELECT t.referencing_id
     , obj_name = QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)
     , 'Invalid object name ''' + t.obj_name + ''''
     , o.[type]
FROM (
    SELECT d.referencing_id
         , obj_name = MAX(COALESCE(d.referenced_database_name + '.', '')
                        + COALESCE(d.referenced_schema_name + '.', '')
                        + d.referenced_entity_name)
    FROM sys.sql_expression_dependencies d
    WHERE d.is_ambiguous = 0
        AND d.referenced_id IS NULL
        AND d.referenced_server_name IS NULL -- ignore objects from Linked server
        AND CASE d.referenced_class -- if does not exist
                WHEN 1 -- object
                    THEN OBJECT_ID(
                            ISNULL(QUOTENAME(d.referenced_database_name), DB_NAME()) + '.' +
                            ISNULL(QUOTENAME(d.referenced_schema_name), SCHEMA_NAME()) + '.' +
                            QUOTENAME(d.referenced_entity_name))
                WHEN 6 -- or user datatype
                    THEN TYPE_ID(
                            ISNULL(d.referenced_schema_name, SCHEMA_NAME()) + '.'
                            + d.referenced_entity_name)
                WHEN 10 -- or XML schema
                    THEN (
                            SELECT 1
                            FROM sys.xml_schema_collections x
                            WHERE x.name = d.referenced_entity_name
                                AND x.[schema_id] = ISNULL(SCHEMA_ID(d.referenced_schema_name), SCHEMA_ID())
                    )
            END IS NULL
    GROUP BY d.referencing_id
) t
JOIN sys.objects o ON t.referencing_id = o.[object_id]
WHERE LEN(t.obj_name) > 4 -- hide valid aliases

DECLARE @obj_id INT
      , @obj_name NVARCHAR(1000)
      , @obj_type CHAR(2)

DECLARE cur CURSOR FAST_FORWARD READ_ONLY LOCAL FOR
    SELECT sm.[object_id]
         , QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)
         , o.[type]
    FROM sys.sql_modules sm
    JOIN sys.objects o ON sm.[object_id] = o.[object_id]
    LEFT JOIN (
        SELECT s.referenced_id
        FROM sys.sql_expression_dependencies s
        JOIN sys.objects o ON o.object_id = s.referencing_id
        WHERE s.is_ambiguous = 0
            AND s.referenced_server_name IS NULL
            AND o.[type] IN ('C', 'D', 'U')
        GROUP BY s.referenced_id
    ) sed ON sed.referenced_id = sm.[object_id]
    WHERE sm.is_schema_bound = 0 -- objects without SCHEMABINDING
        AND sm.[object_id] NOT IN (SELECT o2.obj_id FROM #objects o2)
        AND OBJECTPROPERTY(sm.[object_id], 'IsEncrypted') = 0
        AND (
            o.[type] IN ('IF', 'TF', 'V', 'TR')
            --OR o.[type] = 'P' /* Microsoft Connect #656863 */
            OR (
                o.[type] = 'FN'
                AND -- ignore scalar functions, which are used in DEFAULT/CHECK constraints and COMPUTED columns
                sed.referenced_id IS NULL
            )
        )

OPEN cur

FETCH NEXT FROM cur INTO @obj_id, @obj_name, @obj_type

WHILE @@FETCH_STATUS = 0 BEGIN

    BEGIN TRY
        BEGIN TRANSACTION
        EXEC sys.sp_refreshsqlmodule @name = @obj_name, @namespace = N'OBJECT'
        COMMIT TRANSACTION
    END TRY
    BEGIN CATCH
        IF XACT_STATE() != 0
            ROLLBACK TRANSACTION
        INSERT INTO #objects (obj_id, obj_name, err_message, obj_type)
        SELECT @obj_id, @obj_name, ERROR_MESSAGE(), @obj_type
    END CATCH

    FETCH NEXT FROM cur INTO @obj_id, @obj_name, @obj_type

END

CLOSE cur
DEALLOCATE cur

SELECT obj_name, err_message, obj_type
FROM #objects
```

The same script for SQL Server 2005:

```sql
SET NOCOUNT ON;
IF OBJECT_ID('tempdb.dbo.#objects') IS NOT NULL
    DROP TABLE #objects

CREATE TABLE #objects (
      obj_name NVARCHAR(1000)
    , err_message NVARCHAR(3000) NOT NULL
    , obj_type CHAR(2) NOT NULL
)

DECLARE
      @obj_name NVARCHAR(1000)
    , @obj_type CHAR(2)

DECLARE cur CURSOR FAST_FORWARD READ_ONLY LOCAL FOR
    SELECT
          QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)
        , o.[type]
    FROM sys.sql_modules sm
    JOIN sys.objects o ON sm.[object_id] = o.[object_id]
    LEFT JOIN (
        SELECT s.referenced_major_id
        FROM sys.sql_dependencies s
        JOIN sys.objects o ON o.object_id = s.[object_id]
        WHERE o.[type] IN ('C', 'D', 'U')
        GROUP BY s.referenced_major_id
    ) sed ON sed.referenced_major_id = sm.[object_id]
    WHERE sm.is_schema_bound = 0
        AND OBJECTPROPERTY(sm.[object_id], 'IsEncrypted') = 0
        AND (
            o.[type] IN ('IF', 'TF', 'V', 'TR')
            OR (
                o.[type] = 'FN'
                AND
                sed.referenced_major_id IS NULL
            )
        )

OPEN cur

FETCH NEXT FROM cur INTO @obj_name, @obj_type

WHILE @@FETCH_STATUS = 0 BEGIN

    BEGIN TRY
        BEGIN TRANSACTION
        EXEC sys.sp_refreshsqlmodule @name = @obj_name, @namespace = N'OBJECT'
        COMMIT TRANSACTION
    END TRY
    BEGIN CATCH
        IF XACT_STATE() != 0
            ROLLBACK TRANSACTION

        INSERT INTO #objects (obj_name, err_message, obj_type)
        SELECT @obj_name, ERROR_MESSAGE(), @obj_type
    END CATCH

    FETCH NEXT FROM cur INTO @obj_name, @obj_type

END

CLOSE cur
DEALLOCATE cur

SELECT obj_name, err_message, obj_type
FROM #objects
```

Script execution results are as follows (for a test database):

```
obj_name                          err_message                                                                     obj_type
--------------------------------- ------------------------------------------------------------------------------- --------
[dbo].[vw_EmployeePersonalInfo]   An insufficient number of arguments were supplied for 'dbo.GetEmployee'         V
[dbo].[udf_GetPercent]            Invalid column name 'Code'.                                                     FN
[dbo].[trg_AIU_Sync]              Invalid column name 'DateOut'.                                                  P
[dbo].[trg_IOU_SalaryEmployee]    Invalid object name 'dbo.tbl_SalaryEmployee'.                                   TR
[dbo].[trg_IU_ReturnDetail]       The object 'dbo.ReturnDetail' does not exist or is invalid for this operation.  TR
[dbo].[ReportProduct]             Invalid object name 'dbo.ProductDetail'.                                        IF
```

SQL Server doesn’t check an object’s name while creating a synonym. So, a synonym can be created for a non-existing object.
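You can reproduce this with a couple of statements (the names are invented for the example): creating the synonym succeeds, and the error surfaces only on first use.

```sql
CREATE SYNONYM dbo.syn_Missing FOR dbo.NoSuchTable; -- succeeds with no warning
GO
SELECT * FROM dbo.syn_Missing; -- fails with "Invalid object name 'dbo.NoSuchTable'"
GO
```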
To find all invalid synonyms, you can use the following script:

```sql
SELECT QUOTENAME(SCHEMA_NAME(s.[schema_id])) + '.' + QUOTENAME(s.name)
FROM sys.synonyms s
WHERE PARSENAME(s.base_object_name, 4) IS NULL -- ignore objects from Linked server
    AND OBJECT_ID(s.base_object_name) IS NULL
```

If there is a need to add this check to the current script:

```sql
...
SELECT obj_name, err_message, obj_type
FROM #objects
UNION ALL
SELECT QUOTENAME(SCHEMA_NAME(s.[schema_id])) + '.' + QUOTENAME(s.name) COLLATE DATABASE_DEFAULT
     , 'Invalid object name ''' + s.base_object_name + '''' COLLATE DATABASE_DEFAULT
     , s.[type] COLLATE DATABASE_DEFAULT
FROM sys.synonyms s
WHERE PARSENAME(s.base_object_name, 4) IS NULL
    AND OBJECT_ID(s.base_object_name) IS NULL
```

As you can see, metadata is a way to extend the standard functionality of SSMS for your day-to-day database tasks. If this task seems tiresome and requires much effort, you can simplify the process and save a substantial amount of time with SQL Complete, a code completion add-in for SSMS and Visual Studio. This powerful functionality allows you to easily detect mismatched objects across multiple databases and generate effective scripts to manage them. By the way, both [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) include the Find Invalid Objects feature. The Studio will help you save time and effort searching for invalid objects in one or multiple databases, either using the Find Invalid Objects manager or from the command line.

Find invalid objects in a database using the Studio

To begin, open the Studio. Then, access the Find Invalid Objects manager in one of the following ways: In Database Explorer, right-click the Database node and select Tasks > Find Invalid Objects. Go to the Administration tab of the Start Page and select Find Invalid Objects.
On the ribbon, select Database > Find Invalid Objects . In the manager that opens, specify the connection, select the required databases, and click Analyze to start searching. The tool displays a grid of invalid objects, showing the object type, schema, name, and reason for invalidity. Additionally, the invalid object is highlighted in the SQL script generated in the Preview pane at the bottom of the page. dbForge Studio also provides the Find Invalid Objects CLI feature for identifying and managing invalid database objects efficiently. You can use the built-in Command Line Wizard to convert your search options into command-line syntax and save them as a *.bat file. This allows you to automate the search for invalid objects or schedule it to run from the command line. To open the Command Line Wizard, click Save Command Line on the toolbar. In the wizard, [specify the options](https://docs.devart.com/studio-for-sql-server/working-with-database-objects/finding-invalid-objects.html#options-of-the-command-line-wizard) for generating the batch file and performing the search. Then, click Validate to verify the script. To save the file, click Save and specify the path to the file. After creating the batch file, you can execute it from the command line using the /findinvalidobjects command. Start by opening the Command Prompt or terminal. Then, navigate to the installation folder of dbForge Studio for SQL Server using the cd command. Note that the default installation folder is C:\\Program Files\\Devart\\dbForge Studio for SQL Server . To proceed, specify the following command: dbforgesql.com /findinvalidobjects /connection:\"Data Source=server_name; Integrated Security=False; User ID=username\" /database:database_name Replace server_name, username, and database_name with your actual values. If you specified logging and reporting options in the wizard, the operation will also create the log file and the report file after completion. That’s it! The search for invalid objects is done. 
MySQL Now, let’s take a closer look at how to find invalid objects in MySQL databases using the [MySQL Database Refactoring Tool](https://www.devart.com/dbforge/mysql/studio/database-refactoring.html) , which is available in [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) . The tool helps you search through a MySQL database schema to identify and recompile invalid objects, such as functions, procedures, triggers, and views. It also allows you to preview code changes and rename aliases. Open the Studio, navigate to the main Database menu, and select Tasks > Find Invalid Objects . Alternatively, right-click the database in Database Explorer and select Tasks > Find Invalid Objects . The Find Invalid Objects manager opens. Here, you need to choose the connection and database through which you want to search for invalid objects. To proceed, click Analyze . The results are displayed as a grid, where you can see the invalid object, its name, and why it is considered invalid. In addition, the tool highlights the invalid object in the code below the grid. You can also generate an ALTER or DROP script for the selected invalid object in a new SQL document or copy it to the clipboard. To automate the search for invalid objects, simply open the Command Prompt or terminal and run the /findinvalidobjects command: dbforgemysql.com /findinvalidobjects [/option_name1[:value | [parameter1:value parameter2:value ..]] /option_name2 ..] Replace the placeholders with your actual data. For more information, see [Finding Invalid Objects.](https://docs.devart.com/studio-for-mysql/working-with-database-objects/finding-invalid-objects.html) Want to learn more about ways to improve your workflows and reduce troubleshooting time? Explore [debugging MySQL stored routines](https://www.devart.com/dbforge/mysql/studio/debugging.html) within dbForge Studio for MySQL. 
dbForge Edge for database management In this article, we used the Studios that are part of dbForge Edge, a comprehensive toolset that optimizes the workflow of software and database developers. This unified solution can meet the needs of DBAs, analysts, and DevOps engineers who work with different database systems on a regular basis. dbForge Edge offers a wide range of powerful features and functionalities, including: Cross-platform support for multiple database systems, including SQL Server, Oracle, MySQL, MariaDB, and PostgreSQL. A feature-rich toolset that covers all important aspects of database management. From database design to query building, data comparison and synchronization, and performance analysis and optimization tools, it has everything you need to efficiently manage your database structures and ensure data integrity and consistency. Enhanced Find Invalid Objects feature that allows users to quickly identify and fix invalid objects in their databases in the GUI or from the command line. This feature will streamline your workflow and improve your productivity. Automation of the search for invalid objects in DevOps workflows with the /findinvalidobjects command. This allows development teams to automate the detection of issues that might cause failures, and thus improve performance and deployments. In short, the suite's ability to find and manage invalid objects, combined with its support for DevOps automation, makes it a useful tool for database administrators and developers who want to maintain high standards of database performance and reliability. Conclusion When working with a database, it is common to accumulate a number of invalid objects that hinder your work and cause errors. The important thing is to find and validate them in good time. 
In this article, we have taken you through some of the most important things to know about invalid objects in Oracle, SQL Server, and MySQL databases and provided scripts that will assist you in identifying them. We also want to highlight that if you experience difficulties, there are automated ways to work with, identify, and fix invalid objects provided by Devart and other companies. Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [Oracle](https://blog.devart.com/tag/oracle) [Oracle Tutorial](https://blog.devart.com/tag/oracle-tutorial) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Sergey Syrovatchenko](https://blog.devart.com/author/sergeys) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server 
Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025 6 COMMENTS Tola, January 15, 2016 at 5:23 am: Good script. There is a mistake IF XACT_STATE() 0 should be IF XACT_STATE() 0. The script to find all invalid synonyms gave me this error though: “Implicit conversion of char value to char cannot be performed because the collation of the value is unresolved due to a collation conflict.” Sergey Syrovatchenko, January 15, 2016 at 10:11 am: Tola, thanks for the note. Try to add COLLATE DATABASE_DEFAULT. M. Lembke, January 15, 2016 at 9:54 am: Thanks for this great article. There is a little mistake in the script. Wrong: IF XACT_STATE() 0 ROLLBACK TRANSACTION. Correct: IF XACT_STATE() = 0 ROLLBACK TRANSACTION. Sergey Syrovatchenko, January 15, 2016 at 10:03 am: Done. M. Lembke, thanks for your note. JD, March 7, 2018 at 6:14 pm: Great script! Trying to copy a DB and got error “Invalid Object Name ‘xxxxxx’”. I was scratching my head for hours trying to figure out how to find this invalid object name. This script did the trick. Maxim Volkov, September 8, 2018 at 1:11 pm: Sometimes the script lists ‘inserted’ or ‘deleted’ aliases as invalid objects. 
Comments are closed."} {"url": "https://blog.devart.com/finden-sie-ungultige-objekte-in-ihren-datenbanken.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Find Invalid Objects in Your Databases By [dbForge Team](https://blog.devart.com/author/dbforge) November 7, 2022 [0](https://blog.devart.com/finden-sie-ungultige-objekte-in-ihren-datenbanken.html#respond) 2660 A DBA has a number of tasks aimed primarily at maintaining database performance and data consistency. The administrator can use the CHECKDB command to check data consistency easily, but finding an invalid object in a database schema can prove more difficult. Introduction By invalid, we usually mean various kinds of objects, such as synonyms, functions, packages, procedures, and views, that reference non-existent objects or objects that have been changed in some way (for example, renamed). Be careful with synonyms created on objects: if you drop an object, its synonym becomes invalid, and if you recreate an object, its synonym must be recompiled as well. Any attempt to use an invalid object raises an error. You will often discover invalid objects when running provisioning scripts, exporting or importing data, upgrading, or applying patches. A really good way to deal with them, however, is to run regular checks for mismatched objects before and after you make changes or upgrades. 
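For the data-consistency half of that job, the CHECKDB command mentioned above is a single statement (the database name below is just an example):

```sql
-- Verify the logical and physical integrity of all objects in the database
DBCC CHECKDB (N'AdventureWorks') WITH NO_INFOMSGS;
```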
The thing is, invalid objects are rather ambiguous. Some of them are quite harmless: you can fix them with a simple action or a recompilation, which the database usually performs automatically when the object is accessed. For example, there is no reason to worry about invalid materialized views, since they become valid as soon as you add data to the underlying tables. Other objects, however, point to serious latent problems and cannot be recompiled successfully. This is especially true when a change to the referenced object causes an error in the calling object, so that the latter cannot be recompiled. In both cases, it is important to find out the reason and identify these objects in your database before further problems can arise. Depending on the database you use, you can choose from the scripts provided below to find invalid objects. Oracle Since Oracle is a complex, highly interconnected database, it often happens that objects that depend on each other become ‘invalid’. In general, they are recompiled automatically on demand, especially when complex dependencies are involved. The solution is to find these objects and apply the appropriate method to recompile them. Let us show you how to get a list of invalid objects in Oracle with the following query: SELECT owner, object_type, object_name\nFROM all_objects\nWHERE status = 'INVALID' The information you get will help you decide what to do next to recompile the objects. 
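Once you have the list, you can recompile the reported objects individually with ALTER ... COMPILE, or recompile every invalid object in a schema in one go with DBMS_UTILITY.compile_schema (the object and schema names below are illustrative):

```sql
-- Recompile individual invalid objects
ALTER VIEW hr.vw_employees COMPILE;
ALTER PROCEDURE hr.process_salary COMPILE;

-- Or recompile only the invalid objects in a whole schema
BEGIN
  DBMS_UTILITY.compile_schema(schema => 'HR', compile_all => FALSE);
END;
/
```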
If the manual search for and repair of invalid objects proves difficult, [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/find-object-oracle.html) can help: it offers extensive features that let you reduce the number of errors and rename objects without breaking the dependencies between them. Learn more [here](https://docs.devart.com/studio-for-oracle/debugging-pl-sql-units/recompiling-invalid-objects.html) about the best way to recompile invalid objects with this tool. SQL Server SQL Server does not let you find mismatched objects directly. In most cases, you have to run a script to make sure an object is invalid, which is extremely inconvenient. So let us build a script that searches for invalid objects: SELECT\n      obj_name = QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)\n    , obj_type = o.type_desc\n    , d.referenced_database_name\n    , d.referenced_schema_name\n    , d.referenced_entity_name\nFROM sys.sql_expression_dependencies d\nJOIN sys.objects o ON d.referencing_id = o.[object_id]\nWHERE d.is_ambiguous = 0\n    AND d.referenced_id IS NULL\n    AND d.referenced_server_name IS NULL -- ignore objects from Linked server\n    AND CASE d.referenced_class -- if does not exist\n            WHEN 1 -- object\n                THEN OBJECT_ID(\n                        ISNULL(QUOTENAME(d.referenced_database_name), DB_NAME()) + '.' + \n                        ISNULL(QUOTENAME(d.referenced_schema_name), SCHEMA_NAME()) + '.' + \n                        QUOTENAME(d.referenced_entity_name))\n            WHEN 6 -- or user datatype\n                THEN TYPE_ID(\n                        ISNULL(d.referenced_schema_name, SCHEMA_NAME()) + '.' + d.referenced_entity_name) \n            WHEN 10 -- or XML schema\n                THEN (\n                        SELECT 1 FROM sys.xml_schema_collections x \n                        WHERE x.name = d.referenced_entity_name\n                            AND x.[schema_id] = ISNULL(SCHEMA_ID(d.referenced_schema_name), SCHEMA_ID())\n                    )\n        END IS NULL The script is useful for a primary analysis. 
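To give the script something to find, you can create a module that references a missing table; thanks to deferred name resolution, SQL Server creates the procedure anyway (the names are illustrative):

```sql
-- dbo.tbl_Missing does not exist, yet the CREATE succeeds:
-- the table reference is only resolved at execution time
CREATE PROCEDURE dbo.usp_Broken
AS
BEGIN
    SELECT * FROM dbo.tbl_Missing;
END;
```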
However, it has some gaps. The main problem is that the script does not reveal objects with invalid columns or parameters. CREATE VIEW dbo.vw_View\nAS SELECT ID = 1\nGO\n\nCREATE PROCEDURE dbo.usp_Procedure\nAS BEGIN\n    SELECT ID FROM dbo.vw_View\nEND\nGO\n\nALTER VIEW dbo.vw_View\nAS SELECT New_ID = 1\nGO When executing the stored procedure, we get the following error message: Msg 207, Level 16, State 1, Procedure usp_Procedure, Line 6\nInvalid column name 'ID'. In addition, the script does not work on SQL Server 2005, so we cannot use it as our primary one. SQL Server does, however, provide the system procedure sp_refreshsqlmodule. This procedure updates the metadata for the specified non-schema-bound stored procedure, user-defined function, view, DML trigger, database-level DDL trigger, or server-level DDL trigger in the current database. Persistent metadata for these objects, such as the data types of parameters, can become outdated due to changes to the underlying objects, so the sp_refreshsqlmodule procedure raises an error if an object contains invalid columns or properties. The procedure can be called inside a cursor for each object; if there are no invalid objects, it completes without errors. Keep in mind that some script objects may have no dependencies or may contain no invalid objects to begin with. There is no point in checking such objects, and SQL Server takes care of that. 
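Applied to the example above, a single call is enough to surface the stale metadata (dbo.usp_Procedure is the procedure created earlier):

```sql
-- Raises the same "Invalid column name 'ID'" error,
-- because dbo.vw_View no longer exposes the ID column
EXEC sys.sp_refreshsqlmodule @name = N'dbo.usp_Procedure', @namespace = N'OBJECT';
```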
The following script can be used to search for invalid objects: SET NOCOUNT ON;\nIF OBJECT_ID('tempdb.dbo.#objects') IS NOT NULL\n    DROP TABLE #objects\n\nCREATE TABLE #objects (\n      obj_id INT PRIMARY KEY\n    , obj_name NVARCHAR(1000)\n    , err_message NVARCHAR(3000) NOT NULL\n    , obj_type CHAR(2) NOT NULL\n)\n\nINSERT INTO #objects (obj_id, obj_name, err_message, obj_type)\nSELECT \n      t.referencing_id\n    , obj_name = QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' + QUOTENAME(o.name)\n    , 'Invalid object name ''' + t.obj_name + ''''\n    , o.[type]\nFROM (\n    SELECT\n          d.referencing_id\n        , obj_name = MAX(COALESCE(d.referenced_database_name + '.', '') \n            + COALESCE(d.referenced_schema_name + '.', '') \n            + d.referenced_entity_name)\n    FROM sys.sql_expression_dependencies d\n    WHERE d.is_ambiguous = 0\n        AND d.referenced_id IS NULL\n        AND d.referenced_server_name IS NULL -- ignore objects from Linked server\n        AND CASE d.referenced_class -- if does not exist\n                WHEN 1 -- object\n                    THEN OBJECT_ID(\n                            ISNULL(QUOTENAME(d.referenced_database_name), DB_NAME()) + '.' + \n                            ISNULL(QUOTENAME(d.referenced_schema_name), SCHEMA_NAME()) + '.' + \n                            QUOTENAME(d.referenced_entity_name))\n                WHEN 6 -- or user datatype\n                    THEN TYPE_ID(\n                            ISNULL(d.referenced_schema_name, SCHEMA_NAME()) + '.' + d.referenced_entity_name) \n                WHEN 10 -- or XML schema\n                    THEN (\n                            SELECT 1 FROM sys.xml_schema_collections x \n                            WHERE x.name = d.referenced_entity_name\n                                AND x.[schema_id] = ISNULL(SCHEMA_ID(d.referenced_schema_name), SCHEMA_ID())\n                        )\n            END IS NULL\n    GROUP BY d.referencing_id\n) t\nJOIN sys.objects o ON t.referencing_id = o.[object_id]\nWHERE LEN(t.obj_name) > 4 -- hide valid aliases\n\nDECLARE\n      @obj_id INT\n    , @obj_name NVARCHAR(1000)\n    , @obj_type CHAR(2)\n\nDECLARE cur CURSOR FAST_FORWARD READ_ONLY LOCAL FOR\n    SELECT\n          sm.[object_id]\n        , QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' 
+ QUOTENAME(o.name)\n        , o.[type]\n    FROM sys.sql_modules sm\n    JOIN sys.objects o ON sm.[object_id] = o.[object_id]\n    LEFT JOIN (\n        SELECT s.referenced_id\n        FROM sys.sql_expression_dependencies s\n        JOIN sys.objects o ON o.object_id = s.referencing_id\n        WHERE s.is_ambiguous = 0\n            AND s.referenced_server_name IS NULL\n            AND o.[type] IN ('C', 'D', 'U')\n        GROUP BY s.referenced_id\n    ) sed ON sed.referenced_id = sm.[object_id]\n    WHERE sm.is_schema_bound = 0 -- objects without SCHEMABINDING\n        AND sm.[object_id] NOT IN (SELECT o2.obj_id FROM #objects o2)\n        AND OBJECTPROPERTY(sm.[object_id], 'IsEncrypted') = 0\n        AND (\n            o.[type] IN ('IF', 'TF', 'V', 'TR') --OR o.[type] = 'P' /* Microsoft Connect #656863 */\n            OR (\n                o.[type] = 'FN'\n                AND\n                -- ignore scalar functions, which are used in DEFAULT/CHECK constraints and COMPUTED columns\n                sed.referenced_id IS NULL\n            )\n        )\n\nOPEN cur\n\nFETCH NEXT FROM cur INTO @obj_id, @obj_name, @obj_type\n\nWHILE @@FETCH_STATUS = 0 BEGIN\n\n    BEGIN TRY\n\n        BEGIN TRANSACTION\n        EXEC sys.sp_refreshsqlmodule @name = @obj_name, @namespace = N'OBJECT' \n        COMMIT TRANSACTION\n\n    END TRY\n    BEGIN CATCH\n\n        IF XACT_STATE() != 0\n            ROLLBACK TRANSACTION\n\n        INSERT INTO #objects (obj_id, obj_name, err_message, obj_type) \n        SELECT @obj_id, @obj_name, ERROR_MESSAGE(), @obj_type\n\n    END CATCH\n\n    FETCH NEXT FROM cur INTO @obj_id, @obj_name, @obj_type\n\nEND\n\nCLOSE cur\nDEALLOCATE cur\n\nSELECT obj_name, err_message, obj_type\nFROM #objects The same script for SQL Server 2005: SET NOCOUNT ON;\nIF OBJECT_ID('tempdb.dbo.#objects') IS NOT NULL\n    DROP TABLE #objects\n\nCREATE TABLE #objects (\n      obj_name NVARCHAR(1000)\n    , err_message NVARCHAR(3000) NOT NULL\n    , obj_type CHAR(2) NOT NULL\n)\n\nDECLARE\n      @obj_name NVARCHAR(1000)\n    , @obj_type CHAR(2)\n\nDECLARE cur CURSOR FAST_FORWARD READ_ONLY LOCAL FOR\n    SELECT\n          QUOTENAME(SCHEMA_NAME(o.[schema_id])) + '.' 
+ QUOTENAME(o.name)\n        , o.[type]\n    FROM sys.sql_modules sm\n    JOIN sys.objects o ON sm.[object_id] = o.[object_id]\n    LEFT JOIN (\n        SELECT s.referenced_major_id\n        FROM sys.sql_dependencies s\n        JOIN sys.objects o ON o.object_id = s.[object_id]\n        WHERE o.[type] IN ('C', 'D', 'U')\n        GROUP BY s.referenced_major_id\n    ) sed ON sed.referenced_major_id = sm.[object_id]\n    WHERE sm.is_schema_bound = 0\n        AND OBJECTPROPERTY(sm.[object_id], 'IsEncrypted') = 0\n        AND (\n            o.[type] IN ('IF', 'TF', 'V', 'TR')\n            OR (\n                o.[type] = 'FN'\n                AND\n                sed.referenced_major_id IS NULL \n            )\n        )\n\nOPEN cur\n\nFETCH NEXT FROM cur INTO @obj_name, @obj_type\n\nWHILE @@FETCH_STATUS = 0 BEGIN\n\n    BEGIN TRY\n\n        BEGIN TRANSACTION\n        EXEC sys.sp_refreshsqlmodule @name = @obj_name, @namespace = N'OBJECT' \n        COMMIT TRANSACTION\n\n    END TRY\n    BEGIN CATCH\n\n        IF XACT_STATE() != 0\n            ROLLBACK TRANSACTION\n\n        INSERT INTO #objects (obj_name, err_message, obj_type) \n        SELECT @obj_name, ERROR_MESSAGE(), @obj_type\n\n    END CATCH\n\n    FETCH NEXT FROM cur INTO @obj_name, @obj_type\n\nEND\n\nCLOSE cur\nDEALLOCATE cur\n\nSELECT obj_name, err_message, obj_type\nFROM #objects The script execution results are as follows (for a test database): obj_name                          err_message                                                                       obj_type\n--------------------------------- ------------------------------------------------------------------------------- --------\n[dbo].[vw_EmployeePersonalInfo]   An insufficient number of arguments were supplied for ‘dbo.GetEmployee’          V\n[dbo].[udf_GetPercent]            Invalid column name ‘Code’.                                                       FN\n[dbo].[trg_AIU_Sync]              Invalid column name ‘DateOut’.                                                    P\n[dbo].[trg_IOU_SalaryEmployee]    Invalid object name ‘dbo.tbl_SalaryEmployee’.                                     TR\n[dbo].[trg_IU_ReturnDetail]       The object ‘dbo.ReturnDetail’ does not exist or is invalid for this operation.    TR\n[dbo].[ReportProduct]             Invalid object name ‘dbo.ProductDetail’.                                          IF SQL Server does not check the object’s name when creating a synonym, so a synonym can be created for a non-existent object. 
To find all invalid synonyms, you can use the following script: SELECT QUOTENAME(SCHEMA_NAME(s.[schema_id])) + '.' + QUOTENAME(s.name)\nFROM sys.synonyms s\nWHERE PARSENAME(s.base_object_name, 4) IS NULL -- ignore objects from Linked server\n    AND OBJECT_ID(s.base_object_name) IS NULL If you need to add this check to the script above: ...\nSELECT obj_name, err_message, obj_type\nFROM #objects\n\nUNION ALL\n\nSELECT \n      QUOTENAME(SCHEMA_NAME(s.[schema_id])) + '.' + QUOTENAME(s.name) COLLATE DATABASE_DEFAULT\n    , 'Invalid object name ''' + s.base_object_name + '''' COLLATE DATABASE_DEFAULT\n    , s.[type] COLLATE DATABASE_DEFAULT\nFROM sys.synonyms s\nWHERE PARSENAME(s.base_object_name, 4) IS NULL\n    AND OBJECT_ID(s.base_object_name) IS NULL As you can see, metadata lets you extend the standard functionality of SSMS to handle your day-to-day database tasks. If this task seems tedious and requires a lot of effort, you can simplify the process and save a great deal of time with [SQL Complete](https://www.devart.com/de/dbforge/sql/sqlcomplete/), a code completion add-in for SSMS and VS. This powerful functionality lets you easily detect mismatched objects across multiple databases and generate effective scripts to manage them. Conclusion When working with a database, it is common to have a number of invalid objects that hinder your work and cause errors. The important thing is to find and validate them in time. In this article, we have walked you through some of the most important things to know about invalid objects in Oracle and SQL Server databases and provided scripts that will help you identify them. 
We would also like to point out that if you run into difficulties, there are automated ways to work with, identify, and fix invalid objects, provided by Devart and other companies. Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [find invalid objects](https://blog.devart.com/tag/find-invalid-objects) [Oracle](https://blog.devart.com/tag/oracle) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server 
Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025"} {"url": "https://blog.devart.com/foreign-key-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Understanding and Implementing Foreign Keys in SQL Server By [Nataly Smith](https://blog.devart.com/author/nataly-smith) November 21, 2023 [0](https://blog.devart.com/foreign-key-in-sql-server.html#respond) 2366 Managing foreign keys in SQL Server is a critical aspect of database design and maintenance. This article provides a comprehensive guide, covering everything from the creation of foreign keys to effective troubleshooting of any potential issues that may arise during the process. Skillfully handling foreign keys is fundamental to ensuring data integrity and maintaining a well-structured database environment. Throughout this article, we will use [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/studio-sql.html) as our testing ground for all the examples, demonstrating practical implementations of each concept. Contents Understanding foreign keys Creating a foreign key Working with foreign keys Troubleshooting foreign key issues Checking foreign key relationships Conclusion Understanding foreign keys Foreign keys serve as a crucial mechanism when it comes to establishing and maintaining relationships between tables in relational databases. Their primary purpose is to enforce referential integrity, ensuring that every value in the referencing (child) table corresponds to existing data in the referenced (parent) one. In essence, a foreign key acts as a bridge, connecting related records across different tables within a database. 
A foreign key constraint (also called a referential integrity constraint) designates a column as the foreign key and establishes a relationship between that foreign key and a specified primary or unique key, called the referenced key. Note: Both the foreign and the referenced keys can be located in the same table or view. Multiple foreign keys can be defined in a single table or view. A single column can be part of more than just one foreign key. In SQL, the syntax for defining a foreign key involves specifying the field or fields to be linked and referencing the corresponding primary key in the parent table. It is a crucial component of Data Definition Language (DDL) used for table creation and modification. This relationship establishes a hierarchy between tables, designating one as the parent (containing the primary key) and the other as the child (containing the foreign key). This hierarchy forms the basis for executing operations like updates and deletions in a controlled and structured manner, preserving the integrity of the database. Creating a foreign key Now, we are going to venture forth to the practical aspect of handling foreign keys in the wild. dbForge Studio for SQL Server will help us tame this beast. The first task on today’s agenda is to create a foreign key. While creating a new table in your database, you can also create the necessary foreign keys along the way: 1. In Database Explorer , connect to the instance of the database engine that contains the database to be modified. 2. Expand the connection node and then expand the database that will contain the new table. 3. Right-click the Tables node of your database and then click New Table . The Table Editor opens. 4. In the Name text box, type the table name. Optionally, add a description. By default, the table is created in the dbo schema. To specify a different schema for the table, select the appropriate schema from the Schema drop-down list. 5. 
Type column names, select data types, and choose whether to allow nulls for each column, as shown in the following illustration. Optionally, add a description. 6. To specify a column as a primary key, select the check box next to the column name. 7. Switch to the Constraints tab. Then, right-click the empty space within the Constraints section and select Add foreign key . 8. In the Name field, enter the name of the constraint. Optionally, add a description and select delete and update rules if needed. 9. Select Constraint Columns as well as the corresponding Referenced Schema , Table , and Columns . In addition, you can select whether you would like the foreign key to be Enabled , Not for Replication , or to Check Existing Data . 10. Click Apply Changes . Note: For you to be able to create a foreign key, the referenced table should have a unique index. Otherwise, dbForge Studio will prompt you to create it. Click Yes in the dialog that appears, and the unique index will be added. Working with foreign keys Our next step in today’s journey will be addressing the existing foreign keys, determining whether to add new ones, remove a few, adjust data in tables with foreign keys, etc. Add foreign key in SQL Server 1. Expand the table you need, right-click the Keys folder, and choose New Foreign Key from the menu. 2. Add the required columns, select the referenced table and referenced constraint, and click Apply Changes . You can also switch to the Constraints tab and create the constraint using the shortcut menu. Drop foreign key in SQL Server In order to drop a key, navigate to the constraint in Database Explorer and select Delete from the shortcut menu. Alternatively, open the table that owns the key, switch to the Constraints tab, and choose Remove Constraint from its shortcut menu. How to insert data into tables with foreign keys Working with data in tables that have foreign keys involves a few steps to ensure data integrity. 
Before making any changes, it is important to understand the relationships between the tables. Specifically, you should know which tables have foreign key constraints that are being referenced by other tables, as well as which tables are acting as references to other tables. Inserting data into a table with foreign keys does not require any particular changes in the INSERT syntax: INSERT INTO table_name (column1, column2, ...)\nVALUES (value1, value2, ...); However, there are a couple of tips and tricks to keep in mind: If you are dealing with multiple tables with foreign keys, start by inserting data into the referenced table(s) first. This ensures that the data you are referencing does, in fact, exist. Having inserted the necessary data in the referenced table(s), you can proceed to insert data into the referencing (child) one(s). Make sure that the data you are inserting complies with any foreign key constraints. The value you are inserting into the child table must exist in the referenced table. If the data you are inserting violates any foreign key constraints, your actions will result in an error. In this case, you will need to address this by either providing valid data or modifying the existing data in the referenced table. After inserting data, it is good practice to double-check that the foreign key relationships are still intact. You can do this by running appropriate SELECT queries or by using tools provided by your database management system. How to update data in tables with foreign keys Keeping in mind the tips above, you can update the tables with foreign keys using regular UPDATE syntax: UPDATE table_name\nSET column1 = value1, column2 = value2, ...\nWHERE condition; How to delete data from tables with foreign keys When deleting data from tables with foreign keys, it is important to follow the same principles as when inserting or updating data. 
This means considering the relationships between tables and ensuring that any foreign key constraints are taken into account.

Note: As mentioned earlier in this article, dbForge Studio for SQL Server allows you to select delete and update rules, which can automatically propagate changes to related tables. The options are:

- NO ACTION
- CASCADE
- SET NULL
- SET DEFAULT

Troubleshooting foreign key issues

Even with careful database management, encountering errors, including those involving foreign keys, is inevitable. Now, we are going to address some of the most prevalent errors associated with foreign keys and outline effective solutions, along with best practices for optimal database management.

| Description | Resolution |
| --- | --- |
| Foreign key mismatch | When creating a foreign key, make sure that the data types of the parent and referenced columns match (for example, both should be INT). |
| Dangling foreign keys | A dangling foreign key links to a nonexistent table or column, which can lead to data integrity issues. Avoid this by creating referenced tables or columns before creating foreign key constraints. |
| Missing foreign key indexes | SQL Server does not index foreign key columns automatically, so create these indexes manually to optimize join queries between parent and child tables. |
| The INSERT statement conflicted with the FOREIGN KEY constraint | Ensure that the values being inserted into the child table exist in the parent table. Alternatively, you can temporarily disable the constraint, insert the values, and then re-enable the constraint. |
| Cannot add foreign key constraint | Verify that the referenced column in the parent table has a corresponding primary key or unique constraint. Also, ensure that data types match between the child and parent tables. |
| Cannot add or update a child row: a foreign key constraint fails | Make sure that the value being updated or inserted in the child table has a corresponding value in the parent table. |
| Introducing FOREIGN KEY constraint may cause cycles or multiple cascade paths | SQL Server raises this error when a new constraint would create more than one cascade path. Change the delete/update rule on one of the paths to NO ACTION, or implement the cascading behavior with triggers instead. |

Checking foreign key relationships

To verify foreign key relationships in your database, you can query the system catalog views using the queries we are going to look into in this section of our article.

How to use system tables to check foreign key relationships

For a bit of hands-on experience, let us use the system tables to check foreign key relationships:

```sql
SELECT
    FK.name  AS FK_Name,
    TP.name  AS Parent_Table,
    RFK.name AS Referenced_Table
FROM sys.foreign_keys AS FK
INNER JOIN sys.tables AS TP  ON FK.parent_object_id = TP.object_id
INNER JOIN sys.tables AS RFK ON FK.referenced_object_id = RFK.object_id;
```

On executing the query above, you will get the names of the foreign keys, parent tables, and referenced tables.

Examples of queries to check foreign key relationships

To list all foreign keys with their details:

```sql
SELECT
    fk.name  AS FK_Name,
    tp.name  AS Parent_Table,
    rfk.name AS Referenced_Table,
    rkc.name AS Referenced_Column
FROM sys.foreign_keys AS fk
INNER JOIN sys.tables AS tp  ON fk.parent_object_id = tp.object_id
INNER JOIN sys.tables AS rfk ON fk.referenced_object_id = rfk.object_id
INNER JOIN sys.foreign_key_columns AS fkc ON fkc.constraint_object_id = fk.object_id
INNER JOIN sys.columns AS rkc ON fkc.referenced_column_id = rkc.column_id
    AND fkc.referenced_object_id = rkc.object_id;
```

As a result, you will see the list of all foreign keys along with the respective parent tables, as well as the tables and columns they reference.

To find foreign keys in a specific table:

```sql
SELECT
    fk.name  AS FK_Name,
    tp.name  AS Parent_Table,
    rfk.name AS Referenced_Table,
    rkc.name AS Referenced_Column
FROM sys.foreign_keys AS fk
INNER JOIN sys.tables AS tp  ON fk.parent_object_id = tp.object_id
INNER JOIN sys.tables AS rfk ON fk.referenced_object_id = rfk.object_id
INNER JOIN sys.foreign_key_columns AS fkc ON fkc.constraint_object_id = fk.object_id
INNER JOIN sys.columns AS rkc ON fkc.referenced_column_id = rkc.column_id
    AND fkc.referenced_object_id = rkc.object_id
WHERE tp.name = 'YourTableName'; -- Replace with the actual table name
```

The query output will contain the same set of columns as the previous one (foreign keys, parent tables, referenced tables, and their corresponding columns), but for a single table only.

To find tables that have foreign key references to a specific table:

```sql
SELECT
    tp.name  AS Referencing_Table,
    rfk.name AS Referenced_Table,
    fk.name  AS FK_Name
FROM sys.foreign_keys AS fk
INNER JOIN sys.tables AS tp  ON fk.parent_object_id = tp.object_id
INNER JOIN sys.tables AS rfk ON fk.referenced_object_id = rfk.object_id
WHERE rfk.name = 'YourReferencedTableName'; -- Replace with the actual referenced table name
```

This query will help you identify which tables in your database are connected to a specific referenced table through foreign key relationships.

Conclusion

Wrapping up everything we have just covered, understanding foreign keys is fundamental to maintaining data integrity in relational databases. They establish important relationships between tables, ensuring consistency and accuracy of data. Tools like dbForge Studio for SQL Server provide a user-friendly interface to manage foreign keys effortlessly. [Download a 30-day free trial](https://www.devart.com/dbforge/sql/studio/download.html) today and experience firsthand how dbForge streamlines the process of working with foreign keys, enhancing the efficiency and reliability of your database operations.
Tags: [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server), [Foreign Key](https://blog.devart.com/tag/foreign-key), [SQL Server](https://blog.devart.com/tag/sql-server). By [Nataly Smith](https://blog.devart.com/author/nataly-smith), dbForge Team.

From Data Chaos to Analytics Bliss: How
dbForge SQL Tools Contributed to a Small Data Analytics Company's Success
By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander), March 9, 2023

In today's data-driven world, having the right tools can make all the difference for small businesses seeking to gain a competitive edge. This was certainly the case for a small data analytics company that struggled to keep up with larger competitors until it adopted dbForge SQL Tools by Devart. A startup company, whose CEO prefers to remain anonymous, recently shared its remarkable success story involving SQL Tools. This story will delve into the reasons behind their choice, the features they find most useful, and how the dbForge SQL Tools bundle has helped them overcome various challenges. By sharing their experience with you, we hope to provide insight into how SQL Tools can be a game-changer for startups and small businesses in the data analytics industry.

Contents
- Struggle to stay competitive: A startup's pre-SQL Tools story
- From many options to one: Why SQL Tools by Devart was the clear winner
- Must-haves: What the startup was looking for in the database development tool
- Selecting SQL Tools among competitors
- Customer's top picks: Key features that spark excitement
- Impressive results achieved

Struggle to stay competitive: A startup's pre-SQL Tools story

Our customer is a Director of Data and Analytics at a small startup company, where he oversees a team of specialists who work with data. However, the team also handles data warehousing, which presents additional challenges. Given their size and budget, the company needed a tool that was affordable yet still offered automation capabilities similar to those available from larger competitors like Redgate.
With the economic recession looming, the company was facing a tough road ahead, and they knew they needed to find a solution to help them stay competitive.

From many options to one: Why SQL Tools by Devart was the clear winner

Let's explore the specific requirements the startup had when searching for a database development tool, the tasks it needed to solve, and the shortlisted options the team considered. Finally, we'll delve into why they ultimately chose SQL Tools over the alternatives.

Must-haves: Essential features required

- Advanced IntelliSense: The company required more robust IntelliSense capabilities than those available in SSMS, as the existing functionality did not meet the demands of the data analytics team. The limitations of Management Studio were frustrating and hindered productivity because the team worked mainly with large databases and complex queries. As a result, those limitations forced the company to seek out alternatives to SSMS.
- Powerful Data and Schema Compare functionality: The team needed a reliable solution to help them identify and resolve discrepancies between development, staging, and production environments, ensuring consistency and accuracy of data across environments.
- Sophisticated Data Import and Export: This feature is crucial for a team that consists mainly of analysts and database developers, as they need to regularly transfer data between different systems and formats. The team's specialists move large amounts of data from one database or file format to another; extract, transform, and load data from various sources into a data warehouse or analytical database; and export data to different formats for use in other applications.
- Reliable source control: The company needed a tool to help them track and manage changes in data over time, as well as roll back to previous versions if necessary. Their developers and analysts needed to guarantee that those changes would be well-documented, rigorously tested, and efficiently deployed, with a focus on minimizing the chance of errors or data loss.
- Modest price: The price of the tool was an important consideration for the company, especially given its size and the possibility of a future economic downturn. Investing in good-quality software at a fair price could benefit the company in the long term and help it stay competitive.

Selecting SQL Tools among competitors

When searching for the right tool, the company CEO and Data Analytics Team Lead aimed to strike a balance between features and cost. However, they found the Redgate and Apex solutions to be a bit overpriced. Therefore, the company turned to their competitors and came across [dbForge reviews on the Gartner platform](https://www.gartner.com/reviews/market/integrated-development-environment-ide-software/vendor/devart/product/dbforge). The company CEO recalled having worked with Devart products earlier, namely their Excel add-ins, and being satisfied with the quality and support. He did some quick research on the functionality of SQL Tools and suggested that the team evaluate the solution during the 30-day trial period. The outcome was stunning: SQL Tools proved to be an ideal choice. As a result, they decided to purchase the tool pack licenses for further use.

Customer's top picks: Key features that spark excitement

[Data Compare & Sync](https://www.devart.com/dbforge/sql/datacompare/)

The Data Compare tool that comes as part of the SQL Tools bundle allows quick and efficient comparison of data between different sources, enabling the customer's analytics team to promptly identify differences and inconsistencies. By using this tool, the team can ensure that they are working with accurate and up-to-date information, ultimately leading to better analysis and decision-making.
Additionally, both the Data Compare and Schema Compare functionality can be accessed and used from the command line, making it possible to automate and schedule multiple routine tasks.

[Data Pump](https://www.devart.com/dbforge/sql/data-pump/)

Data Pump proved to be an essential tool for the data analytics team, as it enabled quick and efficient transfer of large amounts of data. With its ability to move data between different systems and platforms, the tool saved valuable time and effort for the team, allowing them to focus on analyzing the data rather than worrying about the logistics of moving it. This tool is especially useful when working with big data or handling data migration projects.

[Source Control](https://www.devart.com/dbforge/sql/source-control/)

The integrated Source Control allows all teams to manage changes in data and code in a collaborative and organized manner. By using Source Control, they can track changes, maintain version history, and easily revert to previous versions if necessary. Source Control also helps make sure that all team members are working on the latest version of data and code, leading to better collaboration and increased productivity.

Impressive results achieved

2x-3x increase in data transfer speed

The company's specialists were able to achieve a 2x-3x speedup in data transfer by utilizing the powerful data export and import utilities included in the SQL Tools bundle. Thanks to these tools, data migration between data sources and databases became significantly faster and more efficient.

75% reduction in code errors and typos

The IntelliSense-style autocompletion and syntax check helped the database development and data analytics teams reduce errors and typos by providing context-sensitive suggestions and checking syntax while typing code. This functionality helps quickly identify and fix errors before they become an issue, saving lots of time and effort in the long run.
3x-4x boost in database migration speed

Thanks to the Schema Compare and Data Compare tools, the teams were able to dramatically increase the speed and accuracy of database migrations, thereby freeing up time for more extensive data analysis.

[Test SQL Tools free for 30 days – Download now!](https://www.devart.com/dbforge/sql/sql-tools/download.html)

The success story is amazing, right? Are you looking for the perfect way to manage your databases too? Look no further: SQL Tools is the answer. With this toolset, you'll get access to all the features you need to manage and optimize your databases. Don't wait: download SQL Tools for a free 30-day trial and start getting results today!

Tags: [sql tools](https://blog.devart.com/tag/sql-tools). By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander). Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company's blog and website.
Functionality Improvements in the New Version of LinqConnect
By [dotConnect Team](https://blog.devart.com/author/dotconnect), March 26, 2012

Devart is glad to announce a beta version of the completely reworked [LinqConnect ORM](http://devart.com/linqconnect/) with better performance and a large number of improvements. The new 4.0 version of LinqConnect marks a new milestone in LinqConnect development. The renewed LinqConnect demonstrates a new degree of performance and stability: it can now be used in the most complicated use cases and supports submitting very complex object graphs. Starting from version 4.0, LinqConnect uses only its own classes. It no longer references the System.Data.Linq assembly, so this assembly is not loaded with applications that use LinqConnect. However, you don't need to worry about LINQ to SQL compatibility: the public interface of our classes is completely compatible with the public interface of the Microsoft LINQ to SQL classes. Moreover, we provide the Upgrade Wizard for easy upgrading of your LinqConnect projects to the new version of LinqConnect in a few clicks.

Cache Improvements

The LinqConnect cache now has two different modes. The first, classical mode is recommended when DataContext is used as a unit of work and is a short-lived object (Microsoft recommends using the LINQ to SQL DataContext in this way). In this mode, DataContext holds hard references to all the entity objects, and they are released when the DataContext itself is released.
The second, self-clearing mode is intended for applications where DataContext is treated as a long-lived object rather than a unit of work. In this mode, weak references to entity objects are used: when you no longer use an object, it is removed from the cache and is no longer tracked. However, this mode requires more memory and resources for tracking.

Submit Functionality

In version 4.0, we redesigned the Submit functionality. The main improvements are optimizations to the tracking and submitting of complex object graphs with parent-child relation chains. Submitting objects with many-to-many associations was also improved: it now requires fewer database calls. There is no database call when adding an object to or deleting it from a collection, and when deleting an entity with an associated collection, a single delete operation is performed for the associated table. The delete rule behavior was changed: this operation is now executed by the database, and LinqConnect removes the affected objects from the cache afterwards.

Stored Procedure Mapping Improvements

The new version of LinqConnect allows you to map stored procedure OUT parameters that return cursors to enumerable parameters of the corresponding DataContext method. You no longer need to mess with the ResultType attribute and IMultipleResults when working with stored procedures that return cursors in Oracle and PostgreSQL.

Query Compilation Improvements

The query compilation algorithm was completely reworked to use ILGenerator. Compilation is now performed much faster and requires less RAM.

Materialization Improvements

In the new LinqConnect version, we have greatly optimized the entity materialization algorithms. Materialization of entities with associated entities is now 1.5 times faster, and materialization of entities returned by stored procedures is 4 times faster.

Compiled Query Cache Improvements

Compiled queries and materialization functions are now stored separately in the cache.
This increases performance when executing different queries that return results of the same type. [Download LinqConnect](http://devart.com/linqconnect/download.html) and try the improvements now!

Tags: [linqconnect](https://blog.devart.com/tag/linqconnect). By the [dotConnect Team](https://blog.devart.com/author/dotconnect) ([https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/)). The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
Using Bitmasks for Efficient Data Filtering
By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander), September 25, 2023

In the competitive world of database management, every bit (literally) counts when it comes to efficiency. That's where bitmasking comes into play: a resourceful technique that offers a smart solution for SQL Server administrators and developers. In this article, we explore the concept of bitmasking, a useful technique that offers efficient ways to manage and manipulate data. Covering its application in various scenarios such as checking bit statuses, modifying individual bits, and comparing bitmasks, the article provides a comprehensive guide suitable for both beginners and seasoned professionals. By the end of the article, readers will gain valuable insights into how bitmasking can optimize SQL Server data operations.

Contents
- Understanding bitmasking
- What is a bitmask in SQL Server?
- Determining if a particular bit is enabled
- Determining if any bit is enabled
- Determining if all bits are enabled
- Turning ON a particular bit
- Turning OFF a particular bit
- Enabling/disabling a group of bits
- Using bitmasks for permission control
- How to compare two bitmasks in SQL Server
- Leveraging SQL Complete for querying SQL Server data
- Conclusion

Understanding bitmasking

Bitmasking is a powerful technique that allows for efficient storage and manipulation of multiple Boolean values within a single numerical variable. By utilizing bitwise operations, bitmasking enables quick checks and updates of these values, making it particularly useful in databases and other systems where computational efficiency is a priority.

What is a bitmask in SQL Server?

A bitmask in SQL Server is essentially an integer value used as a series of on-off switches, with each bit in the integer representing a particular setting, flag, or condition. The technique of using bitmasks allows you to efficiently store and manipulate multiple true/false (Boolean) pieces of information in a single integer column. Bitwise operations like AND, OR, and NOT are used to modify or query these bitmasks.

For example, consider an application where you have user permissions such as read (0001), write (0010), and execute (0100). Instead of using three different Boolean columns to represent these permissions, you could use one integer column with each bit representing a particular permission. Here's how you could use a bitmask in this context:

- Read permission could be represented by the integer value 1, which is 0001 in binary.
- Write permission could be represented by the integer value 2, which is 0010 in binary.
- Execute permission could be represented by the integer value 4, which is 0100 in binary.

A user with read and execute permissions would have a value of 0101 in binary, or 5 in integer form. You can use bitwise operators to check for permissions.
For example, to check if a user has read permission:

```sql
DECLARE @userPermission INT = 5 -- (0101)
DECLARE @readPermission INT = 1 -- (0001)

IF (@userPermission & @readPermission) = @readPermission
    PRINT 'User has read permission'
```

Here, the & (AND) operator compares the individual bits of @userPermission and @readPermission. If the read bit is set in @userPermission, then the user has read permission. By using bitmasks, you can efficiently store and query multiple settings or flags using bitwise operations, reducing the space needed in the database and often speeding up queries.

Determining if a particular bit is enabled

In SQL Server, you can use bitwise operations to check whether particular permissions are granted based on the binary representation of integer values. Assume you have a column named permissions that stores the permission settings for each user, encoded as integers. To check if a user has a specific permission, you can use the bitwise AND (&) operator as follows. Let's look at how to verify whether the Read permission is granted.

Check for the Read permission

```sql
DECLARE @permissions INT = 7; -- 7 in binary is 0111, which means Read, Write, and Execute permissions are enabled.

IF (@permissions & 1) = 1 -- 0001 in binary
    PRINT 'Read permission is enabled';
ELSE
    PRINT 'Read permission is not enabled';
```

In this example, the variable @permissions is subjected to a bitwise AND operation with the numerical value representing the target permission. If the result matches the value of the target permission, it indicates that the permission is enabled.

Determining if any bit is enabled

In SQL Server, you can determine whether any bit is enabled in an integer by checking that the integer is not zero. If an integer is not zero, then at least one bit must be enabled (set to 1).
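For comparison outside SQL, these checks map directly onto the bitwise operators of any general-purpose language; a quick Python sketch using the article's read/write/execute values (the variable names are ours):

```python
READ, WRITE, EXECUTE = 1, 2, 4  # 0001, 0010, 0100 in binary

permissions = 5  # 0101: Read + Execute

# Particular bit: (value & flag) == flag
print((permissions & READ) == READ)          # True
# Any bit at all: the integer is simply non-zero
print(permissions != 0)                      # True
# Any bit within a mask: AND with the mask first
print((permissions & (READ | WRITE)) != 0)   # True (READ overlaps the mask)
```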
Here's a simple example:

```sql
DECLARE @value INT = 5; -- 5 in binary is 0101

IF @value != 0
    PRINT 'At least one bit is enabled';
ELSE
    PRINT 'No bits are enabled';
```

If you only care about specific bits, you can mask out the others and then check the result:

```sql
DECLARE @value INT = 5; -- 5 in binary is 0101
DECLARE @mask INT = 3;  -- 3 in binary is 0011, we only care about the last two bits

IF (@value & @mask) != 0
    PRINT 'At least one of the specified bits is enabled';
ELSE
    PRINT 'None of the specified bits are enabled';
```

Determining if all bits are enabled

To determine whether all bits of interest in an integer are enabled (set to 1) in SQL Server, compare the integer value to a bitmask that has all of those bits enabled.

Check if all specific bits are enabled

Let's say you are interested in checking whether the first three bits (from the right, 0-based) are enabled. The bitmask for these would be 7 (in binary, 0111). You can check whether all these bits are enabled by using the bitwise AND (&) operator and comparing the result to the bitmask:

```sql
DECLARE @value INT = 7; -- All first three bits are enabled (0111)
DECLARE @mask INT = 7;  -- Bitmask for the first three bits (0111)

IF (@value & @mask) = @mask
    PRINT 'All specified bits are enabled';
ELSE
    PRINT 'Not all specified bits are enabled';
```

Turning ON a particular bit

To turn on (set) a particular bit in an integer value in SQL Server, use the bitwise OR (|) operator. Here's how you can turn on specific bits for the write permission, based on our previous example.

Turn ON the Write permission

To turn on the write permission, you can perform a bitwise OR with 2, which is 0010 in binary.
```sql
DECLARE @permissions INT = 0; -- Starting with no permissions (0000 in binary).

-- Turn on write permission
SET @permissions = @permissions | 2; -- 2 is 0010 in binary

PRINT @permissions; -- Will output 2
```

Turn ON multiple permissions

```sql
DECLARE @permissions INT = 0; -- Starting with no permissions (0000 in binary).

-- Turn on read and write permissions
SET @permissions = @permissions | 1 | 2; -- 1 is 0001 and 2 is 0010 in binary

PRINT @permissions; -- Will output 3 (0011 in binary)
```

In each example, the variable @permissions undergoes a bitwise OR operation with the integer value corresponding to the permission you want to activate. The OR operation ensures that the specific bit for the permission is turned on while leaving the other bits unchanged.

Turning OFF a particular bit

To turn off (clear) a particular bit in an integer value in SQL Server, use the bitwise AND (&) operator together with the bitwise NOT (~) operator. Here's how you can turn off the bit for the execute permission.

Turn OFF the Execute permission

To turn off the execute permission, perform a bitwise AND with the bitwise NOT of 4, which is 0100 in binary.

```sql
DECLARE @permissions INT = 7; -- Starting with all permissions (0111 in binary).

-- Turn off execute permission
SET @permissions = @permissions & ~4; -- ~4 clears the 0100 bit (1011 in our 4-bit view)

PRINT @permissions; -- Will output 3
```

In this case, the @permissions variable is modified using a bitwise AND operation combined with the bitwise NOT of the specific permission value we want to turn off. This technique ensures that only the targeted bit is cleared, while all other bits remain unaffected.

Enabling/disabling a group of bits

Enabling or disabling a group of bits in SQL Server is similar to working with individual bits, except that the bitmask covers multiple bits.
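The single-bit idioms above carry over unchanged to other languages; condensed in Python (a sketch reusing the same flag values):

```python
READ, WRITE, EXECUTE = 1, 2, 4  # 0001, 0010, 0100 in binary

permissions = 0
permissions |= WRITE     # turn a bit ON with OR            -> 0010
permissions |= READ      # turn another bit ON              -> 0011
permissions = 7          # 0111: all three permissions
permissions &= ~EXECUTE  # turn a bit OFF with AND NOT      -> 0011
print(permissions)  # 3
```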
Here’s how you can do it:

Enable a group of bits

To enable a group of bits, use the bitwise OR (|) operator with a mask that has the desired bits set to 1. For example, to enable both the read (1, 0001 in binary) and write (2, 0010 in binary) permissions:

DECLARE @permissions INT = 0; -- Start with no permissions (0000 in binary)
DECLARE @mask INT = 1 | 2;    -- Create a mask for read and write permissions (0001 | 0010 = 0011)

-- Enable the bits
SET @permissions = @permissions | @mask;

PRINT @permissions; -- Will output 3 (0011 in binary)

Disable a group of bits

To disable a group of bits, use the bitwise AND (&) operator with a mask that has the desired bits set to 0 (obtained with bitwise NOT ~). For example, to disable both read and write permissions if they are enabled:

DECLARE @permissions INT = 7; -- Start with all permissions (0111 in binary)
DECLARE @mask INT = ~(1 | 2); -- Mask to disable read and write permissions (~(0001 | 0010) = ~0011, whose last four bits are 1100)

-- Disable the bits
SET @permissions = @permissions & @mask;

PRINT @permissions; -- Will output 4 (0100 in binary; only the execute permission remains)

Using bitmasks for permission control

Utilizing bitmasks for permission control may seem complex, which leads to a valid question: why not use individual fields such as ExecuteAllowed, ReadAllowed, WriteAllowed, and so on, instead? With separate fields for permissions, the filtering conditions can become relatively complex, particularly when the specific permissions to be checked are not known in advance. Let’s say you want to find all users who have either the ReadAllowed or the WriteAllowed permission.
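Before moving on to the query examples, the group enable/disable logic above can be verified with equivalent bitwise operations in Python (a quick sketch using the same READ=1, WRITE=2, EXECUTE=4 scheme as the article):

```python
READ, WRITE, EXECUTE = 1, 2, 4     # 0001, 0010, 0100

# Enable a group of bits: OR with a mask that has those bits set.
permissions = 0                    # no permissions (0000)
mask = READ | WRITE                # 0011
permissions |= mask
print(bin(permissions))            # 0b11 -> read and write enabled

# Disable a group of bits: AND with the complement of the mask.
permissions = READ | WRITE | EXECUTE   # 0111, all permissions
permissions &= ~(READ | WRITE)         # clear read and write in one step
print(bin(permissions))                # 0b100 -> only execute remains
```

Note that ~(READ | WRITE) flips every bit of the combined mask, so a single AND clears the whole group while leaving all other bits intact.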
Your query might look something like this if you’re using separate fields:

SELECT * FROM Users WHERE (ReadAllowed = 1 OR WriteAllowed = 1);

(BIT columns in SQL Server are compared against 1, since T-SQL has no TRUE literal.) However, if tomorrow you need to filter users who have the ExecuteAllowed permission in addition to the other conditions, you would need to update the query to:

SELECT * FROM Users WHERE (ReadAllowed = 1 OR WriteAllowed = 1 OR ExecuteAllowed = 1);

As you can see, the condition becomes longer and more complicated each time a new permission type is added to the filter. Over time, this makes the code harder to manage and debug, especially when multiple permissions have to be checked in a single query. If we employ bitmasks for permission control, the process becomes significantly more streamlined: you only need to set the appropriate bits in a variable, and the search code remains the same regardless of which specific permissions you are interested in.

Evaluating the use of bitmasks for managing permissions

Pros:
- Efficiency: Storing permissions as bitmasks is extremely memory-efficient; multiple permissions fit in a single integer.
- Simplicity: Using bitwise operations, you can easily modify or check multiple permissions at once.
- Flexibility: It is straightforward to add new permissions without altering the existing data structure.
- Unified search code: As mentioned, if permissions are stored as bitmasks, the search code stays consistent even as new permissions are added.

Cons:
- Readability: For those unfamiliar with bitwise operations, bitmasks can be less intuitive to read or debug.
- Database support: Some databases do not natively support bitwise operations, potentially making queries more complex. SQL Server does provide this capability.

How to compare two bitmasks in SQL Server

Imagine we have two interconnected database tables: one named Users and another called Roles.
Within the Users table, each user’s role is represented as a bitmask, allowing efficient storage and quick retrieval of role-related information. We want to identify all users whose roles match a bitmask of 5, which represents both the ‘teacher’ and ‘admin’ roles. Consequently, the results should include Tom, Max, and Peter, but not Helen. How can we achieve that? You can use the following query to find users who have any of the roles represented in a bitmask of 5 (Admin and Teacher in this case):

DECLARE @roleMask INT = 5; -- Role bitmask for teacher and admin

-- Select users that have any of the roles in the bitmask
SELECT *
FROM Users
WHERE (Roles & @roleMask) > 0;

This selects all rows where the Roles value has at least one bit in common with the @roleMask value.

Leveraging SQL Complete for querying SQL Server data

In the examples above, we used [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) as our tool of choice for writing and formatting code. SQL Complete is a robust extension for SQL Server Management Studio (SSMS) and Visual Studio that offers a rich set of features to streamline the coding process. From intelligent auto-completion to advanced code formatting, SQL Complete enhances productivity and ensures code quality, making it an indispensable tool for database professionals. Whether you’re a novice or an experienced developer, SQL Complete elevates your SQL coding experience to a new level of efficiency and precision.

Conclusion

In summary, bitmasking in SQL Server is an invaluable technique for efficiently managing multiple Boolean values within a single numeric value. Throughout this article, we have walked through the basics of what a bitmask is, delved into various ways to manipulate and interpret individual bits or groups of bits, and explored how to compare bitmasks.
With applications ranging from permission settings to feature toggling, bitmasking offers a level of efficiency that is difficult to match with other techniques. As we’ve seen, understanding and effectively implementing bitmasking can have a significant impact on the optimization of your SQL Server databases. Whether you’re a beginner or an experienced database professional, incorporating bitmasking into your toolkit can offer new dimensions of flexibility and efficiency. We have one more valuable tool for you that can revolutionize your SQL Server experience. If you’re looking to optimize your coding workflow, we highly recommend giving SQL Complete a try. The best part? You can [download and try SQL Complete absolutely free](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) during a 14-day trial period. Don’t miss this opportunity to elevate your SQL coding to the next level. WANT TO LEARN MORE? If you’re keen to expand your understanding further, we invite you to read our in-depth article on [bit manipulation functions](https://blog.devart.com/bit-manipulation-functions-in-sql-server.html) . Tags [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company’s blog and website. 
"} {"url": "https://blog.devart.com/future-devops-latest-trends.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) The Future of Database DevOps: Latest Trends and Challenges for 2025 By [dbForge Team](https://blog.devart.com/author/dbforge) March 17, 2022 Today we cannot imagine software development and deployment processes without applying the DevOps methodology. This methodology is focused on establishing the interaction between developers and system administrators in a company.
Thereby, the DevOps practice creates a smooth development cycle and speeds up the delivery of a software product. The DevOps movement originated in 2008, when Patrick Debois attended the Agile conference and, together with Andrew Shafer, created the Agile Systems Administration Group. Since then, progressive DevOps tools have been actively used in most companies that need to increase the efficiency of software development and operation. Even though DevOps principles are becoming a major part of the software delivery process, uniting database development with the DevOps process remains a challenge.

Database DevOps challenges and possible future changes

The Database DevOps philosophy aims to improve and simplify database management by automating parts of the database lifecycle. This process is highly effective, but it is also complicated, as you may face the following issues:

Continuous deployment. In a DevOps environment, continuous deployment of software is achieved with a CI/CD pipeline. When there are new changes in the application code, they simply replace the old ones. Unfortunately, this approach doesn’t work with databases: to add schema changes, for instance, there must be a strategy for migrating old data to the new structure. This makes continuous deployment of databases much more difficult.

Scalability. An application is highly scalable in a microservice architecture. Databases, by contrast, suffer significant performance degradation as they grow in size.

Rollback. The database management process offers no easy way to go back, fix the database source code, test it, and then redeploy it when issues appear in the production environment. This can break a working database.

Incompatibility between relational databases and the microservice architecture. Microservices follow a shared-nothing architecture.
The reason for this is to remove any dependencies on other microservices and to protect each one from being affected by a failed microservice. As you can see, automating database development and deployment is not as simple and seamless as it is for applications. But IT technology is progressing very fast, and we are sure that within a few years we will see great changes in this process that cover all existing needs. We expect these changes to affect:

Continuous delivery: a database pipeline would be integrated into an application pipeline, even if several applications or microservices share a database. Database changes would thus stay synchronized with application changes.

Cloud-native database solutions: these solutions would integrate with DevOps CI/CD pipelines through APIs and cloud-native infrastructure.

Uniform database deployment: it would be possible to deploy databases to cloud environments, Docker containers, Kubernetes, and all operating systems in the same way.

Scalability: databases would scale dynamically based on changes in workloads and user demand.

Source control: database code would be maintained under version control, allowing database changes to be reviewed just like application changes.

The biggest challenge is that nobody can tell exactly how long it will take until we can manage databases without restrictions from the DevOps perspective. So how can you overcome these challenges and automate database deployment right now? That’s the question we have the answer to. DevOps Automation for SQL Server can really improve your team’s Database DevOps practices. The tool is included both in the SQL Tools pack and in dbForge Studio for SQL Server.
With [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/database-devops.html), the database lifecycle goes through the five stages displayed in the image below. At the Development stage, database developers modify the database schema. The Version control stage is about committing the changes to a version control system, for example, Git. After that, the Build stage starts: the database is compiled from .sql files on the SQL Server, producing the desired database. That isn’t the end of the process; next comes the Unit test stage. This stage is essential, as it helps ensure that the database still operates as expected after the changes have been applied. The testing is performed with SQL unit tests. The Publish stage brings the CI process to its logical conclusion: you publish a NuGet package or ZIP so it can be shared. Optionally, you can generate the database documentation and include it in the package. Finally, the NuGet package or ZIP is published to a NuGet repository or added to a separate folder.

Conclusion

DevOps principles are a popular trend in many companies. More and more developers strive to automate manual operations in the deployment process of a software product. Minimizing human intervention in product delivery allows for faster releases with fewer failures. But as we can see, things are not so easy when it comes to deploying databases. There are still some blockers in the database development process, most of which can be removed by Devart products. To take your database deployment to a new level, try the 30-day trial versions of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and [DevOps Automation for SQL Server](https://www.devart.com/dbforge/sql/database-devops/download.html).
Tags [Database DevOps](https://blog.devart.com/tag/database-devops) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [devops](https://blog.devart.com/tag/devops) [DevOps Automation](https://blog.devart.com/tag/devops-automation) [DevOps challenges](https://blog.devart.com/tag/devops-challenges) [DevOps trends](https://blog.devart.com/tag/devops-trends) [Future of DevOps](https://blog.devart.com/tag/future-of-devops) [sql tools](https://blog.devart.com/tag/sql-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/general-review-of-microsoft-sql-server-management-studio-ssms.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) General Review of Microsoft SQL Server Management Studio (SSMS) By [dbForge Team](https://blog.devart.com/author/dbforge) March 2, 2021 SQL Server Management Studio, or SSMS, is perhaps the most famous product for managing Microsoft SQL Server and any SQL infrastructure. Many experts consider it the most useful management tool of all. SSMS is the default choice for all tasks related to Microsoft SQL Server. It allows users not only to create databases and their elements but also to write SQL queries of any complexity faster and more easily. A set of powerful, multi-featured graphical tools and script editors delivers high performance across all database-related routines. Moreover, all these tools and features are available out of the box. Given that the solution is free of charge and easy to use, it has become indispensable for specialists dealing with SQL databases, such as developers, administrators, and analysts, as well as database DevOps engineers and security specialists. Despite competition from other professional tools, SQL Server Management Studio remains a leading tool on the market.

Download and Install SSMS

To install SSMS and work with it properly, you need a 1.8 GHz or faster x86 (Intel, AMD) processor, dual-core or better. The minimum RAM is 2 GB (2.5 GB if you run it on a virtual machine), but 4 GB is optimal. For hard disk space, you need at least 2 GB available (more is better: up to 10 GB).
The supported operating systems are the Windows Server line from 2008 R2 to 2019 (all 64-bit), Windows 8.1 (64-bit), and Windows 10 (64-bit, version 1607 or higher). SSMS is compatible with Windows only. If you need similar tooling on another OS, such as macOS or Linux, you will have to use another Microsoft product, Azure Data Studio. The latest SSMS version, together with release notes, is always available on the [official Microsoft download page](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15). Download the setup file and execute it to start the installation. Note that it will display the default installation location; you can either accept it or click Change and specify a new path. By clicking Install, you accept the License Agreement terms, and the installation process begins. When the installation has completed successfully, you will receive a notification.

There is one more way to install SSMS: via the command-line interface. The process runs in the background, and the user does not receive any GUI prompts. If this method suits you better, launch the command prompt with elevated permissions and type the following command:

start "" /w /Quiet SSMSInstallRoot=

If any failure or error occurs during the installation process, the details are stored in a log file in %TEMP%\SSMSSetup. After successful installation, launch SQL Server Management Studio from the Start menu. On first launch, the product displays a screen prompting you to connect to SQL Server. This is the first step in configuring SSMS:

Define the Server type
Set the Server name
Specify the Authentication type
Provide the Username and Password
Click Connect

You can save the username and password, so the next time you connect to the same SQL Server instance, you won’t need to enter them again.
Note : SQL Server Management Studio can connect to other SQL Server components, such as Analysis Services (SSAS), Integration Services (SSIS), and Reporting Services (SSRS). This functionality comes in handy as it allows users to apply SSMS in the following cases: Manage the Analysis Services objects Write and save Analysis Services Scripts in MDX, DMX, and XMLA Manage packages – import, export, monitor, upgrade, and organize them into folders Manage roles and jobs Administer server and databases, etc. SQL Server Management Studio features In our overview, we’ll stick to the most prominent SSMS tools and options for SQL Server management: Object Explorer Query Editor Template Explorer Query Execution Plan Table Designer Database Designer Query and View Designer Generate and Publish Scripts Wizard Backup and Restore SQL Server Security Management Activity Monitor XEvent Profiler Object Explorer Object Explorer is the most frequently used feature in SQL Server Management Studio. Object Explorer has a user interface that offers a convenient hierarchical view of server objects, similar to Windows Explorer with its system of folders, subfolders, files, etc. Among the many helpful options, it allows you to do the following: Search a specific object View and edit object properties Manage objects represented as nodes Run custom reports Query Editor The primary tasks of SQL Server Management Studio are writing, executing, and debugging T-SQL code. To implement these functions, SSMS provides an advanced SQL Query Editor with IntelliSense support. Equipped with this technology, the tool is able to auto-complete code. This means, as soon as you start writing code, the software suggests variants to finish the row. This helps accelerate the code writing process and make it more accurate. 
There are many more helpful options in the SSMS Query Editor:

Building scripts with T-SQL and XQuery statements
SQLCMD script editor
MDX, DMX, and XML/A editors
Code formatting, including syntax highlighting
Code debugging
Commenting and uncommenting selected lines
Line numbering
Drag-and-drop text editing
Executing selected code
Getting query results as text
Saving results to a file
Bookmarks
Integration with the Query Execution Plan

Template Explorer

Templates are files with ready-made SQL scripts that create standard objects (databases, tables, views, indexes, stored procedures, triggers, statistics, and functions). Besides, there are templates for SQL Server management tasks; you can use them to create extended properties, linked servers, logins, users, roles, and Analysis Services objects. A large collection of predefined templates is available in Template Explorer. Open any such file and use its code in the editor with many customization options. Additionally, you can make custom code templates. All such scripts are organized in folders, so you can either use the existing folders or create a new folder structure.

Query Execution Plan

A query execution plan is the sequence of operations that SQL Server performs to obtain a SQL query result. Before the server executes a SQL query, it must analyze the instructions and determine the most efficient way to execute them. For that, it uses the Query Optimizer component, whose output is the query execution plan itself. In SQL Server Management Studio, you can view a query execution plan and identify the most resource-intensive operations. Then, you can adjust the query to achieve optimal results.

Table Designer

This visual tool lets users design and visualize tables in a database. Table Designer makes it possible to create, edit, and delete both tables and their components (columns, indexes, keys, relationships, and constraints).
The main advantage is that you can perform all these tasks in a visual mode, without manual typing of SQL code. Database Designer Database Designer is another visual tool that assists in designing databases and their components. It visualizes each database with tables, columns, constraints, and dependencies. Similarly, you can create and edit databases using diagrams. With SQL Server Studio, it is possible to work on one or several diagrams – there can be as many of them as necessary. Also, each database can be a part of multiple diagrams. This way, you can focus on different database aspects. Query and View Designer Query and View Designer allows users to develop queries and views. A significant plus is that you can create database queries using the mouse only – there is no need to write code manually. The Designer also allows you to select a particular SQL query and create a similar one in the Editor. Generate and Publish Scripts Wizard This feature generates specific scripts that can run on other SQL Server instances. In essence, you can generate a database object creation script and then use the script on another instance to create the same object. The Wizard also handles the task of publishing database contents to the Web service. The scripts may be intended for the entire database or specific objects only. Thus, you can restore objects or share the script with your colleagues for object unification. Backup and Restore The graphical interface of SQL Server Management Studio simplifies the tasks of backing up and restoring databases. The necessary options are provided for each database in the context menu of Object Explorer. SQL Server Security Management Apart from T-SQL code development and execution, SSMS can manage SQL Server itself. In particular, it deals with SQL Server security. SSMS Security Manager can create usernames and database users and configure their access rights. 
The necessary options for both the entire SQL Server and separate databases are available in the Security section.

Activity Monitor

This tool displays current SQL Server processes and activity data. With Activity Monitor, users can track server activities such as the execution of SQL queries and statements, check the connected users, view data input and output, and examine the most recent and most resource-consuming queries.

XEvent Profiler

XEvent Profiler is an SSMS component that provides quick access to a live streaming view of diagnostic events on SQL Server. One of its advantages is that the session is minimally intrusive, so you can debug SQL Server events without performance degradation. It also lets you customize the event view to fit your needs.

Conclusion

SQL Server Management Studio is an undeniable leader that stands out from similar products. Its many features cover a wide range of database-related jobs and, what’s more, are available for free out of the box. They simplify tasks and help specialists save time and work more efficiently. However, that does not mean there is no room for improvement. There are lots of additional tools, add-ins, and extensions that enhance the functionality and fill the existing gaps. It’s worth mentioning the [SSMS tools and add-ins](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/) provided by Devart. The variety of means within this solution is excellent for resolving different tasks, from enhancing the IntelliSense options to schema and data comparison and index management.
Tags [Microsoft SQL Server Management Studio](https://blog.devart.com/tag/microsoft-sql-server-management-studio) [ssms](https://blog.devart.com/tag/ssms) [SSMS tools and add-in](https://blog.devart.com/tag/ssms-tools-and-add-in) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/generate-crud-procedures-with-gpt-chat.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How
To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How GPT Chat Can Help You Generate CRUD Procedures for SQL Server Tables By [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) May 25, 2023 In today’s technological landscape, artificial intelligence has carved out a significant niche, offering unparalleled assistance across numerous domains. One such domain is database development, where AI models such as ChatGPT have been gaining increasing recognition. This article delves into how ChatGPT can help database developers streamline their work by aiding in the creation of Create, Read, Update, and Delete (CRUD) procedures for SQL Server tables. Through intelligent suggestions and automated responses, ChatGPT not only simplifies this vital process but also enhances productivity and efficiency, thereby redefining how database developers approach their tasks. Additionally, we will validate the AI-generated code and compare it with the [CRUD stored procedures](https://blog.devart.com/how-to-generate-and-use-crud-stored-procedures.html) generated by SQL Complete, a robust add-on designed for SQL Server database development.
Contents
- Prerequisites
- Download and install the AdventureWorks database
- Create CRUD procedures with the help of ChatGPT
- CRUD stored procedures in SQL Server
- How ChatGPT can help generate CRUD stored procedures
- How to generate CRUD stored procedures using SQL Complete
- Conclusion

Prerequisites

To follow the scenario described in this article, you will need the following:
- A ChatGPT account
- The AdventureWorks database
- SQL Server Management Studio (SSMS) with [SQL Complete downloaded and installed](https://www.devart.com/dbforge/sql/sqlcomplete/download.html)

Download and install the AdventureWorks database

Download the [AdventureWorks2019.bak](https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure?view=sql-server-ver16&tabs=ssms) file from the official Microsoft website. Open SSMS, connect to your SQL Server instance, and restore AdventureWorks2019 from the downloaded .bak file. Verify the database installation by connecting to it, opening Database/Object Explorer, and navigating across its objects: tables, views, stored procedures, etc.

Create CRUD procedures with the help of ChatGPT

[CRUD operations in SQL Server](https://www.devart.com/dbforge/sql/sqlcomplete/crud-operations-in-sql.html) refer to the basic database operations: Create, Read, Update, and Delete. These operations allow for the creation of new data records, retrieval of existing data, modification of data, and deletion of unwanted data, respectively. Let us look at how you can leverage the power of AI to generate CRUD stored procedures in SQL Server.

CRUD stored procedures in SQL Server

To begin with, let’s find out what CRUD stored procedures are and how they can be used in SQL Server. A CRUD stored procedure is a database procedure or routine designed to perform the basic Create, Read, Update, and Delete operations on a database table. CRUD is an acronym that stands for:

Create: Inserts new data or records into the database table.
Read: Retrieves or reads existing data or records from the database table. Update: Modifies or updates existing data or records in the database table. Delete: Removes or deletes existing data or records from the database table. A CRUD stored procedure encapsulates the necessary SQL statements and logic to execute these operations in a structured and controlled manner. It provides a standardized approach for interacting with the database table, ensuring data integrity and consistent data manipulation. CRUD stored procedures are commonly used in database development to simplify and streamline data management tasks. They offer a reusable and centralized solution for managing database operations, making the application code easier to maintain and update, and they can also improve performance by reducing network traffic. How ChatGPT can help generate CRUD stored procedures Suppose we want to create a CRUD stored procedure for the HumanResources.Department table in the AdventureWorks2019 database. Let us ask the AI for assistance. Here is the code that ChatGPT generated for us. 
```sql
CREATE PROCEDURE dbo.Department_CRUD
    @DepartmentID INT = NULL,
    @Name NVARCHAR(50) = NULL,
    @GroupName NVARCHAR(50) = NULL,
    @ModifiedDate DATETIME = NULL,
    @Action NVARCHAR(10)
AS
BEGIN
    SET NOCOUNT ON;

    IF @Action = 'SELECT'
    BEGIN
        SELECT DepartmentID, Name, GroupName, ModifiedDate
        FROM HumanResources.Department
        WHERE (@DepartmentID IS NULL OR DepartmentID = @DepartmentID)
          AND (@Name IS NULL OR Name = @Name)
          AND (@GroupName IS NULL OR GroupName = @GroupName)
          AND (@ModifiedDate IS NULL OR ModifiedDate = @ModifiedDate)
    END
    ELSE IF @Action = 'INSERT'
    BEGIN
        INSERT INTO HumanResources.Department (Name, GroupName, ModifiedDate)
        VALUES (@Name, @GroupName, @ModifiedDate)
    END
    ELSE IF @Action = 'UPDATE'
    BEGIN
        UPDATE HumanResources.Department
        SET Name = @Name,
            GroupName = @GroupName,
            ModifiedDate = @ModifiedDate
        WHERE DepartmentID = @DepartmentID
    END
    ELSE IF @Action = 'DELETE'
    BEGIN
        DELETE FROM HumanResources.Department
        WHERE DepartmentID = @DepartmentID
    END
END
```

Let us now check whether it works properly. First, we run the code and see that it executes successfully, and a new procedure appears in the Object Explorer. Next, to check whether the AI-generated stored procedure is valid, we execute its SELECT part, passing the parameters. Quite impressive, isn’t it? Let’s delve deeper into ChatGPT’s SQL capabilities and ask it to generate four separate stored procedures, one for each CRUD operation, instead of a single stored procedure. Here is the result we obtained, and here is the code for the stored procedures for your convenience. 
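Once the procedure is created, its branches can be exercised with EXEC calls like these (the parameter values are purely illustrative):

```sql
-- Read: return all departments, then a single one by ID.
EXEC dbo.Department_CRUD @Action = 'SELECT';
EXEC dbo.Department_CRUD @Action = 'SELECT', @DepartmentID = 1;

-- Create: insert a new department.
EXEC dbo.Department_CRUD @Action = 'INSERT',
     @Name = N'Research', @GroupName = N'Research and Development',
     @ModifiedDate = '2023-05-25';
```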
```sql
-- Create Procedure
CREATE PROCEDURE dbo.sp_CreateDepartment
    @DepartmentID INT,
    @DepartmentName NVARCHAR(50),
    @GroupName NVARCHAR(50)
AS
BEGIN
    INSERT INTO HumanResources.Department (DepartmentID, DepartmentName, GroupName)
    VALUES (@DepartmentID, @DepartmentName, @GroupName)
END

-- Read Procedure
CREATE PROCEDURE dbo.sp_ReadDepartment
AS
BEGIN
    SELECT * FROM HumanResources.Department
END

-- Update Procedure
CREATE PROCEDURE dbo.sp_UpdateDepartment
    @DepartmentID INT,
    @DepartmentName NVARCHAR(50),
    @GroupName NVARCHAR(50)
AS
BEGIN
    UPDATE HumanResources.Department
    SET DepartmentName = @DepartmentName, GroupName = @GroupName
    WHERE DepartmentID = @DepartmentID
END

-- Delete Procedure
CREATE PROCEDURE dbo.sp_DeleteDepartment
    @DepartmentID INT
AS
BEGIN
    DELETE FROM HumanResources.Department
    WHERE DepartmentID = @DepartmentID
END
```

Shall we check it? This time, we get an error when trying to execute the code written by ChatGPT: it has invented column names. The HumanResources.Department table contains the columns DepartmentID, Name, GroupName, and ModifiedDate, while the generated code references a nonexistent DepartmentName column (and DepartmentID is an IDENTITY column, so it cannot be inserted explicitly without extra steps). How to generate CRUD stored procedures using SQL Complete [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is a powerful add-on tool for SQL Server database development. It is designed to enhance the productivity and efficiency of developers by providing intelligent code completion, code snippets, and SQL statement suggestions. SQL Complete offers features such as real-time syntax and error highlighting, code formatting, code navigation, and code analysis. It helps developers write SQL code faster, with fewer errors, while adhering to best practices. SQL Complete integrates seamlessly with popular database development tools like SQL Server Management Studio (SSMS) and provides a comprehensive set of tools to streamline the development process and improve productivity. In addition to its numerous features, SQL Complete also enables the creation of CRUD procedures with ease. 
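For reference, a corrected Create procedure sketched against the actual HumanResources.Department schema could look like this; the procedure name here is our own choice, not part of the original article:

```sql
CREATE PROCEDURE dbo.Department_Create
    @Name NVARCHAR(50),
    @GroupName NVARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    -- DepartmentID is an IDENTITY column and ModifiedDate has a
    -- GETDATE() default in AdventureWorks, so neither is supplied here.
    INSERT INTO HumanResources.Department (Name, GroupName)
    VALUES (@Name, @GroupName);
END
```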
It provides a convenient and efficient way to [generate the necessary SQL code for Create, Read, Update, and Delete operations on database tables](https://docs.devart.com/sqlcomplete/sql-refactoring/generate-crud-procedures-for-a-table.html). By utilizing SQL Complete, developers can accelerate the development process by automatically generating the code structure for CRUD procedures, saving time and reducing the chance of errors. Let us look at how you can use SQL Complete to create CRUD stored procedures. To create a CRUD procedure, in Object Explorer, right-click the database table and select SQL Complete > Script Table as CRUD. This generates the code in a new SQL document. Let us execute the code. As you can see, the code has been executed without any errors, and as a result, four stored procedures are now visible in the Object Explorer. Using four separate stored procedures instead of a single large one offers several advantages in terms of organizing, reusing, maintaining, and optimizing your code in the long run. Conclusion In this article, we explored the capabilities of ChatGPT in generating CRUD stored procedures for SQL Server and compared it to the functionality of SQL Complete. While ChatGPT showcased its potential in assisting with CRUD procedure creation, SQL Complete demonstrated better accuracy in handling the task. However, it is important to note that ChatGPT is a remarkable tool that is continually being developed, and its potential for assisting developers is promising. As an AI-powered solution, ChatGPT offers unique advantages and can be a valuable asset in simplifying the development process. Take your SQL Server database development to the next level by [downloading and trying out SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) today. Experience its powerful features and witness firsthand how it enhances your productivity, accuracy, and efficiency in writing SQL code. 
Download now and see the difference for yourself! Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Elena Zemliakova](https://blog.devart.com/author/helena-alexander) Elena is an experienced technical writer and translator with a Ph.D. in Linguistics. As the head of the Product Content Team, she oversees the creation of clear, user-focused documentation and engaging technical content for the company’s blog and website."} {"url": "https://blog.devart.com/generate-series-in-sql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Power of SQL GENERATE_SERIES Function By [Nataly Smith](https://blog.devart.com/author/nataly-smith) October 3, 2023 If your work somehow interweaves with database management, you know that precision and efficiency are paramount. The SQL GENERATE_SERIES function is a versatile tool that comes in handy when generating sequences of data within SQL queries. This function empowers database professionals to effortlessly create ordered lists of numbers, simplifying complex tasks and enhancing data manipulation capabilities. Embark on a journey to unlock the full potential of your SQL queries with the combined might of GENERATE_SERIES and [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/). This feature is available as an add-in for Visual Studio and [SSMS](https://www.devart.com/dbforge/sql/ssms-tools-and-addins/) or as a part of [dbForge Studio for SQL Server (IDE)](https://www.devart.com/dbforge/sql/studio/). In this article, we will talk about the intricacies of the function, unveiling its capabilities and demonstrating how it can change your data manipulation game. Contents Understanding GENERATE_SERIES Syntax of GENERATE_SERIES Using GENERATE_SERIES for Generating Number Series Detailed Examples of Using GENERATE_SERIES Handling Argument Types in GENERATE_SERIES Best Practices Conclusion Understanding GENERATE_SERIES To begin with, let us define what GENERATE_SERIES actually is and what it is used for. 
This function is a powerful tool in SQL used to create a sequential list of values within a specified range. It takes up to three parameters: a start value, an end value, and an optional step value. The function then generates a series of numbers starting from the start value and not exceeding the end value, with intervals determined by the step. Using the GENERATE_SERIES function provides a concise and efficient way to generate ordered lists of numbers or dates, eliminating the need for labor-intensive manual input. Additionally, it excels in scenarios where precision is essential, ensuring that data ranges are accurately defined. This function also enhances the readability of SQL queries, making code more intuitive and maintainable. Note: GENERATE_SERIES was introduced in SQL Server 2022 and requires the database compatibility level to be at least 160. When the compatibility level is lower, SQL Server is unable to find the GENERATE_SERIES function. Syntax of GENERATE_SERIES Having gained a basic understanding of the GENERATE_SERIES function, we will move on to the more technical side of the matter, namely, its syntax. In its simplest form, the function looks like this: GENERATE_SERIES(start_value, end_value, step_increment); start_value represents the starting value of the series. end_value signifies the end value of the series. step_increment is an optional parameter that specifies the increment between consecutive elements in the series. If omitted, the default increment is 1. For demonstration purposes, we are going to use dbForge Studio for SQL Server in this article. Using GENERATE_SERIES for Generating Number Series The next stop on our journey is dedicated to both basic and advanced GENERATE_SERIES usage. 
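You can verify and, if needed, raise the compatibility level before calling the function; the database name below is a placeholder:

```sql
-- Check the current compatibility level:
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'YourDatabase';

-- Raise it to 160 (SQL Server 2022) if it is lower:
ALTER DATABASE YourDatabase SET COMPATIBILITY_LEVEL = 160;
```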
The simplest applications of this function are generating numerical ranges, creating ordered lists, and populating tables with predefined data, enhancing the efficiency of data manipulation within SQL queries. More complex usage of GENERATE_SERIES involves leveraging its capabilities in conjunction with other SQL functions and operations to perform sophisticated data manipulation tasks. For example, combining it with arithmetic operations can produce sophisticated result sets. The function’s versatility extends to generating date ranges, facilitating tasks like date-based calculations, financial analyses, and reports over specific time intervals, showcasing its application beyond purely numerical sequences. Additionally, GENERATE_SERIES can be employed in scenarios that require simulating large datasets for testing purposes or creating ranges of custom elements. Its adaptability and flexibility make it a powerful tool for a wide range of database management tasks. Detailed Examples of Using GENERATE_SERIES When learning something new, it is always best to choose a combined approach: theory in synergy with practice and real-world examples. Therefore, let us now go through some of the most common examples of GENERATE_SERIES usage. dbForge Studio for SQL Server will be our test site, while its convenient built-in features will enhance our code writing speed and accuracy. Previously, only SQL Complete included [code completion](https://www.devart.com/dbforge/sql/studio/sql-coding-assistance.html#code-completion), SQL formatting, [snippets](https://www.devart.com/dbforge/sql/studio/sql-coding-assistance.html#snippets), and [code navigation](https://www.devart.com/dbforge/sql/studio/sql-coding-assistance.html#doc_outline) features; now they are also a part of dbForge Studio. 
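Before moving on to the numbered examples, here is a quick taste of the date-range use case mentioned above; the date range itself is purely illustrative:

```sql
-- Every day of January 2024, one row per date.
-- GENERATE_SERIES exposes its output in a column named "value".
SELECT DATEADD(DAY, value, '20240101') AS calendar_day
FROM GENERATE_SERIES(0, 30);
```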
Example 1: Generating a Simple Number Series

```sql
SELECT * FROM GENERATE_SERIES(1, 5);
```

This simple query generates a series of integers from 1 to 5. The output is a single column with five rows, each containing a consecutive integer.

Example 2: Generating a Number Series with a Specific Step

```sql
SELECT * FROM GENERATE_SERIES(1, 10, 2);
```

Here, the start value is 1, the end value is 10, and the step increment equals 2. This produces the series 1, 3, 5, 7, 9.

Example 3: Generating a Decimal Number Series

```sql
SELECT * FROM GENERATE_SERIES(1.5, 5.5, 1.5);
```

We decided to spice things up just a little bit by adding decimal numbers to the query. The function generates a series of decimal numbers starting from 1.5 with an increment of 1.5, not exceeding 5.5. The output is a single column with three rows: 1.5, 3.0, and 4.5.

Handling Argument Types in GENERATE_SERIES To gain an even better understanding of GENERATE_SERIES, we need to look deeper into the data types of the arguments passed to the function. Providing integer values as arguments generates a series of integers, while using floating-point numbers results in a series of decimal or floating-point values. Matching the argument types ensures that the function behaves as expected and produces the desired output.

| Data type | Description | Example |
| --- | --- | --- |
| Integer | When both the start and end values are integers, the function generates a series of integers. | `SELECT * FROM GENERATE_SERIES(1, 5);` |
| Floating-point | If either the start or end value (or both) is a floating-point number, the function generates a series of decimal or floating-point values. | `SELECT * FROM GENERATE_SERIES(1.5, 5.5, 1.5);` |
| Implicit type casting | When the arguments have different data types (e.g., one integer and one floating-point), the database system may attempt implicit type casting. | `SELECT * FROM GENERATE_SERIES(1, 5.5);` |
| Casting for precision | To ensure precise behavior, especially with mixed types, explicitly cast the arguments. This avoids potential issues or unexpected behavior. | `SELECT * FROM GENERATE_SERIES(CAST(1 AS NUMERIC), CAST(5 AS NUMERIC));` |

In cases where the provided arguments have different data types, some database systems attempt implicit type casting. SQL Server, however, does not handle mixed argument types well, so it is safest to cast all arguments to the same type explicitly.

Best Practices When using GENERATE_SERIES in SQL, it is important to follow best practices to ensure efficient and effective usage. By following these best practices, you can effectively use this function and related techniques to streamline your SQL queries while ensuring code quality, performance, and compatibility with your database system. Using Wrapper Functions To make your SQL code more readable and maintainable, consider creating wrapper functions that encapsulate GENERATE_SERIES for specific use cases. These wrapper functions can have meaningful names and default arguments, making it easier for other developers to understand their purpose. Dealing with Compatibility Level Restrictions Different database systems and versions may have varying support for the GENERATE_SERIES function. If you are working with a database that does not natively support it, you might need to use alternative methods, such as recursive Common Table Expressions (CTEs) or Numbers tables. Ensure that your approach aligns with the specific capabilities and limitations of your database system. Performance Optimization Be mindful of performance when working with large data sets. Consider indexing columns used in conjunction with generated series for faster query execution. Additionally, avoid generating unnecessarily large series that could consume excessive memory and processing resources. Documentation Document your usage of GENERATE_SERIES and any wrapper functions you create. 
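Here is a minimal sketch of such a wrapper, assuming SQL Server 2022+; the function name and column alias are our own:

```sql
-- Inline table-valued function wrapping GENERATE_SERIES
-- to produce an inclusive range of dates.
CREATE FUNCTION dbo.DateRange (@start DATE, @end DATE)
RETURNS TABLE
AS
RETURN
    SELECT DATEADD(DAY, value, @start) AS calendar_day
    FROM GENERATE_SERIES(0, DATEDIFF(DAY, @start, @end));
```

It can then be queried like any table, e.g. `SELECT * FROM dbo.DateRange('2024-01-01', '2024-01-07');`.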
Provide clear explanations of the purpose, arguments, and expected output. Good documentation ensures that others can understand and maintain your code. Testing Before using GENERATE_SERIES extensively in your applications or queries, conduct thorough testing with sample data to ensure it behaves as expected. Test edge cases and scenarios with various argument types to validate its behavior. Adherence to SQL Standards While GENERATE_SERIES is a powerful tool, it is not part of the SQL standard and may not be supported in all database systems, so plan for portability if your code may need to run elsewhere. Security Considerations When using generated series for dynamic queries or data generation, be cautious of SQL injection vulnerabilities. Always validate user inputs to prevent malicious SQL injection attacks. Conclusion The SQL GENERATE_SERIES function stands as a testament to the power and versatility it brings to database management. By effortlessly creating ordered lists of numbers, it simplifies complex tasks and greatly enhances data manipulation capabilities. When combined with dbForge SQL Complete, this function becomes an even more formidable tool, unlocking the full potential of SQL queries. We invite you to experience the full capabilities of our software by downloading free, fully functional trials of [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html). If you are already using our products, make sure to update your Studio and SQL Complete to the latest versions from their menus by clicking Help > Check for Updates. 
Tags [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [SQL GENERATE_SERIES Function](https://blog.devart.com/tag/sql-generate_series-function) [SQL Server](https://blog.devart.com/tag/sql-server) [Nataly Smith](https://blog.devart.com/author/nataly-smith) dbForge Team"} {"url": "https://blog.devart.com/generate-test-data-with-sql-data-generator.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Generate Test Data with the Help of SQL Data Generator By [dbForge Team](https://blog.devart.com/author/dbforge) January 22, 2024 
In this article, we will examine the process of populating the employee database with dummy data, whose schema we designed [in the previous part](https://blog.devart.com/sql-database-design-basics-with-example.html#the_database_schema_for_a_recruitment_service). Filling a SQL database with dummy data can be very useful when we want to run some tests. The most convenient way is to populate SQL tables with random data with the help of visual data generation tools. Generating data with the help of Data Generator for SQL Server The [Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/) tool is integrated into SSMS and is also included in [dbForge Studio](https://www.devart.com/dbforge/sql/studio/). It should be noted that realistic test data is generated based on column names, dimensions, and data types. Apart from this, the relationships between tables are also taken into account, as the process of data generation depends on them. To open this component, in [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-2017), right-click the necessary database and select Data Generation > New Data Generation: Img. 1. Running the [Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/) tool in SSMS If you are using dbForge Studio, on the main menu, choose Tools > New Data Generation: Img. 2. Running the Data Generator for SQL Server tool in dbForge Studio In the Data Generator Project Properties window that opens, on the Connection tab, you can see the current MS SQL Server instance and the database selected for data generation, which can be changed if necessary. Then, click Next: Img. 3. Setting the Connection tab Next, on the Options tab, set the data generation options for the database: Img. 4. 
Setting data generation options Note that you can generate SQL test data in different modes: By a specified number of rows (1000 rows by default) As a proportion of existing data, in percent (10% by default) By generation time (10 seconds by default) You can also clear data before generation by setting the Truncate data from table before generation parameter. You can set the value distribution mode in one of the following ways: Random by timestamp Random by seed (1 by default) Sequential Also, you can set column properties: Set values to be unique Include NULL values (10% of rows by default) Include empty values (10% of rows by default) You can save the settings to a .bat file by clicking Save Command Line located at the lower left of the data generation settings window. Watch the Data Generator for SQL Server tool in action & learn how to fill a database with test data in minutes! After you are finished with the settings, at the lower right of the data generation settings window, click Open. You will then see a progress bar showing the table metadata loading. After that, the window with detailed data generation settings for each selected table appears: Img. 5. Detailed data generation settings for each selected table On the left, select the tables and columns you want to populate; on the right, set the generation mode for the selected table. Below, you can preview samples of the generated data (note that they look like real names). In the top right corner, there is a button with the data generation settings described above. To start the data generation process, click the green arrow at the top center of the window. Then, you will see the window for selecting additional settings. Here, on the Output tab, you need to select exactly where to generate the data: as a script, to a file, or directly to a database. Let us select the last option and click Next: Img. 6. 
Setting the Output tab Then, you can set additional parameters on the Options tab. In this case, you need to clear the database backup options and click Next: Img. 7. Setting the Options tab On the Additional Scripts tab, you can set additional scripts. In our case, we just click Next: Img. 8. Setting the Additional Scripts tab On the Summary tab, we can see the information about the settings, as well as warnings. Here, you can also save all settings as a .bat file by clicking Save Command Line. To run the data generation process, click Generate: Img. 9. The general information and warnings on the Summary tab The window of the data generation process appears: Img. 10. Data generation process Then, the tables will be populated with data. For instance, the Employee table has the following generated data: Img. 11. Examples of generated data in the Employee table Ready to test the Data Generator for SQL Server? Enjoy [a free 30-day trial](https://www.devart.com/dbforge/sql/data-generator/download.html)! Conclusion To sum up, we populated the database with realistic data for testing both functionality and load. It is possible to [generate much more random data](https://www.devart.com/dbforge/sql/data-generator/how-to-generate-random-numbers.html) for load tests. In addition, the process of testing itself can be accelerated by means of the [dbForge Unit Test](https://www.devart.com/dbforge/sql/unit-test/) tool, which lets you write SQL unit tests directly in SQL Server Management Studio and run multiple unit tests in a few clicks. What is more, through the use of SQL data generation, you can estimate not only a database growth rate but also the query performance difference that results from the data volume increase. 
Next time, we are going to talk about the ways to transfer data from one SQL Server database to another one through [export and import](https://blog.devart.com/export-and-import-json-data-via-dbforge-data-pump-for-sql-server.html). Tags [data generation](https://blog.devart.com/tag/data-generation) [sql data generator](https://blog.devart.com/tag/sql-data-generator) [SQL Server](https://blog.devart.com/tag/sql-server) [studio for sql server](https://blog.devart.com/tag/studio-for-sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": 
"https://blog.devart.com/generating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Generating Related Data Elements with SQL Data Generator By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) April 1, 2016 [0](https://blog.devart.com/generating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html#respond) 3868 Recently we got an interesting question from our user: The docs don’t address directly, but how can we generate three related data elements, like dateof birth, age, and date of death for instance, in one script efficiently as it generates test rows/records and then populate the fields in a test database table? If docs do address this, please point me to where, if you would… Thanks! Another user has posted a similar question on the Devart [support forum](https://forums.devart.com/viewtopic.php?f=23&t=33396) . We added the new “ Death Date ” generator to help resolve the problem. dbForge Data Generator for SQL Server is supplied with a wide collection of generator templates that can be easily used for [the creation of your own data generators](https://docs.devart.com/data-generator-for-sql-server/using-generators/creating-new-generators.html) . Actually, it is the most “sad” generator on our list. Let’s have a look at how to use it. Here is a demo table that contains the PersonAge , DateofBirth , and DateofDeath fields. Let’s start [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/) and populate the table with test data. We see a warning that states that the global name “DOB” is not defined . DOB is the column that contains [dates of birth](https://docs.devart.com/data-generator-for-sql-server/using-generators/birth-date.html) . It must be located in the same table. 
Select the DateofDeath column to see the details. As you can see, the Death Date generator is automatically selected and mapped to the column. The next step is to modify the Python script. Find the bd = DOB string. You need to replace the DOB placeholder with the name of the actual column that contains dates of birth (DateofBirth in our case). Note that it is case-sensitive. Thus, in this particular example, we need to write bd = DateofBirth. Once you have modified the Python script with the correct column name, another warning might appear in the DateofDeath column (String was not recognized as a valid DateTime). The reason is that the column uses a different date format by default. In that case, the last step is to replace “/” with “.” in the date format. To generate a correct PersonAge value that corresponds to DateofBirth and DateofDeath, select the PersonAge column and specify the appropriate columns instead of the placeholders. Also, keep in mind that the date format may require some modifications. Here is the result: This way, you can generate related data elements. The [Python script](https://docs.devart.com/data-generator-for-sql-server/using-generators/python-script-examples.html) is well commented, and you can modify it for your own purposes. 
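The idea behind the Death Date generator can be illustrated with a self-contained sketch. This is not the script shipped with the tool (which runs inside the generator and references table columns such as DateofBirth directly); it only shows the underlying logic of deriving a death date and an age from a birth date:

```python
import random
from datetime import date, timedelta

def death_date(birth: date, max_age_years: int = 90) -> date:
    # Pick a random death date between the birth date
    # and roughly birth + max_age_years (counted in days).
    days = random.randint(0, max_age_years * 365)
    return birth + timedelta(days=days)

def age_at_death(birth: date, death: date) -> int:
    # Whole years lived, accounting for whether the final
    # birthday had been reached before death.
    return (death.year - birth.year) - (
        (death.month, death.day) < (birth.month, birth.day)
    )

birth = date(1950, 6, 15)
death = death_date(birth)
print(birth, death, age_at_death(birth, death))
```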
Tags [data generator](https://blog.devart.com/tag/data-generator) [SQL Server](https://blog.devart.com/tag/sql-server) [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart Share [Facebook](https://www.facebook.com/sharer.php?u=https%3A%2F%2Fblog.devart.com%2Fgenerating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html) [Twitter](https://twitter.com/intent/tweet?text=Generating+Related+Data+Elements+with+SQL+Data+Generator&url=https%3A%2F%2Fblog.devart.com%2Fgenerating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html&via=Devart+Blog) [Linkedin](https://www.linkedin.com/shareArticle?mini=true&url=https://blog.devart.com/generating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html&title=Generating+Related+Data+Elements+with+SQL+Data+Generator) [ReddIt](https://reddit.com/submit?url=https://blog.devart.com/generating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html&title=Generating+Related+Data+Elements+with+SQL+Data+Generator) [Copy URL](https://blog.devart.com/generating-rrelated-data-elements-with-dbforge-data-generator-for-sql-server.html) RELATED ARTICLES [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [How to Use SQL Server Filtered Indexes for Better Queries](https://blog.devart.com/sql-server-filtered-indexes.html) May 9, 2025 [Products](https://blog.devart.com/category/products) [Understanding System Tables in SQL Server](https://blog.devart.com/system-tables-in-sql-server.html) May 5, 2025 [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [SQL ALTER COLUMN Command: Quickly Change Data Type and Size](https://blog.devart.com/dbforge-sql-studio-sql-alter-column.html) May 6, 2025"} {"url": "https://blog.devart.com/generating-subscription-statistics-data-in-oracle-data-generator.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How 
To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) Generating subscription statistics data in Oracle Data Generator By [dbForge Team](https://blog.devart.com/author/dbforge) August 2, 2019 [0](https://blog.devart.com/generating-subscription-statistics-data-in-oracle-data-generator.html#respond) 4186 When you need to fill your databases with data for testing purposes, it’s usually quite handy to have this data follow dynamic patterns, as in real life. In such situations, records should both look realistic on their own and form a lifelike overall picture. For example, if you want to fill your table with test data for an online service with a subscription model, the records would have to change from day to day. In a real environment, the number of subscriptions will always change – some people will leave the service while new subscribers come in. Of course, we would like the latter number to outweigh the former. With the help of Data Generator for Oracle and some basic Python scripting, we can generate test data that shows a dynamic growth pattern. How to generate realistic test data with a dynamic growth pattern Creating the necessary tables First of all, we need a database that can properly describe the information we want to store. In this example, we’ll use three tables to hold our data – Product , Subscriber , and Subscriptions .
The following SQL query will create these tables:

```sql
CREATE TABLE "Product" (
  "id" NUMBER(10, 0),
  "name" VARCHAR2(50 BYTE),
  CONSTRAINT PK_PRODUCT_ID PRIMARY KEY ("id") USING INDEX TABLESPACE USERS
)
TABLESPACE USERS
LOGGING;

CREATE TABLE "Subscriber" (
  "id" NUMBER(10, 0),
  "name" VARCHAR2(50 BYTE),
  CONSTRAINT PK_SUBSCRIBER_ID PRIMARY KEY ("id") USING INDEX TABLESPACE USERS
)
TABLESPACE USERS
LOGGING;

CREATE TABLE "Subscriptions" (
  "day" NUMBER,
  "product_id" NUMBER,
  "subscriber_id" NUMBER,
  CONSTRAINT FK_SUBSCRIPTIONS_PRODUCT_ID FOREIGN KEY ("product_id")
    REFERENCES "Product" ("id"),
  CONSTRAINT FK_SUBSCRIPTIONS_SUBSCRIBER_ID FOREIGN KEY ("subscriber_id")
    REFERENCES "Subscriber" ("id")
)
TABLESPACE USERS
LOGGING;
```

You can create the corresponding SQL file in [Data Generator for Oracle](https://www.devart.com/dbforge/oracle/data-generator/) . To do it, click the New SQL button located on the toolbar in the top left part of the screen. If you haven’t already connected to a server, the Connect to Server window will appear – here, select the desired connection. When this is done, click Connect . In the tab that opens, enter the query provided above. You can then execute this query to create the tables we need for the next steps. With these three tables set up, we can proceed further. Creating a new Data Generator document Let’s create a new Data Generator document. In dbForge Data Generator for Oracle, click New Data Generation in the top left corner of the screen. The Data Generator Project Properties window will open. In the Connection tab, choose the server connection and the schema containing the tables we created in the previous step. Then, press Next to continue. In the Options tab, you can set various data generation options if needed, or move on with the defaults. When everything is set up, press Open . The main Data Generator window will open.
Applying a custom data generation script Now, select the table we just created by enabling the corresponding checkbox. Then, go to the table’s DAY field. In the Column generation settings window on the right side, set the Generator value to Python. In the Python script section, replace the default script with the following:

```python
def main(config):
    v_day = 1
    v_count = 4
    while True:
        for x in range(v_count):
            yield v_day
        v_day = v_day + 1; v_count = v_count + 4
```

When the script is entered, you can see what kind of data it generates in the Preview of data to be generated section in the lower part of the screen. What the initial values of the script mean There are three things that are important to us in this script: the v_day and v_count variables and the daily growth of subscriptions. v_day specifies the numerical value of the day of our service’s operation. So, the day we launched the service will be marked as day 1, and the day two weeks later as day 15. v_count specifies how many subscribers we gain on the day from which we start generating the data. In the script’s initial version, we start with v_count = 4 , which means we will have 4 new subscriptions on the first day. At the very end of the script, you will find the following: v_count = v_count + 4 . This specifies the daily growth of subscriptions. So, by default, the daily subscription growth is equal to 4. With all these initial values, we will get 4 new subscribers on the first day, 8 new subscribers on the second day, 12 on the third day, etc. Changing the script to better suit your needs To fine-tune the data generation process, you can change the corresponding values in the script. By changing the initial value of v_day , you can generate data starting from a specific day in our service’s lifespan.
For example, you can start the data generation process from the point in time when our service is 1 month old by changing the first line of the script to: v_day = 31 . By changing the value of v_count , you change how many new subscribers you get on the day from which you start generating the data. So, let’s assume that by day 31 we have 2000 subscribers. We can specify that by changing the second line of the script like this: v_count = 2000 . Finally, let’s change our daily subscription growth. If we get 12 new subscribers each day, we will need to change the last line of the script to: v_day = v_day + 1; v_count = v_count + 12 . So, with these modified parameters, we’ll start generating data from day 31 of our service’s lifetime, with 2000 initial subscribers and a daily subscription growth of 12. As you can see, generating a dynamic pattern of data for your table is not that difficult with dbForge Data Generator for Oracle – you can [download](https://www.devart.com/dbforge/oracle/data-generator/download.html) it and try it out for yourself.
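As a sanity check, the arithmetic-growth pattern described above can be simulated outside the tool. The standalone sketch below (our own re-implementation for illustration, parameterized so the modified values from the example can be plugged in) counts how many rows the generator would emit per day:

```python
from itertools import islice
from collections import Counter

def subscriptions(start_day=1, start_count=4, daily_growth=4):
    """Standalone re-implementation of the generator script's growth pattern."""
    v_day, v_count = start_day, start_count
    while True:
        for _ in range(v_count):
            yield v_day  # one row per new subscription on that day
        v_day += 1
        v_count += daily_growth

# Modified parameters from the example: start on day 31 with 2000 new
# subscriptions and a daily growth of 12.
rows_per_day = Counter(islice(subscriptions(31, 2000, 12), 2000 + 2012))
```

With these parameters, day 31 contributes 2000 rows and day 32 contributes 2012, matching the arithmetic described in the post.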
Also, you can watch this video tutorial: Tags [data generator](https://blog.devart.com/tag/data-generator) [Oracle](https://blog.devart.com/tag/oracle) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-20-off-on-any-devart-product-expires-january-10th.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) Get
20% off on any Devart product (expires January 10th)! By [dbForge Team](https://blog.devart.com/author/dbforge) December 14, 2011 [0](https://blog.devart.com/get-20-off-on-any-devart-product-expires-january-10th.html#respond) 2380 Merry Christmas and Happy New Year! As the Christmas holidays draw closer, we are offering 20% off any product license order. Don’t think twice – just save 20%. What’s more, you get free access to all future releases of the respective products for one year. Hurry up: this offer is available only from December 01, 2011 till January 10, 2012 . Decide and order right now. We wish you all the best in the coming year; let joy and happiness be your constant partners and friends! Best Christmas Wishes, Devart Team Tags [MySQL](https://blog.devart.com/tag/mysql) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-the-latest-maintenance-update-of-dbforge-edge.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Get the Latest Maintenance Update of dbForge Edge! By [dbForge Team](https://blog.devart.com/author/dbforge) September 6, 2023 [0](https://blog.devart.com/get-the-latest-maintenance-update-of-dbforge-edge.html#respond) 1944 We have just rolled out a new maintenance update of [dbForge Edge](https://www.devart.com/dbforge/edge/) , our multidatabase suite comprising four well-rounded IDEs called Studios that cover a vast diversity of database development, management, and administration tasks across SQL Server, MySQL/MariaDB, Oracle Database, and PostgreSQL. The purpose of such updates is to keep your experience with each Studio as smooth and trouble-free as possible. Now let’s take a look at the update in detail; mostly, it is all about newly added support for the latest and upcoming versions of key database systems. [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) has received support for the recently released MySQL 8.1 .
[dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) has received support for Oracle Database 23c , rolled out as a Free Developer Release. Finally, [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/) now supports PostgreSQL 16 , currently available as a beta. The updates are available and can be installed for each individual Studio from the Help menu > Check for Updates . And if you would like to learn more about the introduced fixes, feel free to proceed to the revision history of the corresponding Studio: [MySQL](https://www.devart.com/dbforge/mysql/studio/revision_history.html) , [Oracle](https://www.devart.com/dbforge/oracle/studio/revision_history.html) , and [PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/revision_history.html) . Not using dbForge Edge yet? Then you are welcome to see all of its countless capabilities in action during a free 30-day trial . All you have to do is simply [download dbForge Edge](https://www.devart.com/dbforge/edge/download.html) from our official website and give it a go. 
Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [MySQL](https://blog.devart.com/tag/mysql) [Oracle](https://blog.devart.com/tag/oracle) [PostgreSQL](https://blog.devart.com/tag/postgresql) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-the-latest-maintenance-update-of-sql-complete-6-9.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Get the Latest Maintenance Update of SQL Complete 6.9 By [dbForge Team](https://blog.devart.com/author/dbforge) January 31, 2022 [0](https://blog.devart.com/get-the-latest-maintenance-update-of-sql-complete-6-9.html#respond) 2663 We have just released a maintenance update of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , an add-in for SSMS and Visual Studio that can accelerate your daily SQL coding by up to 4 times with a handful of code completion, formatting, and refactoring tools. The purpose of such updates is to keep your experience with SQL Complete as smooth and trouble-free as possible.
This is the list of issues that have been successfully eliminated:

| Issue description | Ticket # |
| --- | --- |
| Issue that occurred when navigating to the document from the execution notification window | D70704 |
| Issue that occurred when closing documents | D70305, D74983, D74982, D74981, D75367, D75789, D75791, D75790, D75788, D75885, D76368, D76367, D76362, D76380, D76366, D76365, D76364, D77657, D77745, D77796 |
| Optimized retrieval of object information from the server and removal of unnecessary sorting | D71171, D75868 |
| Issue that occurred when retrieving machine configuration details for an error report | D64891 |
| Issue that occurred when formatting single-line queries | D73763, D76942 |
| Issue that occurred when loading an assembly | D75943 |
| Unexpected exception that occurred when working with the Document History | – |
| Issue that occurred when closing the application | D74237, D75024, D75164, D75137, D75426 |

The update is already available and can be installed from the SQL Complete menu in SSMS > Help > Check for Updates . Not using SQL Complete yet? Then you are welcome to see all of its vast capabilities in action during a FREE 14-day trial . All you have to do is simply [download SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) from our official website and see how fast and easy your SQL coding can be.
Tags [dbforge](https://blog.devart.com/tag/dbforge) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [ssms](https://blog.devart.com/tag/ssms) [what's new sql complete](https://blog.devart.com/tag/whats-new-sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-the-latest-update-of-dbforge-studio-and-sql-tools-with-support-for-sql-server-2022.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Get the Latest Update of dbForge Studio and SQL Tools With Support for SQL Server 2022 By [dbForge Team](https://blog.devart.com/author/dbforge) September 12, 2022 [0](https://blog.devart.com/get-the-latest-update-of-dbforge-studio-and-sql-tools-with-support-for-sql-server-2022.html#respond) 4385 It’s time to unveil a big update of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , and [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) , your go-to solutions for fast and effective tackling of your routine database-related tasks. This update delivers support for SQL Server 2022 Community Technology Preview 2.1 (CTP 2.1) alongside extended keyword suggestions and newly supported syntax constructs. Note that the following update is valid for the entire [dbForge product line for SQL Server](https://www.devart.com/dbforge/sql/) . Support for SQL Server 2022 CTP 2.1 First and foremost, this update delivers full compatibility with Microsoft SQL Server 2022 Community Technology Preview 2.1. Expanded keyword suggestions Next, we expanded keyword suggestions for a few statements, namely, CREATE INDEX, ALTER INDEX, CREATE CLUSTERED INDEX, CREATE TABLE, ALTER TABLE, and ALTER DATABASE SCOPED CONFIGURATION. 
The update extends the suggestions for each of these statements: CREATE INDEX, ALTER INDEX, CREATE CLUSTERED INDEX, CREATE TABLE, ALTER TABLE, and ALTER DATABASE SCOPED CONFIGURATION. New query hints and syntax constructs In this release, you get two new query hints for your DELETE, INSERT, SELECT, UPDATE, and MERGE statements: DISABLE_OPTIMIZED_PLAN_FORCING and { FORCE | DISABLE } SCALEOUTEXECUTION . Additionally, you can use these newly added syntax constructs: BACKUP SYMMETRIC KEY and RESTORE SYMMETRIC KEY . New functions You also get a few new functions available in the Completion List, Quick Info, and Parameter Information:
1. The first one is DATE_BUCKET .
2. Next, the IGNORE NULLS and RESPECT NULLS constructs, introduced in SQL Server 2022, are now supported for the FIRST_VALUE and LAST_VALUE functions.
3. You can use the ISJSON function with support for the json_type_constraint parameter, also introduced in SQL Server 2022.
4. The next supported function is JSON_ARRAY .
5. Another function you get is JSON_OBJECT .
6. Yet another new function is JSON_PATH_EXISTS .
7. Finally, you get the STRING_SPLIT function with support for the enable_ordinal parameter, introduced in SQL Server 2022.
That’s it! Individual updates for each add-in will be suggested upon opening SSMS. Otherwise, you can install them for each add-in from the corresponding menu in SSMS > Help > Check for Updates . As for the standalone apps—such as dbForge Studio, Data and Schema Compare, or Query Builder—you can similarly update them from their menus by navigating to Help > Check for Updates .
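For readers unfamiliar with DATE_BUCKET: it truncates a date down to the start of a fixed-width bucket measured from an origin date. The following Python sketch is our own approximation of the DAY-datepart case only (it assumes the default origin of 1900-01-01, as documented for the T-SQL function; it is not Devart's or Microsoft's code):

```python
from datetime import date, timedelta

ORIGIN = date(1900, 1, 1)  # assumed default origin of T-SQL DATE_BUCKET

def date_bucket_days(bucket_width: int, value: date, origin: date = ORIGIN) -> date:
    """Approximate DATE_BUCKET(DAY, bucket_width, value) from SQL Server 2022."""
    elapsed = (value - origin).days
    # Snap down to the start of the bucket that contains `value`
    return origin + timedelta(days=(elapsed // bucket_width) * bucket_width)

# Every date inside the same 7-day bucket maps to the same bucket start
bucket = date_bucket_days(7, date(2022, 9, 12))
```

This is handy for grouping rows into fixed-size time windows, which is exactly what the SQL function is typically used for in aggregation queries.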
dbForge Studio for SQL Server: your single IDE for all tasks If you still hesitate to give dbForge Studio a go, we gladly invite you to [download it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) . You will see why so many people just can’t imagine their work without the Studio. We would be happy if you found it valuable as well. Not using SQL Tools yet? Get your free trial! And if you have yet to discover the extensive capabilities and performance of dbForge SQL Tools, we suggest you [download them for a free 30-day trial](https://devart.com/dbforge/sql/sql-tools/download.html) and see how much more productive your daily work is going to be. Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tools](https://blog.devart.com/tag/sql-server-tools) [sql tools](https://blog.devart.com/tag/sql-tools) [SQL Tools 6.1](https://blog.devart.com/tag/sql-tools-6-1) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-the-latest-update-of-dbforge-studio-for-sql-server-with-a-few-useful-additions.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Get the Latest Update of dbForge Studio for SQL Server With a Few Useful Additions By [dbForge Team](https://blog.devart.com/author/dbforge) June 20, 2022 [0](https://blog.devart.com/get-the-latest-update-of-dbforge-studio-for-sql-server-with-a-few-useful-additions.html#respond) 3094 Here comes a minor yet rather helpful update of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , your favorite IDE that helps you easily handle nearly any task related to database development and administration, and makes your daily work with SQL Server databases far easier.
Similarly to [the recent release of SQL Complete](https://blog.devart.com/here-comes-the-latest-update-of-sql-complete.html) , we have enhanced the Studio with several new functions, which are hardly game-changing yet are certainly worth having. In this release, you get a few new functions in the Completion List, Quick Info, and Parameter Information: GREATEST , LEAST , CURRENT_TIMEZONE , and CURRENT_TIMEZONE_ID . Finally, we have added support for the FORMATFILE_DATA_SOURCE parameter for bulk_options in the OPENROWSET function. That’s it! You are free to update your dbForge Studio at any given moment to have all of these enhancements firmly in place. Simply go to the Help menu > Check for Updates and launch your update. And if you are not acquainted with [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) yet, we suggest you [download it for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) , which will help you evaluate its capabilities and see how much more productive your daily work with SQL databases is going to be.
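To clarify what GREATEST and LEAST return, here is a minimal Python sketch of their semantics as we understand them from the T-SQL documentation (our own illustration, not Devart's or Microsoft's code): NULL inputs (None here) are ignored, and the result is NULL only when every input is NULL.

```python
def greatest(*args):
    """Sketch of T-SQL GREATEST: ignore NULL (None) inputs; return the
    largest remaining value, or None if every input was None."""
    non_null = [a for a in args if a is not None]
    return max(non_null) if non_null else None

def least(*args):
    """Sketch of T-SQL LEAST, mirroring greatest()."""
    non_null = [a for a in args if a is not None]
    return min(non_null) if non_null else None

greatest(6.1, None, 7.2)  # the None input does not poison the result
```

This per-row behavior is what distinguishes the pair from aggregate MAX/MIN, which operate across rows rather than across a list of expressions.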
Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [what's new sql server studio](https://blog.devart.com/tag/whats-new-sql-server-studio) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/get-the-most-from-your-data-with-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Get the most from your data with SQL Complete By [dbForge Team](https://blog.devart.com/author/dbforge) January 22, 2020 [0](https://blog.devart.com/get-the-most-from-your-data-with-sql-complete.html#respond) 3020 Visualize your SQL data to facilitate understanding and decision-making with the Data Visualizers feature of SQL Complete. On many occasions, developers and DBAs need to quickly check table contents, so getting data in a readable format without much effort becomes important. [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) users can view cell contents in the required format right in the Results grid whenever necessary. The tool takes the pain out of checking and analyzing tables with multi-format data. The Data Visualizers feature, tailored to suit the needs of the most demanding SQL developers, allows viewing data in 9 common formats: Hexadecimal, Text, XML, HTML, Rich Text, PDF, JSON, Image, and Spatial. To start working with Data Visualizers, simply output the required data to the Results grid and then switch to the Data Viewer. Image view The Image View option allows viewing database image contents right in the Results grid, which significantly saves your time and effort. To enable the Image View, click the corresponding icon on the toolbar. Viewing data in the hexadecimal format In case you need to view cell values in the hexadecimal data format, click the Hexadecimal View button on the Data Viewer and Editor toolbar and enjoy the quick result.
Viewing data in the Text format To display table data in the Text format, use the Text View option. It is especially helpful when working with tables that contain long descriptions and text data. Viewing data in the XML format XML View shows the cell contents in the popular XML format. Just click the button and see the result. Viewing data in the HTML format Switching to the HTML View displays the cell contents in the HTML format in a flash. Viewing data in the Rich Text format The Rich Text View lets you view content in the Rich Text format with one click. Viewing data in the PDF format With the help of the Data Viewer, you can quickly view a PDF document by selecting the PDF View option and then positioning the mouse pointer over the value in the grid. Viewing data in the JSON format To view data in the JSON format, click the JSON View button on the Data Viewer and Editor toolbar. Spatial view You can now view data representing spatial values more easily than ever before. Saving cell contents to a file Having viewed the cell contents in the chosen format, you can save them to a file, and SQL Complete will automatically suggest saving in the format you have worked with. Tell Us What You Think The Devart team strives for excellence; therefore, our products receive improvements on an ongoing basis. [Download](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) SQL Complete from our website and let us know your opinion of it. Your feedback is highly appreciated and will help us strengthen our product portfolio further.
Tags [Data Viewer](https://blog.devart.com/tag/data-viewer) [Data Visualizers](https://blog.devart.com/tag/data-visualizers) [sql complete](https://blog.devart.com/tag/sql-complete) [visualize data](https://blog.devart.com/tag/visualize-data) [dbForge Team](https://blog.devart.com/author/dbforge) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Get the
New Update of SQL Tools With Support for SSMS 19 By [dbForge Team](https://blog.devart.com/author/dbforge) July 20, 2022 Following the [recent release of dbForge SQL Complete](https://blog.devart.com/here-comes-a-new-update-of-sql-complete-6-11-with-support-for-ssms-19.html), we have rolled out a similar update for the entire [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle. Now every add-in and standalone tool it contains is compatible with SSMS 19 Preview 2, the latest preview release of SSMS. The updates will either be suggested upon opening SSMS, or you can install them for each add-in from the corresponding SSMS menu > Help > Check for Updates. If you are using a standalone tool, for instance, dbForge Schema Compare, you can similarly update it from its menu > Help > Check for Updates. Not using SQL Tools yet? Then we gladly invite you to [download them for a free trial](https://devart.com/dbforge/sql/sql-tools/download.html) and see their expansive capabilities in action.
Tags [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tools](https://blog.devart.com/tag/sql-server-tools) [sql tools](https://blog.devart.com/tag/sql-tools) [SQL Tools 6.1](https://blog.devart.com/tag/sql-tools-6-1) [what's new sql server tools](https://blog.devart.com/tag/whats-new-sql-server-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Getting Real Currency Exchange Rates with SQL Server Data Generator By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) July 1, 2016 This article demonstrates how to get live currency exchange rates with the help of Python and [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/). A currency exchange rate is reference information used when translating monetary values from one currency to another; it expresses the value of one currency in terms of another. The AdventureWorks demo database contains the Sales.CurrencyRate table. The table stores currency exchange rates and consists of the following columns:

- CurrencyRateID — primary key for CurrencyRate records.
- CurrencyRateDate — date and time the exchange rate was obtained.
- FromCurrencyCode — the currency code the exchange rate was converted from.
- ToCurrencyCode — the currency code the exchange rate was converted to.
- AverageRate — average exchange rate for the day.
- EndOfDayRate — final exchange rate for the day.
- ModifiedDate — date and time the record was last updated.

With the help of dbForge Data Generator for SQL Server, we can easily generate test values for this table. Simply select the Sales.CurrencyRate table and you will see the data generated by default. You can, of course, adjust the generation settings, but the result will not look as precise as you may need. Note that the generated values do not look like real currency rates and most probably will not fit your business logic.
Moreover, if [SQL unit testing](https://www.devart.com/dbforge/sql/unit-test/) is a part of your development process, tests can fail due to incorrect values. Now let’s see how we can easily resolve this problem with the help of the Python generator provided by dbForge Data Generator for SQL Server. Across the internet, there are a number of resources that provide a JSON API for foreign exchange rates and currency conversion. We will use [fixer.io](https://fixer.io/) in this demo. Select the Python generator for the AverageRate column and use the following script to get the data:

```python
# Generates currency rates based on the exchange rate date and currency codes.
# The script runs inside the IronPython environment of dbForge Data Generator,
# which injects the current row's column values (CurrencyRateDate,
# FromCurrencyCode, ToCurrencyCode) as variables.
import clr
clr.AddReference("System")
import urllib, json
from System import DateTime

def main(config):
    # An API key is now required for the free server.
    # Get your free API key: https://free.currencyconverterapi.com/free-api-key
    apiKey = "[YOUR_API_KEY]"

    # Build the "FROM_TO" currency pair, e.g. "USD_EUR".
    fromCode = str(FromCurrencyCode)
    toCode = str(ToCurrencyCode)
    from_to_codes = fromCode + "_" + toCode

    # Format the exchange rate date as YYYY-MM-DD.
    dt = DateTime.Parse(str(CurrencyRateDate))
    convert_date = str(dt.Year) + "-" + str(dt.Month).zfill(2) + "-" + str(dt.Day).zfill(2)

    # The free server only accepts dates up to 1 year in the past.
    url = "http://free.currconv.com/api/v7/convert?q=" + from_to_codes \
        + "&date=" + convert_date + "&compact=ultra" + "&apiKey=" + apiKey

    jsonString = urllib.urlopen(url).read()
    data = json.loads(jsonString)

    if from_to_codes not in data:
        # Return the raw response so the error is visible in the grid.
        return jsonString

    return str(data[from_to_codes][convert_date])
```

Do the same for the EndOfDayRate column. The Python script is as follows:

```python
# Fetches the end-of-day rate for the row's date from the fixer.io
# historical rates endpoint.
import clr
clr.AddReference("System")
import urllib, json
from System import DateTime

def main(config):
    dt = DateTime.Parse(str(CurrencyRateDate))
    n1 = str(FromCurrencyCode)
    n2 = str(ToCurrencyCode)

    if not n1 or not n2:
        return "N/A"

    # Historical rates for the given date, quoted against the base currency.
    url = "http://api.fixer.io/" + str(dt.Year) + "-" + str(dt.Month).zfill(2) \
        + "-" + str(dt.Day).zfill(2) + "?base=" + n1 + "&symbols=" + n1 + "," + n2

    data = json.loads(urllib.urlopen(url).read())

    if "rates" not in data or n2 not in data["rates"]:
        return "N/A"

    return data["rates"][n2]
```

Now you can see that we have real currency exchange rates. Keep in mind that the scripts we have provided are only an example. The API requirements change on the provider’s side from time to time, so we cannot keep the scripts up to date at all times; make sure to adjust them according to the requirements of your API provider. [dbForge Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/download.html) is a powerful tool that gives you great flexibility in resolving domain-specific tasks.
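For readers experimenting outside the generator, the JSON-parsing step of the scripts above can be reproduced in plain Python 3. The payload below is a made-up sample shaped like a fixer.io-style historical-rates response; the field values are assumptions for the demo, not a live API result.

```python
import json

# Hypothetical response shaped like a fixer.io-style historical-rates
# payload; a real response would come from the API over HTTP.
sample_response = '{"base": "USD", "date": "2016-07-01", "rates": {"USD": 1.0, "EUR": 0.9009}}'

def extract_rate(json_string, to_currency):
    """Pull a single exchange rate out of a rates payload,
    returning "N/A" when the payload has no usable rate."""
    data = json.loads(json_string)
    rates = data.get("rates")
    if not rates or to_currency not in rates:
        return "N/A"
    return rates[to_currency]

print(extract_rate(sample_response, "EUR"))  # 0.9009
print(extract_rate(sample_response, "JPY"))  # N/A
```

Guarding for a missing "rates" key mirrors the fallback in the generator scripts: a failed lookup yields "N/A" instead of raising inside the generation run.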
Tags [data generator](https://blog.devart.com/tag/data-generator) [SQL Server](https://blog.devart.com/tag/sql-server) [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) Product manager at Devart [How To](https://blog.devart.com/category/how-to)
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) Getting Started with MySQL Database Designer By [dbForge Team](https://blog.devart.com/author/dbforge) March 24, 2009 Contents What Is Database Designer? How to Create a Database Diagram Navigating the Diagram How to Quickly Print Diagram Saving and Restoring the Diagram What Is Database Designer? dbForge Studio incorporates [MySQL Database Designer](https://www.devart.com/dbforge/mysql/studio/database-designer.html) — a powerful visual design tool for database development that allows you to build a clear and effective database structure visually and to see the complete picture representing all the tables, foreign key relations between them, views, and stored routines of your database on a database diagram. Database Designer lets you view and edit your database in a convenient visual way: observe all the objects of your database, see foreign key relations between tables, view database object information, and easily access database objects for editing, retrieving data, and executing stored routines. The popup menu of a database object on the diagram provides access to the same actions as the popup menu of that object in the Database Explorer. Shapes representing database objects can be moved (by dragging them with the mouse or via the keyboard), resized, and aligned to the diagram grid. A diagram can be saved for future use. With Database Designer you can also reverse-engineer your databases into IDEF1X or IE diagrams, which can be easily printed. Note: The diagram does not store information about database object details offline. You need an open connection to the database containing the diagram objects to work with the diagram. How to Create a Database Diagram To create a database diagram, perform the following steps. Click the New Database Diagram button on the Standard toolbar.
Drag database objects from the Database Explorer to the diagram. (Or you may create new database objects on the diagram using the MySQL Database Diagram toolbar.) You may click the Layout Diagram button on the Layout toolbar to lay out the diagram for a more convenient view. (Or you can lay out the diagram manually by dragging shapes and moving the relations. You may also use containers to organize your diagram.) You may add a stamp to your diagram that displays the company and project names, diagram author, version, date, and copyrights. To add the stamp, click the New Stamp button on the MySQL Database Diagram toolbar. To edit stamp fields, double-click them and enter values. You may also add notes to your diagram to describe its objects. Save the diagram for future use. Navigating the Diagram You can navigate the diagram in several ways: Using scrollbars. Moving the mouse while holding the middle mouse button. Using the Diagram Overview window. Just drag the square visible area on the Diagram Overview window with your mouse. To display the Diagram Overview window, select Other Windows -> Diagram Overview from the View menu or press CTRL+V, W. You can zoom the diagram in and out to simplify navigation. To quickly navigate to database objects, you may use the diagram search. How to Quickly Print Diagram To print the diagram, perform the following steps: Click the Display Print Markup button on the Diagram toolbar. The print markup grid will be displayed. Gray stripes show the page overlapping. Lay out the diagram for a convenient view, considering the print markup. Select Print from the File menu. After you have printed the diagram, you may turn off the print markup by choosing Print Markup from the diagram popup menu. Saving and Restoring the Diagram Diagrams can be saved and restored just like any other document by using the Save and Open buttons on the Standard toolbar. The diagram does not store information about database object details offline.
So, when you open the diagram, dbForge Studio for MySQL tries to establish a connection to the MySQL database containing the objects of the diagram. If a connection cannot be established, the diagram cannot be opened. The diagram stores the following information when saved to a file: layout information; information about containers, notes, and stamps; connection information; the names and owners of the diagram's database objects; and the names of the foreign keys displayed on the diagram as relations. If the diagram's database was modified after the diagram was saved, Database Designer updates the information about all the database objects displayed on it when opening. However, objects newly created in the database do not appear on the diagram, and if a diagram object was deleted from the database, it is deleted from the diagram. Tags [database diagram](https://blog.devart.com/tag/database-diagram) [MySQL](https://blog.devart.com/tag/mysql) [dbForge Team](https://blog.devart.com/author/dbforge)
[Products](https://blog.devart.com/category/products) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Getting Started With dbForge Source Control By [dbForge Team](https://blog.devart.com/author/dbforge) July 18, 2022 [dbForge Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/) is an SSMS add-in that enables database version control and is compatible with the biggest version control systems, including Git, Mercurial, SVN, TFVC, Azure DevOps Server, Perforce, and SourceGear Vault. With its help, you can retrieve, commit, and revert changes in your SQL Server databases, resolve conflicts, view data and schema differences in your local and remote repositories, work with multiple branches, and do much more directly from SSMS. In this article, we'll show you how to get started with the basic operations in Source Control quickly and effortlessly.
CONTENTS Download and install dbForge Source Control Link a database to a repository Link static data Retrieve the latest version Commit changes Undo changes Resolve conflicts View the history of changes Download Source Control for a free trial Download and install dbForge Source Control Source Control can be downloaded as part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) , a bundle of 15 SSMS add-ins and standalone apps that cover different aspects of SQL Server development, management, and administration. So, first off, [download](https://www.devart.com/dbforge/sql/sql-tools/download.html) the bundle from our official website, run the installation file, and let the wizard do the rest. In case you get stuck during the installation, feel free to consult this brief step-by-step video guide. Once the installation is completed, Source Control will be accessible from SSMS. Link a database to a repository Let's start our exploration by linking your database to a repository. In our case, it will be Git , the most popular version control system. As for the prerequisites, you need to have the Git client installed on your Windows machine and a Git repository created locally or cloned from a remote repository. Once that is done, you can proceed to link your databases. To link a database to a Git repo, take the following steps: 1. In the SSMS Object Explorer , right-click the required database and select Source Control > Link database to Source Control from the shortcut menu. 2. The Link Database to Source Control wizard opens. On the Link page, click + in the Source control repository field. Note: Refer to our documentation to learn more about the [dedicated development model](https://docs.devart.com/source-control/linking-to-source-control/dedicated-model.html) and the [shared development model](https://docs.devart.com/source-control/linking-to-source-control/shared-model.html) . 3.
In the Source Control Repository Properties dialog that opens, select Git from the Source control system drop-down list and provide a path to your local Git repository. Note: Check our comprehensive guides to setting up and managing version control using the following solutions: [How to version-control SQL Server databases in GitHub](https://www.devart.com/dbforge/sql/source-control/version-controlling-databases-in-github.html) [How to version-control SQL Server databases in GitLab](https://www.devart.com/dbforge/sql/source-control/how-to-set-up-source-control-for-gitlab.html) [How to version-control SQL Server databases in Azure DevOps](https://www.devart.com/dbforge/sql/source-control/version-controlling-git-in-azure-devops.html) [How to version-control SQL Server databases in Bitbucket](https://www.devart.com/dbforge/sql/source-control/how-to-set-up-source-control-for-bitbucket.html) 4. Click Test to check the connection. Then click OK to close the dialog. 5. Back in Link Database to Source Control , select the preferred database development model and click Link . If the linking is successful, you will see the corresponding icon in Object Explorer; it indicates that your database is linked to Source Control. That's it! Note: Let us reiterate that Git is not the only version control system supported by dbForge Source Control.
Refer to our documentation for the following topics: [Linking to SVN](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-svn.html) [Linking to TFVC](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-tfs.html) [Linking to Mercurial](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-mercurial.html) [Linking to Perforce](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-perforce.html) [Linking to SourceGear Vault](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-sourcegear-vault.html) [Linking to Plastic SCM](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-plastic-scm.html) [Linking to a working folder](https://docs.devart.com/source-control/linking-to-source-control/linking-db-to-working-folder.html) Link static data Source Control also lets you link and version-control static table data. And, since static data can have an impact on database performance and proper functioning, keeping track of changes in it is a good idea. To link static data to Source Control, take the following steps: 1. In the SSMS Object Explorer , right-click the database containing the required tables with static data. Point to Source Control and select Link / Unlink Static Data . 2. The Source Control Link Static Data dialog opens. Select the required tables with static data and click Apply . Please note that you can link and source-control only tables with properly defined primary keys. Afterwards, the Refresh dialog opens automatically, showing the progress of each stage. 3. Once the refresh is done, Source Control Manager opens, where you can check all the introduced changes in the corresponding tabs. If everything is correct, go to the Local Changes section, select the linked tables with static data, and click Commit . 
Learn more in our documentation: [Linking static data](https://docs.devart.com/source-control/working-with-source-control/linking-static-data.html) Retrieve the latest version To get the latest changes from Source Control, do the following: 1. In the SSMS Object Explorer , right-click the linked database that you need to update, point to Source Control , and click Get Latest . As in the previous case, the Refresh dialog opens automatically, showing the progress of each stage. 2. Once the refresh is done, Source Control Manager opens. In the Remote Changes section, select the objects and/or data that you need to update and click Get Latest . Note that if you select only one of several related objects, then, upon clicking Get Latest , the Dependencies window opens with a suggestion to include all affected objects. 3. If everything is correct, upon clicking Get Latest , you will see the Get Latest dialog that shows the progress. Once it is completed, click OK . Done! Now your local version is updated with the latest changes. Learn more in our documentation: [Getting the latest version](https://docs.devart.com/source-control/working-with-source-control/getting-the-latest-version.html) Commit changes To commit changes to your repository, do the following: 1. In the SSMS Object Explorer , right-click the required linked database, point to Source Control , and click Commit . Afterwards, the Refresh dialog opens automatically, showing the progress of each stage. 2. Once the refresh is done, Source Control Manager opens. In the Local Changes section, select the objects and/or data that you want to commit. Note that if you select only one of several related objects, then, upon clicking Commit , the Dependencies window opens with a suggestion to include all affected objects. 3. In the text box, write a comment describing your commit. This will help your fellow developers understand what it's about.
Then check the changes to make sure everything is right. 4. Click Commit . The corresponding dialog opens, showing the progress of the commit operation. Once it is completed, click OK . That's it! Now your repository is updated with your local changes. Learn more in our documentation: [Committing changes](https://docs.devart.com/source-control/working-with-source-control/committing-changes.html) Undo changes You can undo changes that have been made to database objects but have not been committed yet. However, note that the undo operation leads to permanent changes in your databases; these changes can be reverted only by restoring a previously saved database backup. To undo changes, take the following steps: 1. In the SSMS Object Explorer , right-click a linked database or a specific database object, point to Source Control , and click Show Source Control Manager . 2. In the Local Changes section of Source Control Manager , select the changes that you want to undo and click Undo . 3. After the process is completed, click OK . Now your local version is reverted to the latest changes from your repository. Take note that you cannot undo changes that have been committed, changes in static data, or dropped data. Learn more in our documentation: [Undoing changes](https://docs.devart.com/source-control/working-with-source-control/undoing-changes.html) Resolve conflicts A conflict occurs when two or more people simultaneously introduce changes to the same database object. In Source Control, conflicts are displayed in the corresponding section of Source Control Manager. To resolve a conflict in Source Control, do the following: 1. In the SSMS Object Explorer , right-click a linked database or a specific database object, point to Source Control , and click Show Source Control Manager . 2.
Select the conflicting object or data and pick either of the following ways to resolve your conflict: Get Local – your version of the object or data will be committed to Source Control Get Remote – your changes will be discarded, and your local database will be updated with the latest version of the object or data from Source Control That's it! But keep in mind that data changes may not apply without the related schema changes. If you have applied both schema and data changes to a database object, you need to commit them simultaneously. Similarly, if you are pulling someone else's schema and data changes, pull them simultaneously. That said, if you have a schema conflict and a data change on an object, the change cannot be committed or retrieved without resolving the schema conflict first. Learn more in our documentation: [Resolving conflicts](https://docs.devart.com/source-control/working-with-source-control/resolving-conflicts.html) View the history of changes Changes in Source Control can be tracked using the Changes History, which lets you do the following: View the history of changes for entire databases or separate objects View the details of each commit, including revision ID, date, author, and comments View a list of objects modified in each commit View DDL differences for each object Compare two revisions To open the Changes History, do the following: 1. In the SSMS Object Explorer , right-click a linked database or a specific database object, point to Source Control , and click View Changes History . 2. If you want to compare two separate revisions, press and hold CTRL and select your revisions from the list. The differences will be highlighted in red.
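Conceptually, comparing two revisions of an object comes down to a plain text diff of its DDL. A rough Python illustration of that idea (not dbForge's comparison engine; the view definitions below are made up for the demo) using the standard library's difflib:

```python
import difflib

# Two hypothetical revisions of the same view definition.
revision_1 = """CREATE VIEW dbo.SalesSummary AS
SELECT OrderID, Total
FROM dbo.Orders;"""

revision_2 = """CREATE VIEW dbo.SalesSummary AS
SELECT OrderID, Total, Region
FROM dbo.Orders;"""

# unified_diff marks removed lines with "-" and added lines with "+",
# which a history viewer would render as highlighted text.
diff = list(difflib.unified_diff(
    revision_1.splitlines(),
    revision_2.splitlines(),
    fromfile="revision 1",
    tofile="revision 2",
    lineterm="",
))
print("\n".join(diff))
```

The output shows the changed SELECT line twice: once prefixed with "-" (the old revision) and once with "+" (the new one), which is exactly the information a side-by-side revision view highlights.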
Learn more in our documentation: [Viewing Source Control history](https://docs.devart.com/source-control/working-with-source-control/viewing-source-control-history.html)

Download Source Control for a free trial

Now that you know all about the basic operations in dbForge Source Control, you are all set to try it yourself. And there's no better way to start than to [download Source Control for a free 30-day trial](https://www.devart.com/dbforge/sql/source-control/download.html) right now. Since it is available as part of the SQL Tools bundle, we suggest you check the other tools as well; we bet you will have a lot of useful discoveries.
[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools)

Git Checkout and dbForge Source Control – Comprehensive Guide

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) September 12, 2023

Git is one of the most popular version control systems, ideal for handling both small and large projects. A standout feature of Git is its branching model, which allows users to create multiple independent branches and manage various aspects of a project without jeopardizing the work of others. The ability to switch between these branches effortlessly is provided by the git checkout command. In this article, we will delve into this command.

Contents

- Understanding Git checkout
- Detached HEAD in Git
- Overview of dbForge Source Control
- Linking SQL database to a Git repository with dbForge Source Control
- Switching between branches in dbForge Source Control
- Conclusion

Understanding Git Checkout

The most frequent scenario involves using git checkout to transition between branches, but it also applies to switching between separate commits. The command for switching to another branch is:

git checkout <branch-name>

Shifting to a specific commit is accomplished with:

git checkout <commit-hash>
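The basic branch switch can be tried out in a throwaway repository; the sketch below sets one up from scratch (all branch names and commit messages are illustrative):

```shell
# Create a disposable repository to experiment in.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "initial commit"

git branch hotfix                 # create a second branch
git branch                        # list local branches
git checkout hotfix               # switch to the hotfix branch
git rev-parse --abbrev-ref HEAD   # prints: hotfix
```

Every command after the setup lines mirrors the usage described in the article: list branches with git branch, then move between them with git checkout.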
Let's delve deeper into this command and explore its various use cases.

Switching to a different existing branch

When your work repository contains several existing branches, you can seamlessly transition between them using the git checkout command. For instance, begin by checking the available branches and their names:

git branch

Now, you can switch to the main branch using the git checkout command:

git checkout main

All subsequent commits will be recorded within the context of the main branch, where you can create new branches to perform specific tasks.

Switching to a new branch

The git checkout command often works together with the git branch command, especially when you start working on a new branch. You first employ the git branch command to create a new branch, followed by git checkout to switch to the newly created branch. However, instead of using two distinct commands, you can extend the git checkout command with the -b flag:

git checkout -b <new-branch-name>

With this option, Git will automatically execute the git branch command to create the new branch and immediately switch you to it:

git checkout -b hotfix

The above command creates a new branch called hotfix and switches the user to it.

Switching to a remote branch

Teams commonly use remote repositories to store their sets of branches. When you need to switch to a branch from such a remote repository, the process is the same as when switching to a different local branch. To begin, retrieve the contents of the remote repository using the command:

git fetch --all

Then, execute the git checkout command to switch to the desired remote branch:

git checkout <remote-branch-name>

Note: This method is supported by current versions of Git. If you're working with older Git versions, you'll need to first create a local branch based on the corresponding remote branch. All checkout operations are recorded in the reference log (reflog), along with details about each commit.
To inspect the reflog and gain insights into branch switching history, use the following command:

git reflog

Keep in mind that the reference log maintains information only for a limited period. You can check and modify this retention period in the Git settings.

Detached HEAD in Git

Now that we've examined the typical use cases of the git checkout command for working with branches, let's delve into the issue of a detached HEAD. A detached HEAD occurs when the HEAD pointer is not attached to any specific branch in the repository – it points to a specific commit instead. Consequently, your current work becomes isolated from the rest of your project and its history. No branch will reference it, and Git won't include it in the commit history. In other words, you'll be able to access the work committed from this detached state only if you know its exact hash value.

How to Resolve the Detached HEAD Issue

The detached HEAD state isn't an error – it is part of the standard process, for example, when you need to go back several commits to test certain functionality. Intentionally entering the detached HEAD state becomes problematic only if you begin committing changes without first creating a branch. If you find yourself in a detached HEAD state, you can address it in two ways:

Create a new branch from the current commit

This is the ideal solution if you plan to continue work from a specific commit. Even if you begin committing immediately, you can always create a new branch afterwards. Commits maintain their hierarchical relationship: each commit connects to the previous one, allowing you to restore the correct sequence. Use the standard command:

git checkout -b <new-branch-name>

Still, make sure to create the new branch before switching to another commit or branch.

Return to the previous (or another) branch in Git

If you entered the detached HEAD state merely to view a specific commit, you might want to return to the previous branch.
The following command can accomplish this:

git checkout -

Git will switch you to the branch that you were on before the current one, taking you out of the detached HEAD state. Alternatively, specify the desired branch:

git checkout <branch-name>

This command allows you to quickly switch from the detached HEAD state to any branch and continue working as usual. In Git, branches can be created and managed using the command-line utility or other dedicated tools. However, if you're working with SQL Server databases, it's beneficial to use the familiar environment of SQL Server Management Studio (SSMS). This is made possible by dbForge Source Control.

Overview of dbForge Source Control

[dbForge Source Control](https://www.devart.com/dbforge/sql/source-control/) is an add-in for SSMS that is available as a part of the [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) package or as a feature of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/studio-sql.html). Source Control serves as a client for your chosen version control system and can be naturally integrated into your DevOps processes. This solution allows you to do the following:

- Connect databases to the version control system and manage both database schemas and table data
- Perform the essential tasks visually: get the latest changes, commit and roll back changes, and handle conflicts
- Access the full history of revisions related to the entire database or particular objects

dbForge Source Control supports both shared and dedicated work modes and, besides Git (with GitHub, GitLab, and Bitbucket), is compatible with all major version control systems, such as Azure DevOps Server, Apache Subversion (SVN), Mercurial (Hg), Perforce (P4), TFVC, and SourceGear Vault.
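Before moving on to SSMS, the detached HEAD workflow described above can be put together in one throwaway-repo sketch (branch names and commit messages are illustrative):

```shell
# Disposable repository with two commits to step back through.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "first"
git commit -q --allow-empty -m "second"

git checkout -q HEAD~1            # check out a commit, not a branch
git rev-parse --abbrev-ref HEAD   # prints: HEAD (i.e. detached)

git checkout -q -b rescue         # recover: anchor a new branch here
git rev-parse --abbrev-ref HEAD   # prints: rescue
```

Any commits made after creating the rescue branch are safely referenced by it, which is exactly the recovery path recommended above.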
Linking a SQL Server database to a Git repository with dbForge Source Control

When you have the Git for Windows client on your machine and a local Git repository created or cloned from the remote repository, you can link your database to the repository using dbForge Source Control:

1. In SSMS, select the necessary database in Object Explorer and right-click it. Choose Source Control > Link database to Source Control.
2. In the wizard, click the + icon next to Source control repository.
3. Select Git and specify the path to the local Git repository (you need to clone it first).
4. Click Test to make sure your settings are correct, and click OK to finish the configuration.
5. Click Link.

After that, the database in Object Explorer will be marked as linked. Now you can work on your database, perform all the necessary tasks, and commit/revert changes directly from SQL Server Management Studio.

Switching between branches in dbForge Source Control

After linking your database to the version control system, you can access it from within SSMS. By default, you work with the current branch of the repository. Note the available options. However, the git checkout command is not supported by the Source Control functionality directly. To switch between branches and create new branches, you need to use the command line or other tools created for Git. If you need to switch to another branch, run the git checkout command in the standard manner, as you prefer in your workflow. After that, to synchronize the switch with your SSMS interface, unlink and relink the database:

1. Go to Object Explorer, right-click the necessary database, select Source Control, and proceed to Unlink Database from Source Control. Then click Yes.
2. Right-click the database again and repeat the same steps to get to the Source Control menu. Choose Link database to Source Control as before, and relink the database.
Now you can define the location of your new branch and work on it effectively from within the SSMS interface. As you can see, the current branch has changed.

Conclusion

Thanks to Git's support for branching, developers can work more safely and conveniently, focusing on their tasks without the danger of impacting work in other branches. In this context, git checkout lets developers switch to the desired branch swiftly and avoid the potentially hazardous detached HEAD state. For developers favoring SQL Server Management Studio, dbForge Source Control provides direct access within the SSMS interface to options like committing changes to version control systems (Git and others), reverting changes, and resolving conflicts. This eliminates the need for external tools. Once you've switched to the desired branch, you can focus on your tasks and manage changes efficiently. You can test the Source Control capabilities under a full workload and decide whether to install only Source Control or include the other SQL Tools to transform your SSMS into a much more robust and adaptable system. A [30-day fully functional trial](https://www.devart.com/dbforge/sql/sql-tools/download.html) allows you to assess the SQL Tools' capabilities across all areas, including DevOps. dbForge Source Control and its complementary tools can streamline and safeguard your development and deployment processes.

[Julia Lutsenko](https://blog.devart.com/author/jane-williams)

Julia is a technical writer with a strong background in Linguistics. She specializes in creating clear and well-researched technical content and supports the team in delivering accurate, accessible content across platforms.
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new)

GUI dbForge Studio for MySQL, v4.50.311 Released

By [dbForge Team](https://blog.devart.com/author/dbforge) October 27, 2010

Changes (as compared to 4.50.308) include:

Bug fixes:
* Removed
spaces in the integer and float format in Data Export wizard (T23612)
* Fixed NullReferenceException on saving a database project with opened table editors (T23519)
* Fixed NullReferenceException on deleting a table from a database with an opened inactive editor (T56081)
* Fixed NullReferenceException on auto hiding a tool window (T56033)
* Added possibility to edit a definer for triggers in the trigger editor (T23357)
* Fixed problem with BINARY(16) column type on selecting its data to the Data Editor (T23292)
* Fixed problem with empty gray windows after editing a table in the modal editor from Database Designer (T23164, T23018)
* Fixed link on the first page in the product tour (55629)
* Fixed syntax checking for some constructions in CREATE TABLE statements (55161)
* Fixed lost connection on data comparison (T22521)
* Fixed the impossibility to disable Code Completion in the Express edition (54932)
* Files in database projects are saved with a relative path now (54378)

Downloads: [https://www.devart.com/dbforge/mysql/studio/download.html](https://www.devart.com/dbforge/mysql/studio/download.html)
Ordering: [https://www.devart.com/dbforge/mysql/studio/ordering.html](https://www.devart.com/dbforge/mysql/studio/ordering.html)
[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [What’s New](https://blog.devart.com/category/whats-new)

GUI Tool for MySQL: dbForge Studio for MySQL, v4.50.342 Released

By [dbForge Team](https://blog.devart.com/author/dbforge) March 18, 2011

Changes (as compared to 4.50.339) include:

Bug fixes:
* The error on entering selection criteria after selecting Group By fields in Query Builder fixed (T26884)
* The problem with showing 0 as a parameter value instead of NULL in the parameter editor fixed (T26867)
* Removing field’s autoincrement sign on adding index in the table fixed (T24859)
* Extra new line on copying data from a cell in Data Editor removed (T27021)
* The problem with losing ON UPDATE CURRENT_TIMESTAMP during editing a table fixed (T27090)
* The problems with connecting to MySQL Server via SSH tunnel fixed (T26911, T20079)
* 
InvalidOperationExceptions on generating schema synchronization script in the Schema Comparison tool fixed (T27106, T27108)

Read more about the tool: [https://www.devart.com/dbforge/mysql/studio/](https://www.devart.com/dbforge/mysql/studio/)
Downloads: [https://www.devart.com/dbforge/mysql/studio/download.html](https://www.devart.com/dbforge/mysql/studio/download.html)
[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [ADO.NET Data Providers](https://blog.devart.com/category/products/ado-net-data-providers) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools)

Handle Concurrency Conflicts in ASP.NET Core and Oracle

By [dotConnect Team](https://blog.devart.com/author/dotconnect) December 23, 2021

Concurrency conflicts arise due to concurrent access to a shared resource. This article explores ways to handle concurrency conflicts using ASP.NET Core and Oracle. To connect to Oracle, dotConnect for Oracle will be used.

Prerequisites

You'll need the following tools to work through the code examples:

- Visual Studio 2019 Community Edition ([download](https://visualstudio.microsoft.com/downloads/))
- Oracle Express Edition ([download](https://www.oracle.com/in/database/technologies/xe-downloads.html))
- dotConnect for Oracle ([download](https://www.devart.com/dotconnect/oracle/))

What is Concurrency Handling?

Concurrency handling is a proven technique for detecting and resolving conflicts caused by concurrent requests to the same resource. It enables numerous users to access the same data at the same time while ensuring that the data stays consistent across all subsequent requests. In essence, it helps enforce data integrity and consistency when several concurrent users attempt to access the same resource at the same time.
Concurrency violations might occur when you have interdependent transactions, that is, transactions that rely on one another and attempt to access the same resource. A transaction comprises a collection of statements bundled together that is either guaranteed to be executed in its entirety or rolled back. Transactions contribute to data integrity and security. There are two approaches to handling concurrency conflicts: the optimistic and the pessimistic concurrency strategies. Let's now discuss how each of these works.

Pessimistic Concurrency

Pessimistic concurrency entails locking rows to prevent other users from changing the data, thereby avoiding data inconsistency. This strategy makes a record inaccessible from the time it is fetched until it is updated in the database. Hence, when a record is being modified, all other concurrent changes to the same record are placed on hold until the current operation is completed and the lock on the record is released. This approach is a good choice in environments with high data contention. You can take advantage of pessimistic concurrency in scenarios with short lock durations. However, pessimistic concurrency does not scale well when users interact with data in ways that keep records locked for longer periods.

Optimistic Concurrency

In the optimistic concurrency model, records are not locked; when a user attempts to edit a row, the application detects whether another user has modified the row since it was last read into memory. Optimistic concurrency is often used in scenarios where data contention is low. It enhances efficiency by eliminating the need for record locking, which consumes extra server resources, and it scales better since it allows the server to serve more clients in less time. The optimistic concurrency option follows the "last saved wins" policy, which means that the most recently changed value is stored in the database.
In other words, the most recently saved record "wins." It should be noted that the optimistic concurrency management technique assumes that resource conflicts caused by concurrent access to a shared resource are improbable, but not impossible. With this method, you do not need to check for concurrent modifications to the same resource (i.e., a record in your database table); the record is simply rewritten.

Concurrency violation

In the event of a concurrency violation, the most recent data in the database is re-read, and the update is then attempted again. To check for concurrency violations, you need to ascertain whether the record has changed since the application last read it. For optimistic concurrency control to work properly, your application must first check the row version before proceeding with an update.

Create a new ASP.NET Core Web API Project

Earlier, we mentioned the tools necessary for the practical scenarios. The time has come to use them. First, we need to create a new ASP.NET Core Web API project:

1. Open Visual Studio 2019.
2. Click Create a new project.
3. Select ASP.NET Core Web Application and click Next.
4. Specify the project name and the location to store the project in your system. Optionally, check the Place solution and project in the same directory checkbox.
5. Click Create.
6. In the Create a new ASP.NET Core Web Application window, select API as the project template.
7. Select ASP.NET Core 5 or later as the version.
8. Disable the Configure for HTTPS and Enable Docker Support options (uncheck them).
9. Since we won't use authentication in this example, specify authentication as No Authentication.
10. Click Create to finish the process.

We'll use this project throughout this article.

Implement Concurrency Handling in ASP.NET Core and Oracle

When several users attempt to alter the same piece of data simultaneously, you must have a way to prevent one user's modifications from interfering with other users' changes.
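Before diving into the ASP.NET Core project, the optimistic check described above can be sketched with Python's built-in sqlite3 module as a stand-in for Oracle; the table and values are illustrative, but the core idea carries over: re-check the originally read values in the WHERE clause and treat zero affected rows as a conflict.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table product (id integer primary key, name text, price real)")
con.execute("insert into product values (1, 'Lenovo Laptop', 1000.00)")

# Session A reads the row it intends to change.
old_id, old_name = con.execute("select id, name from product where id = 1").fetchone()

# Session B sneaks in a concurrent update before A writes.
con.execute("update product set name = 'HP Laptop' where id = 1")

# Session A's update re-checks the originally read values in the WHERE clause;
# zero affected rows signals a concurrency violation.
cur = con.execute(
    "update product set name = 'DELL Laptop' where id = ? and name = ?",
    (old_id, old_name),
)
print("conflict detected" if cur.rowcount == 0 else "updated")  # prints: conflict detected
```

This is exactly the shape of the WHERE clause the OracleDataAdapter example below builds with its :oldid and :oldname parameters.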
Concurrency control is the process of detecting and resolving database changes performed by many users concurrently.

Create a new database table

First off, let's create a new database table in Oracle. The following snippet creates a table called product with the columns the application code below relies on:

create table product
(
  id number primary key,
  name varchar2(50) not null,
  price double precision not null
);

Next, insert a few records into the product table using the following script:

insert into product values(1,'Lenovo Laptop',1000.00);
insert into product values(2,'HP Laptop',1000.00);
commit;

Install NuGet Packages

To get started, you should install the dotConnect for Oracle package in your project. You can install it either from the NuGet Package Manager tool inside Visual Studio or from the NuGet Package Manager console using the following command:

PM> Install-Package Devart.Data.Oracle

If the installation is successful, you're all set to start using dotConnect for Oracle.

Creating an OracleConnection

Now provide the Oracle database credentials in your application to establish a connection to the database. You can also make this information configurable by storing it in the application's config file. The code snippet given below illustrates how you can create an instance of OracleConnection:

String connectionString = "User Id=Your user Id; Password=Your password; Server = localhost; License Key = Specify your license key here; ";
OracleConnection oracleConnection = new OracleConnection(connectionString);

You should include the following namespace in your program:

using Devart.Data.Oracle;

Updating Data

Let's now examine how to simulate optimistic concurrency in ADO.NET using dotConnect for Oracle. We'll take advantage of the disconnected mode of operation here. The following code snippet illustrates how you can use the OracleDataAdapter to simulate optimistic concurrency in ADO.NET.
OracleDataAdapter oracleAdapter = new OracleDataAdapter();
oracleAdapter.SelectCommand = new OracleCommand("select * from product", oracleConnection);

DataSet dataSet = new DataSet();
oracleAdapter.Fill(dataSet);

oracleAdapter.UpdateCommand = new OracleCommand("UPDATE product SET id = :id, " +
    "name = :name WHERE id = :oldid AND name = :oldname", oracleConnection);
oracleAdapter.UpdateCommand.Parameters.Add(":id", OracleDbType.Integer, 5, "id");
oracleAdapter.UpdateCommand.Parameters.Add(":name", OracleDbType.VarChar, 50, "name");

OracleParameter parameter = oracleAdapter.UpdateCommand.Parameters.Add(":oldid", OracleDbType.Integer, 5, "id");
parameter.SourceVersion = DataRowVersion.Original;
parameter = oracleAdapter.UpdateCommand.Parameters.Add(":oldname", OracleDbType.NVarChar, 50, "name");
parameter.SourceVersion = DataRowVersion.Original;

dataSet.Tables[0].Rows[0]["name"] = "DELL Laptop";

oracleAdapter.Update(dataSet);

The above code snippet shows how you can set the UpdateCommand of the OracleDataAdapter to test for optimistic concurrency. Note how the WHERE clause of the update command is used to check for any optimistic concurrency violations that might have occurred.

Create a class named Product as shown below:

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public double Price { get; set; }
}

We'll use this class as our model in this example.

Create an Action Method in the ProductController Class

Create a new controller class in your project named ProductController. The following code listing illustrates how you can write a controller action method to update a record in the product table.
[HttpPut]
public IActionResult Put([FromBody] Product product) {
    string connectionString = "User Id=Your user Id;Password=Your password; Server = localhost; License Key = Your License Key";

    try {
        using (OracleConnection oracleConnection = new OracleConnection(connectionString)) {
            try {
                OracleDataAdapter oracleAdapter = new OracleDataAdapter();
                oracleAdapter.SelectCommand = new OracleCommand("select * from product where id = :id", oracleConnection);
                oracleAdapter.SelectCommand.Parameters.Add("id", product.Id);

                DataSet dataSet = new DataSet();
                oracleAdapter.Fill(dataSet);

                if (dataSet.Tables[0].Rows.Count == 0) return NotFound();

                oracleAdapter.UpdateCommand = new OracleCommand("update product set name = :name " +
                    "WHERE id = :oldid AND name = :oldname", oracleConnection);
                oracleAdapter.UpdateCommand.Parameters.Add(":name", OracleDbType.VarChar, 50, "name");

                OracleParameter parameter = oracleAdapter.UpdateCommand.Parameters.Add(":oldid", product.Id);
                parameter.SourceVersion = DataRowVersion.Original;

                parameter = oracleAdapter.UpdateCommand.Parameters.Add(":oldname", OracleDbType.VarChar, 50, "name");
                parameter.SourceVersion = DataRowVersion.Original;

                dataSet.Tables[0].Rows[0]["name"] = product.Name;

                oracleAdapter.Update(dataSet);
            } catch (DBConcurrencyException ex) {
                Debug.WriteLine(ex.Message);
                return BadRequest("Db Concurrency Violation");
            }

            if (oracleConnection.State != ConnectionState.Closed) oracleConnection.Close();
        }
    } catch (Exception ex) {
        Debug.WriteLine(ex.Message);
        return BadRequest("Error...");
    }

    return Ok("1 record updated...");
}

Remember to include the following namespace in your ProductController.cs file:

using Devart.Data.Oracle;

Execute the Application

Launch the Postman tool and execute the endpoint after specifying the URL as shown below. Note the text message "1 record updated…" in the response.
Set a breakpoint at the call to the Update method of the OracleDataAdapter class in your action method. Now, hit the same endpoint once again. Run the following commands to update the record explicitly:
update product set name = 'DELL Laptop' where id = 1;
commit;
Now, when you press F10, you’ll be presented with the following exception: Once you press F5, you’ll be able to see the response returned in Postman as shown below: Summary In this post, we’ve implemented concurrency handling using OracleDataAdapter in disconnected mode. You can also implement concurrency in connected mode and take advantage of row versioning by using a Timestamp column in your database. Once a record is updated, its row version changes, so you can check whether the row version of a record has changed during an update operation on the database. Tags [dotconnect](https://blog.devart.com/tag/dotconnect) [oracle tools](https://blog.devart.com/tag/oracle-tools) [dotConnect Team](https://blog.devart.com/author/dotconnect) [https://www.devart.com/dotconnect/](https://www.devart.com/dotconnect/) The dotConnect Team is a group of experienced .NET developers at Devart who specialize in building and supporting dotConnect data providers. They share practical insights, coding tips, and tutorials on .NET development and database connectivity through the Devart blog.
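The row-version approach mentioned in the summary can be sketched in the same spirit (again Python with sqlite3 as a stand-in; the row_version column and the update_name helper are hypothetical names for illustration — the idea of a version column that changes on every update is what carries over to a Timestamp column in Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, row_version INTEGER)")
conn.execute("INSERT INTO product VALUES (1, 'Lenovo ThinkPad E14', 1)")

def update_name(conn, product_id, new_name, expected_version):
    # The UPDATE both checks and bumps the version in a single statement,
    # so a writer holding a stale version affects zero rows.
    cur = conn.execute(
        "UPDATE product SET name = ?, row_version = row_version + 1 "
        "WHERE id = ? AND row_version = ?",
        (new_name, product_id, expected_version))
    return cur.rowcount == 1

# Both writers read version 1; the first one wins and bumps it to 2.
assert update_name(conn, 1, "DELL Laptop", 1) is True
# The second writer still holds version 1 and is rejected.
assert update_name(conn, 1, "HP EliteBook", 1) is False
```

Compared with matching every original column value, a single version column keeps the WHERE clause short no matter how wide the table is.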
"} {"url": "https://blog.devart.com/happy-august-with-devart.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) Celebrate the Last Summer Month with Devart!
By [dbForge Team](https://blog.devart.com/author/dbforge) August 1, 2018 [0](https://blog.devart.com/happy-august-with-devart.html#respond) 9201 We are excited to announce that the traditional month of fun, discounts, and gifts from Devart has just begun! August is the pinnacle of summer, when we all have the last chance to enjoy all the riches and wonders of the warm season this year. It is also the month when Devart celebrates its birthday. On this occasion, we are happy to offer a 10% discount on all new Devart licenses! What is more, each purchase will automatically participate in the prize draw that will be held at the beginning of September. We have also prepared a bit of fun – pass a quiz and get a special reward! May your August be great and full of awesome moments! Tags [devart](https://blog.devart.com/tag/devart) [discounts](https://blog.devart.com/tag/discounts) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/hello-world.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Welcome! By [dbForge Team](https://blog.devart.com/author/dbforge) November 25, 2008 [0](https://blog.devart.com/hello-world.html#respond) 2919 Welcome to the Devart development blog of the dbForge product family. We’ll publish articles here on development, product usage tips, issue workarounds, etc. for MySQL, Oracle, and SQL Server.
"} {"url": "https://blog.devart.com/help-santa-and-get-rewarded.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [Products](https://blog.devart.com/category/products) Help Santa and Get Rewarded! By [ALM Team](https://blog.devart.com/author/alm) December 18, 2018 [0](https://blog.devart.com/help-santa-and-get-rewarded.html#respond) 7543 We are excited to inform all our users that the traditional winter season of discounts and gifts from Devart has just begun! In praise of the upcoming winter holidays, we are more than happy to give you a 10% discount on all Devart products. In addition, we welcome you to participate in our Christmas quiz. Just help Santa deal with SQL queries and get a cool reward!
The Devart team wishes all of you great Christmas shopping and lots of fun during the forthcoming holidays! Tags [devart](https://blog.devart.com/tag/devart) [discounts](https://blog.devart.com/tag/discounts) [ALM Team](https://blog.devart.com/author/alm)
"} {"url": "https://blog.devart.com/here-come-dbforge-tools-for-postgresql-3-2-extended-connectivity-enhanced-code-completion-smart-data-generators-and-more.html", "product_name": "Unknown", "content_type": "Blog", "content": "[PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) [Product Release](https://blog.devart.com/category/product-release) [What’s New](https://blog.devart.com/category/whats-new) Here Come dbForge Tools for PostgreSQL 3.2: Extended
Connectivity, Enhanced Code Completion, Smart Data Generators, and More By [dbForge Team](https://blog.devart.com/author/dbforge) December 4, 2024 [0](https://blog.devart.com/here-come-dbforge-tools-for-postgresql-3-2-extended-connectivity-enhanced-code-completion-smart-data-generators-and-more.html#respond) 1037 We’ve got another big update! This time it’s all about PostgreSQL; the entire product line has been polished and augmented to make your experience with PostgreSQL fuller in every way—this includes new data sources, improved suggestions, a slew of new smart data generators, and quite a few other tricks to make your daily work as easy as ever. Without further ado, let’s check everything feature by feature! Contents Connectivity Code Completion Query History Data Editor Schema Compare Data Compare Data Generator Database Explorer Script Generation SQL Document Devart Academy Connectivity First off, you can freely use the Studio with the recently released PostgreSQL 17. Additionally, we have expanded the connectivity of our tools to cover a few new cloud services. Among them, you get AlloyDB, a fully managed PostgreSQL-compatible database service. You can also work with Supabase, an open-source database infrastructure built on PostgreSQL that has recently seen a considerable surge in popularity. It boasts a simple setup process and an intuitive user interface. The list of newly supported services also includes Google Cloud and Azure Cosmos DB for PostgreSQL. Finally, we’ve added support for the rds-ca-rsa2048-g1 SSL/TLS certificate for connecting to PostgreSQL servers on Amazon Web Services. Now, if you are using any of these services, you are free to get dbForge Studio to connect to your PostgreSQL databases there and manage them with ease and convenience. Code Completion Our next stop is code completion. Here, first of all, we’d like to mention that we have added parameter info and quick info for functions and procedures, displayed as you type.
Next, we’ve added suggestions for foreign key columns in ON DELETE SET NULL and ON DELETE SET DEFAULT clauses. We have also added suggestions for columns declared in scripts for foreign keys within CREATE TABLE and ALTER TABLE clauses. The behavior of code completion for DDL triggers has been improved as well. Here’s what’s been updated for event triggers:
- In CREATE EVENT TRIGGER clauses, you get suggestions for WHEN TAG IN and DDL events.
- In ALTER EVENT TRIGGER clauses, you get suggestions for existing triggers, ENABLE REPLICA and ENABLE ALWAYS, as well as potential owners.
- In DROP EVENT TRIGGER clauses, you get suggestions for existing triggers.
And here’s what’s been updated for DML triggers:
- In CREATE TRIGGER clauses, you get trigger suggestions after CREATE OR REPLACE; suggestions for FOR EACH ROW and FOR EACH STATEMENT; and suggestions for DEFERRABLE INITIALLY DEFERRED and DEFERRABLE INITIALLY IMMEDIATE.
- In ALTER TRIGGER clauses, you get suggestions for triggers alongside the tables and views that the said triggers are based on.
Query History Next, you get a redesigned Query History (formerly known as Execution History). It has a new toolbar that includes an updated range selection and a handy Clear button that helps clear the history right away. Data Editor Now, let’s take a look at Data Editor, where you can export a selection of data directly from the grid to an Excel file via the shortcut menu Copy Data As > EXCEL > To File. You also have the option to set the value of a cell to a unique identifier by selecting Set Value To > Unique Identifier from the shortcut menu. Schema Compare Next comes Schema Compare, where we have implemented support for foreign key columns in ON DELETE SET NULL and ON DELETE SET DEFAULT clauses. We have added support for the comparison and synchronization of DML and event (DDL) triggers. The comparison and synchronization of UNIQUE constraints has been improved, and we’ve also added full support for CHECK constraints.
Finally, we’ve made automated generation of comparison reports much more convenient with the familiar Save Command Line button, accessible directly in the Comparison Report wizard. And if you take a look at the Command line execution file settings dialog, you will see plenty of new customization options. You can check them in the following screenshot. Data Compare This case is similar to the previous one: when generating a comparison report in Data Compare, you can use the same Save Command Line button in the wizard to create a command-line script for recurring comparison and reporting operations. The Command line execution file settings dialog is just as replete with new options. Note that both [Schema Compare](https://www.devart.com/dbforge/postgresql/schemacompare/) and [Data Compare](https://www.devart.com/dbforge/postgresql/datacompare/) for PostgreSQL are available as standalone applications, and the abovementioned features apply to them as well. Data Generator In [Data Generator](https://www.devart.com/dbforge/postgresql/studio/data-generator.html), there are lots of pleasant surprises as well. For instance, you get a slew of new basic generators, which include Files Folder, Lorem Ipsum, Shuffled Text, and Text File. Moreover, you’ve got a handful of new smart generators in categories such as Business, IT, Payment, Location, Product, Health, and Personal. That’s quite a lot to explore, and we hope these generators will come in handy. Finally, in Data Population Wizard, you get a new option that appends a timestamp to the name of the file that you save your data population script to. Database Explorer In Database Explorer, you can delete database objects directly from the shortcut menu. Script Generation If you are using Amazon Redshift, you can generate the DDL of a materialized view via the familiar Generate Script As menu.
SQL Document From now on, you can use shortcuts to duplicate, remove, and join current lines in your SQL documents. Quick access to Devart Academy Let’s conclude our journey with a link to [Streamlining PostgreSQL Management With dbForge Studio](https://www.devart.com/academy/postgresql-studio/), a Devart Academy course that helps you master PostgreSQL by means of the Studio. Now you can access it directly from the Studio via Help > Demos and Video Tutorials. By the way, besides YouTube videos, we have a dedicated [PostgreSQL Tutorial](https://blog.devart.com/postgresql-tutorial.html) page on our blog, where we have assembled a collection of useful Postgres-related articles that might come in handy. Feel free to check it out! Get the updated dbForge tools for PostgreSQL 3.2 today! The cornerstone of our PostgreSQL product line is [dbForge Studio for PostgreSQL](https://www.devart.com/dbforge/postgresql/studio/), our flagship IDE that basically combines everything you might need for PostgreSQL development and management. You can give the updated Studio a go right away by [downloading it for a free 30-day trial](https://www.devart.com/dbforge/postgresql/studio/download.html), which is a perfect way to scrutinize and test its capabilities. And if you’re already using the Studio, the update is already waiting for you. Let us know what you think about it; we’ll be glad to hear your feedback and new feature suggestions. Finally, let us remind you that the updated dbForge Studio for PostgreSQL is available as part of [dbForge Edge](https://www.devart.com/dbforge/edge/), our multidatabase solution that covers a number of other database systems (including SQL Server, MySQL, MariaDB, and Oracle Database) and cloud services. The solution comprises four feature-rich IDEs that are a perfect fit for beginners and power users alike. Don’t take our word for it—[get dbForge Edge for a free 30-day trial](https://www.devart.com/dbforge/edge/download.html) and see for yourself.
Tags [dbForge Studio for PostgreSQL](https://blog.devart.com/tag/dbforge-studio-for-postgresql) [PostgreSQL](https://blog.devart.com/tag/postgresql) [postgresql tools](https://blog.devart.com/tag/postgresql-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/here-come-the-newly-updated-delphi-data-access-components-with-support-for-rad-studio-12-2-and-lazarus-3-4.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [Product Release](https://blog.devart.com/category/product-release) [What’s New](https://blog.devart.com/category/whats-new) Here Come the Newly Updated Delphi Data Access Components With Support for RAD Studio 12.2 and Lazarus 3.4 By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) September 27, 2024 [0](https://blog.devart.com/here-come-the-newly-updated-delphi-data-access-components-with-support-for-rad-studio-12-2-and-lazarus-3-4.html#respond) 992 It’s high time we shared a new release of [Delphi Data Access Components](https://www.devart.com/dac.html) with a slew of new features and enhancements that will surely be worth your while—including support for the latest versions of RAD Studio and Lazarus, new options, improved performance, and much more. Curious? Let’s take a look at what awaits you in this release. The first thing worth mentioning is support for the recently released RAD Studio 12 Athens – Release 2 (12.2), which focuses on completing the features from the previous releases and enhancing the experience with a slew of new capabilities. The same goes for the latest version of another compatible IDE, Lazarus 3.4—you’ve got full support for it as well. [UniDAC](https://www.devart.com/unidac/), our key library of components, has received support for Microsoft Excel. Now you can freely operate with Excel data, both directly and via an ODBC driver. Also note that, besides Windows, you can do it on Linux and macOS.
Next, we’ve added C++ Builder demo components for working with SecureBridge to MyDAC, ODAC, SDAC, PgDAC, and UniDAC (covering MySQL, Oracle, SQL Server, and PostgreSQL, respectively). SDAC and UniDAC (SQL Server provider) have received support for Variant OUT parameters that can now be used in the Direct mode. For the same components, we’ve added the datetime format, which can now be specified when using the timestamp macros. MyDAC, PgDAC, LiteDAC, and UniDAC have received a WriteBOM option for TUniDump. This option allows you to turn the addition of a BOM to the dump file on or off whenever you save it with Unicode encoding. PgDAC and UniDAC (PostgreSQL provider) now boast well-enhanced performance during the execution of SELECT queries. For PgDAC, we’ve also implemented support for pgvector, an open-source PostgreSQL extension that allows you to store, query, and index vectors. Next, we’ve added a new option for TUniTable, TUniQuery, and TUniStoredProc in UniDAC. It’s called UseGeneratedColumns, and it allows you to turn the use of generated columns in PostgreSQL on or off. One more new option, EnableAutoInc, is now available in UniDAC for UniTable and UniQuery components. The option allows you to use the SQLite AUTOINCREMENT column attribute. UniDAC has received support for the latest NexusDB 4.75.10. Finally, in UniDAC (Microsoft Access provider), we’ve added support for the execution of stored SELECT and UNION queries in the Direct mode. The output of these queries is similar to views in other database systems. Please refer to the revision histories in the table below for the full list of changes.
[UniDAC 10.3](https://www.devart.com/unidac/) [ [Download](https://www.devart.com/unidac/download.html) ] [ [Revision History](https://www.devart.com/unidac/revision_history.html) ]
[ODAC 13.3](https://www.devart.com/odac/) [ [Download](https://www.devart.com/odac/download.html) ] [ [Revision History](https://www.devart.com/odac/revision_history.html) ]
[SDAC 11.3](https://www.devart.com/sdac/) [ [Download](https://www.devart.com/sdac/download.html) ] [ [Revision History](https://www.devart.com/sdac/revision_history.html) ]
[MyDAC 12.3](https://www.devart.com/mydac/) [ [Download](https://www.devart.com/mydac/download.html) ] [ [Revision History](https://www.devart.com/mydac/revision_history.html) ]
[IBDAC 9.3](https://www.devart.com/ibdac/) [ [Download](https://www.devart.com/ibdac/download.html) ] [ [Revision History](https://www.devart.com/ibdac/revision_history.html) ]
[PgDAC 8.3](https://www.devart.com/pgdac/) [ [Download](https://www.devart.com/pgdac/download.html) ] [ [Revision History](https://www.devart.com/pgdac/revision_history.html) ]
[LiteDAC 6.3](https://www.devart.com/litedac/) [ [Download](https://www.devart.com/litedac/download.html) ] [ [Revision History](https://www.devart.com/litedac/revision_history.html) ]
[VirtualDAC 13.3](https://www.devart.com/virtualdac/) [ [Download](https://www.devart.com/virtualdac/download.html) ] [ [Revision History](https://www.devart.com/virtualdac/revision_history.html) ]
Download Delphi Data Access Components for a free 60-day trial and explore their capabilities today! The newly enhanced [Delphi Data Access Components](https://www.devart.com/dac.html) have been rolled out, so feel free to get your update right now!
And if you’re not using them yet, you might as well start with [UniDAC](https://www.devart.com/unidac/) , our flagship library of components that provide direct access to popular databases such as Oracle, Microsoft SQL Server, MySQL, InterBase, Firebird, PostgreSQL, and SQLite, as well as clouds like Salesforce, FreshBooks, SugarCRM, and many others. Our Data Access Components are compatible with Embarcadero RAD Studio, Delphi, C++Builder, Lazarus, and Free Pascal. You can work with them on Windows, Linux, macOS, iOS, and Android, for both 32-bit and 64-bit platforms. And last but surely not least, you can get them for a free trial and see how well they fare in your particular projects. Tags [dac](https://blog.devart.com/tag/dac) [data connectivity](https://blog.devart.com/tag/data-connectivity) [delphi data access](https://blog.devart.com/tag/delphi-data-access) [delphi data access components](https://blog.devart.com/tag/delphi-data-access-components) [unidac](https://blog.devart.com/tag/unidac) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/here-comes-a-new-update-of-dbforge-tools-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Here Comes a New Update of dbForge Tools for SQL Server! By [dbForge Team](https://blog.devart.com/author/dbforge) July 12, 2023 [0](https://blog.devart.com/here-comes-a-new-update-of-dbforge-tools-for-sql-server.html#respond) 2244 Recently, we have rolled out extensive updates of our product lines for MySQL, Oracle, and PostgreSQL databases. But what if you are a SQL Server user? Well, you don’t need to worry. [dbForge tools for SQL Server](https://www.devart.com/dbforge/sql/) didn’t escape our attention, and we’re finally here with a brand new update delivering quite a few helpful goodies for your everyday work. Mostly, these are new functions and operators that were introduced in SQL Server 2022 and made available in all of our tools as well. Contents Newly supported relational operators Newly supported bit manipulation functions Newly supported aggregate functions Newly supported date & time function Optimized process of describing scripts folders Newly supported relational operators Now that you’ve checked the contents, it won’t be a spoiler that most of the enhancements offered in this release are related to code completion and smart suggestions that help you produce error-free SQL code faster.
And the first thing we’d like to show you (using [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) ) is the newly added support for the relational operators GENERATE_SERIES (which generates a series of numbers within a given interval) and OPENQUERY (which executes a specified pass-through query on a specified linked server). This is what it looks like in SSMS via [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) . Newly supported bit manipulation functions We have also added support for the entire spectrum of bit manipulation functions to help you process and store data more efficiently than with individual bits. These include LEFT_SHIFT , RIGHT_SHIFT , BIT_COUNT , GET_BIT , and SET_BIT . Note that we’ve also added support for >> and << operators. This is what it looks like in SSMS via SQL Complete. Newly supported aggregate functions Next, you have two more aggregate functions at your service, APPROX_PERCENTILE_CONT and APPROX_PERCENTILE_DISC , both of which deal with returning values from a set of values in a group based on the provided percentile and sort specification. This is what it looks like in SSMS via SQL Complete. Newly supported date & time function The final new function for today is DATETRUNC , which returns an input date truncated to a specified datepart. This is what it looks like in SSMS via SQL Complete. Optimized process of describing scripts folders Ultimately, we have implemented a modification that significantly enhances the speed of the schema comparison engine utilized in Source Control to describe scripts folders. It is achieved by excluding SQL files with static data from the initial describing process. Files with static data that are linked to Source Control will be described later. Get the update of SQL Tools, Studio, and SQL Complete today! Note that this update is valid for the entire dbForge product line for SQL Server. 
It’s already been rolled out, so you are free to update your dbForge tools at any given moment to have all of these enhancements firmly in place. And if you are not acquainted with any of our tools yet, we suggest you [download dbForge Studio for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) , evaluate its full capabilities, and see how much more productive your daily work with SQL Server databases can become. Users of multiple database systems should take note of our [dbForge Edge](https://www.devart.com/dbforge/edge/) bundle comprising four Studios—for SQL Server, MySQL, Oracle, and PostgreSQL—which has also received this update. It is just as well available for a 30-day trial, so feel free to [download it today](https://www.devart.com/dbforge/edge/download.html) . Alternatively, if you are an active SSMS user, or if you are in search of solutions with a narrow focus, we can recommend dbForge SQL Tools, an advanced bundle of 15 SSMS add-ins and standalone applications that cover nearly every task related to the development and management of SQL Server databases. Similarly to the Studio, you can [download SQL Tools for a free 30-day trial](https://www.devart.com/dbforge/sql/sql-tools/download.html) and see each of them in action. Finally, we offer Data Compare and Schema Compare paired together in a single Compare Bundle, which you can also [download for a free trial](https://www.devart.com/dbforge/sql/compare-bundle/download.html) and conveniently purchase at a reduced price. 
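As a side note on the bit manipulation functions covered above, their semantics are easy to illustrate outside SQL Server. A small Python sketch mirroring LEFT_SHIFT, RIGHT_SHIFT, BIT_COUNT, GET_BIT, and SET_BIT follows (the lowercase function names are hypothetical Python stand-ins written for this illustration, not the T-SQL functions themselves):

```python
# Python stand-ins that mirror the behavior of the SQL Server 2022
# bit manipulation functions, shown only to illustrate their semantics.

def left_shift(value: int, n: int) -> int:    # LEFT_SHIFT(value, n), i.e. value << n
    return value << n

def right_shift(value: int, n: int) -> int:   # RIGHT_SHIFT(value, n), i.e. value >> n
    return value >> n

def bit_count(value: int) -> int:             # BIT_COUNT(value): number of set bits
    return bin(value).count("1")

def get_bit(value: int, offset: int) -> int:  # GET_BIT(value, offset): bit at the offset
    return (value >> offset) & 1

def set_bit(value: int, offset: int) -> int:  # SET_BIT(value, offset): sets that bit to 1
    return value | (1 << offset)

print(left_shift(12, 2))  # 48
print(bit_count(17))      # 2  (binary 10001)
print(get_bit(5, 0))      # 1  (binary 101, lowest bit)
print(set_bit(8, 0))      # 9  (binary 1000 -> 1001)
```

The point of having these as first-class SQL functions is that such bit-level packing and probing can be pushed into the database query instead of being done client-side.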
Tags [dbForge Edge](https://blog.devart.com/tag/dbforge-edge) [dbforge studio](https://blog.devart.com/tag/dbforge-studio) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/here-comes-a-new-update-of-sql-complete-6-11-with-support-for-ssms-19.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Here Comes a New Update of SQL Complete 6.11 with Support for SSMS 19 By [dbForge Team](https://blog.devart.com/author/dbforge) June 24, 2022 [0](https://blog.devart.com/here-comes-a-new-update-of-sql-complete-6-11-with-support-for-ssms-19.html#respond) 4161 We’ve got a new update of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) , your favorite add-in for SSMS and Visual Studio that helps you write clean SQL code faster, delivering context-sensitive code completion, debugging, formatting, and refactoring features. The biggest news is that with this update, SQL Complete has been provided with support for the upcoming SSMS 19 , so you can rest assured you will enjoy a seamless transition once its official release sees the light of day. Additionally, we have fixed a few issues to make your experience with SQL Complete even better. 
| Issue description | Ticket # |
| --- | --- |
| Fixed an unexpected exception that might occur when copying data to the clipboard | D89130 |
| Fixed the reset of tab color settings after closing SSMS with SQL Complete activated | D85351 |
| Fixed an issue with setting the database name in the Query Editor window when starting SSMS using the -d argument | D81862 |
| Fixed the disappearance of the Document Outline window | D81156 |
| Fixed an error that occurred when starting SSMS | D87321, D89325, D90091, D90192, D90396, D90381, D90374, D90368 |
| Fixed a memory leak issue that occurred when executing large scripts | D89598, D90100 |
| Fixed an application error that occurred when starting SSMS 2014 | – |
| Fixed the display of data in Data Viewer when changing the order of columns in the Editor | – |
| Fixed an error that might occur when opening documents | – |
| Fixed an issue with displaying Quick Info on an incomplete script | – |
| Fixed an issue with missing hints for Linked Server objects | – |
| Fixed the problem with incorrect formatting of procedure parameter names | – |

The update is already available and can be installed from the SQL Complete menu in SSMS > Help > Check for Updates. Not using SQL Complete yet? Then we invite you to check all of its expansive capabilities in action during a FREE 2-week trial. All you have to do is [download SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) from our official website and see how fast and easy your SQL coding can be.
Tags [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [ssms](https://blog.devart.com/tag/ssms) [what's new sql complete](https://blog.devart.com/tag/whats-new-sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/here-comes-dbforge-studio-4-4-with-the-updates-you-dont-want-to-miss.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Here Comes dbForge Studio 4.4 with the Updates You Don’t Want to Miss! By [dbForge Team](https://blog.devart.com/author/dbforge) January 13, 2022 [0](https://blog.devart.com/here-comes-dbforge-studio-4-4-with-the-updates-you-dont-want-to-miss.html#respond) 3081 Meet the latest update of [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/)! It packs quite a punch, with updates we believe you will find interesting: updated connectivity, a few new options in the Schema Compare functionality, several useful additions to Documenter, and a couple more improvements that will make dbForge Studio an even better tool for your daily tasks. Without further ado, let’s have a detailed overview of these updates.

Connectivity

Let’s get started with connectivity. dbForge Studio now supports Oracle 21c.

Code Completion

What we have here is a minor yet helpful addition: support for the SQL*Plus DESCRIBE command.

Data Compare

With this release, you get an updated Data Comparison Report Wizard, with actual differences included in reports. Now it is easy to check what’s been added, modified, and deleted during every comparison.

Schema Compare

When it comes to Schema Compare, we have a lot to tell you about. First, we improved the behavior of the Ignore START WITH schema comparison option. Second, we added 4 new Ignore options to make your comparison operations even more flexible:

- Ignore MAXVALUE
- Ignore INCREMENT BY
- Ignore CACHE
- Ignore CYCLE

We also drastically improved schema comparison reports and made them far more informative with actual differences included (similarly to the abovementioned data comparison reports).
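To illustrate what the new Ignore options cover, here is a hypothetical pair of sequence definitions; with Ignore MAXVALUE, Ignore INCREMENT BY, Ignore CACHE, and Ignore CYCLE all enabled, Schema Compare should treat the two as equal (the sequence name and the values are made up for the example):

```sql
-- Source schema
CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 1 MAXVALUE 1000 NOCACHE NOCYCLE;

-- Target schema: same sequence, differing only in attributes
-- the new Ignore options can exclude from comparison
CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 5 MAXVALUE 9999 CACHE 20 CYCLE;
```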
Finally, we tightened the integration with version control systems.

Documenter

This release of dbForge Studio adds support for the JSON search index in Documenter. We also implemented support for attribute clustering.

Other improvements

Now let’s take a look at a few miscellaneous tweaks that will most certainly come in handy, starting with the improved behavior of Retrieve Data. dbForge Studio now supports nested tables in the Generate Script As functionality. Finally, we expanded the Generate Script As settings with several new options for DDL scripts:

- Include COMMIT
- Include SET SQLBLANKLINES ON
- Include SET DEFINE OFF
- Include NLS parameters for the current session

This is how CREATE works before and after enabling these options. This is how DROP works before and after enabling these options. This is how DROP and CREATE work before and after enabling these options.

That’s it! You are free to update your dbForge Studio for Oracle at any given moment to get all of these improvements; simply go to the Help menu > Check for Updates. And if you are not acquainted with dbForge Studio for Oracle yet, we invite you to give it a go and [download a free 30-day trial](https://www.devart.com/dbforge/oracle/studio/download.html), which will definitely help you evaluate its capabilities and see how much more productive your daily work is going to be.
Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle](https://blog.devart.com/tag/oracle) [studio for oracle](https://blog.devart.com/tag/studio-for-oracle) [what's new oracle studio](https://blog.devart.com/tag/whats-new-oracle-studio) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/here-comes-the-great-big-update-of-dbforge-tools-for-oracle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) [What’s New](https://blog.devart.com/category/whats-new) Here Comes the Great Big Update of dbForge Tools for Oracle! By [dbForge Team](https://blog.devart.com/author/dbforge) March 16, 2023 [0](https://blog.devart.com/here-comes-the-great-big-update-of-dbforge-tools-for-oracle.html#respond) 2616 Following the recent update of our PostgreSQL product line, we are genuinely excited to unveil yet another new release that’s just as big and essential: a rich collection of newly supported SQL statements and miscellaneous functional enhancements that have been integrated into the latest versions of [dbForge tools for Oracle](https://www.devart.com/dbforge/oracle/).

Contents

- Support for Linux & macOS
- Newly supported SQL statements
- Text Editor enhancements
- Documenter enhancements
- Data Compare & Data Generator enhancements
- Other improvements

Support for Linux & macOS

First of all, we’d love to tell Linux and macOS users that they can now run all dbForge tools for Oracle on their machines using a specialized solution by CodeWeavers. It is called [CrossOver](https://www.codeweavers.com/crossover), and it provides Windows applications with a compatibility layer that allows them to run on Linux and macOS. In a nutshell, the flow is as follows:

1. Install CrossOver on your Mac or Linux machine. It will work as a regular native application.
2. Open CrossOver and configure a special container (bottle) with the environment required for dbForge Studio.
3. Install the Studio into the bottle and run it!

This is what it looks like on macOS. And this is what it looks like on Linux.
Newly supported SQL statements We have greatly expanded the code completion capabilities of our Oracle tools—and now you have 10 more implemented SQL statements that will help you get your coding done faster. Here are all of them; and for your convenience, each entry is illustrated with a screenshot. Text Editor enhancements The integrated Text Editor has a few more nice surprises for you. One of them is the newly added support for several SQL*Plus commands , namely: Our next stop is extended support for the SELECT statement in the Completion List . To be more precise, we have added support for the following clauses: We have also added support for the WITH … AS clause in the INSERT statement . Documenter enhancements Now let’s proceed to Documenter, available both in [dbForge Studio](https://www.devart.com/dbforge/oracle/studio/) and as a [standalone application](https://www.devart.com/dbforge/oracle/documenter/) —and the first thing to mention is support for Memory Table properties . Additionally, we have added ValidityPeriod for tables. Data Compare & Data Generator enhancements Next, let’s talk about [Data Compare](https://www.devart.com/dbforge/oracle/datacompare/) and [Data Generator](https://www.devart.com/dbforge/oracle/data-generator/) , both of which can also be used as standalone apps and as part of the Studio. Now you have two more handy switches added to the command line: /schemaexport and /schemaimport . Let us also take a look at the newly added schema import and export features. Other improvements There are several auxiliary improvements that have been introduced to our tools with this release. Let’s start with the newly added Ignore IDENTITY columns option. Last but not least, we have reimagined the reordering of table columns that contain Oracle built-in data types. While previously it was necessary to use DROP/CREATE TABLE for that purpose—which is a rather resource-intensive operation—now ALTER TABLE is used instead. 
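For context, one common ALTER TABLE-based way to move a column without recreating the table in Oracle 12c and later is to toggle its visibility: making a column invisible and then visible again places it at the end of the column order. This is an illustrative sketch with invented names, not necessarily the exact script the Studio generates:

```sql
-- Toggling visibility moves middle_name to the end of the column order
-- without a resource-intensive DROP/CREATE TABLE round trip.
ALTER TABLE employees MODIFY (middle_name INVISIBLE);
ALTER TABLE employees MODIFY (middle_name VISIBLE);
-- middle_name now appears last in SELECT * column ordering.
```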
And since it’s no longer required to create your table anew, the eventual script becomes more compact and is executed faster.

Download dbForge Studio for Oracle v4.5 today!

Now that we’ve told you about the results of our efforts, we’d love to invite you to see them in action and [download dbForge Studio](https://www.devart.com/dbforge/oracle/studio/download.html), our flagship IDE for Oracle, for a free 30-day trial (or, if you already have it, get the update via the Help menu > Check for Updates). The same applies to the other [dbForge tools for Oracle](https://www.devart.com/dbforge/oracle/), so don’t hesitate to check them out as well! We would truly appreciate your feedback. Tags [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools) [what's new oracle tools](https://blog.devart.com/tag/whats-new-oracle-tools) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/here-comes-the-latest-update-of-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [What’s New](https://blog.devart.com/category/whats-new) Here Comes the Latest Update of SQL Complete! By [dbForge Team](https://blog.devart.com/author/dbforge) April 4, 2022 [0](https://blog.devart.com/here-comes-the-latest-update-of-sql-complete.html#respond) 2945 It’s time to unveil the latest update of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), your favorite code completion add-in for SSMS and Visual Studio. We have enhanced it with a few new functions that will surely come in handy. Let’s check them out! We have added support for several new functions in the Completion List, Quick Info, and Parameter Information. First off, you get the GREATEST function: Then you get the LEAST function: Next comes the CURRENT_TIMEZONE function: Finally, you get the CURRENT_TIMEZONE_ID function: We have also added support for the FORMATFILE_DATA_SOURCE parameter for bulk_options in the OPENROWSET function: That’s it! You are free to update your dbForge SQL Complete at any given moment to get all of these improvements. Simply go to the Help menu > Check for Updates.
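For reference, here is a quick hypothetical sketch of the functions mentioned above. Note that, to the best of our knowledge, CURRENT_TIMEZONE and CURRENT_TIMEZONE_ID are available in Azure SQL Database and Managed Instance:

```sql
-- GREATEST and LEAST return the largest and smallest of their arguments
SELECT GREATEST(3, 7, 5) AS max_val,  -- 7
       LEAST(3, 7, 5)    AS min_val;  -- 3

-- Azure SQL: the instance's time zone as a description and as an ID
SELECT CURRENT_TIMEZONE()    AS tz_description,
       CURRENT_TIMEZONE_ID() AS tz_id;
```

SQL Complete now offers completion, Quick Info, and parameter hints for all of these.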
And if you are not acquainted with dbForge SQL Complete yet, we suggest you [give it a go with a free 14-day trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html), which will help you evaluate its capabilities and see how much more productive your daily work is going to be. Tags [productivity](https://blog.devart.com/tag/productivity) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [ssms](https://blog.devart.com/tag/ssms) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/highlights-from-mysql-server-8-3-0-release.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Events](https://blog.devart.com/category/events) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) What’s New in MySQL 8.3: Feature Overview By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) March 4, 2024 [0](https://blog.devart.com/highlights-from-mysql-server-8-3-0-release.html#respond) 1795 The latest version of MySQL Server, 8.3, has been available as a General Availability (GA) release for a while. In case you have missed it, here is a brief recap of the newly available features and enhancements, alongside some deprecated and removed functionality. Without further ado, let’s take a closer look.

New features in MySQL 8.3.0

- The format of global transaction identifiers (GTIDs) used in MySQL Replication and Group Replication was extended in order to enable the identification of transaction groups. It is now possible to assign a unique name to GTIDs belonging to a specific transaction group.
- It is now possible to choose between 2 versions of the JSON output format used by EXPLAIN FORMAT=JSON statements. This is done via the explain_json_format_version server system variable, which was also included in this release.
- DDL and DCL statement tracking was added for group_replication_set_as_primary().
- The introduced SASL-based LDAP authentication allows Windows clients to use GSSAPI/Kerberos alongside the authentication_ldap_sasl_client plugin for authentication purposes.
- The binlog_transaction_dependency_tracking server system variable was deprecated in MySQL 8.2.0. In preparation for its removal, its default value is now WRITESET.
- Data-masking components added support for specifying a dedicated schema to store the related internal table and masking functions. Previously, the mysql system schema provided the only storage option.
- Now, a new read-only variable called component_masking.masking_database enables setting and persisting an alternative schema name at server startup.
- Thread pool connection information was added to the MySQL Performance Schema.
- Two system status variables were introduced to provide information about accesses to the PROCESSLIST table: Deprecated_use_i_s_processlist_count and Deprecated_use_i_s_processlist_last_timestamp.
- The MySQL Enterprise Data Masking and De-Identification component now includes the ability to flush the data on the secondary or replica into memory.
- For better versatility, the SET_ANY_DEFINER and ALLOW_NONEXISTENT_DEFINER privileges were added instead of the previously required SET_USER_ID.

To get a more detailed overview of the abovementioned features, refer to [the list of newly added features in MySQL 8.3](https://dev.mysql.com/doc/refman/8.3/en/mysql-nutshell.html#mysql-nutshell-additions).

Deprecated and removed features in MySQL 8.3.0

- Group Replication recovery no longer depends on writing view change events to the binary log to mark changes in group membership. Instead, when all members of a group are MySQL 8.3.0 or later, they share compressed recovery metadata, and when a new member joins the group, no such event is logged or assigned a GTID.
- A number of MySQL C API functions, which had been deprecated in previous versions of MySQL, were eventually removed.
- A few options and variables related to MySQL Replication had been deprecated in previous versions of MySQL and were also removed from MySQL 8.3.
- A few options for compiling the server via CMake were found obsolete and removed as well.
- The FLUSH HOSTS statement, which had been deprecated in MySQL 8.0.23, was also removed. To clear the host cache, one can use TRUNCATE TABLE performance_schema.host_cache or mysqladmin flush-hosts instead.
- When global transaction identifiers (GTIDs) are used for replication, transactions that have already been applied are now automatically ignored.
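A few of the points above can be tried directly in a MySQL 8.3 session; this is an illustrative sketch:

```sql
-- Confirm the server version first
SELECT VERSION();

-- Pick the JSON output format version for EXPLAIN FORMAT=JSON (new in 8.3)
SET @@explain_json_format_version = 2;
EXPLAIN FORMAT=JSON SELECT 1;

-- FLUSH HOSTS is removed in 8.3; clear the host cache like this instead
TRUNCATE TABLE performance_schema.host_cache;
```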
Using writeset information for conflict detection was found to cause issues with dependency tracking. Because of that, the usage of writesets for conflict checks was limited to cases when row-based logging is in effect. That said, if binlog_transaction_dependency_tracking is set to WRITESET or WRITESET_SESSION, binlog_format must be ROW, whereas MIXED is no longer supported. For further information, refer to [the list of deprecated and removed features in MySQL 8.3](https://dev.mysql.com/doc/refman/8.3/en/mysql-nutshell.html#mysql-nutshell-deprecations).

Bug fixes and improvements

Here is [the list of bugs fixed in MySQL 8.3](https://dev.mysql.com/doc/relnotes/mysql/8.3/en/news-8-3-0.html#mysqld-8-3-0-bug), which is just too hefty to be covered here in detail; we’ll only mention that quite a few of the fixes deal with InnoDB and replication, so you may want to pay some extra attention to those areas.

Prepare to upgrade

Before upgrading, take note of the following simple yet helpful tips:

- [Check your MySQL version](https://blog.devart.com/how-to-check-mysql-version.html)
- Make sure you have established root access to your MySQL Server
- Back up your databases beforehand, just in case
- Examine the available [upgrade paths](https://dev.mysql.com/doc/refman/8.3/en/upgrade-paths.html) to make sure yours is supported
- Review [the deprecated and removed features](https://dev.mysql.com/doc/refman/8.3/en/mysql-nutshell.html); if you are using any of them, you may want to reconsider upgrading or introduce corresponding changes to your MySQL Server
- Review [the deprecated and removed variables](https://dev.mysql.com/doc/refman/8.3/en/added-deprecated-removed.html) to make sure the upgrade does not affect them
- In case you use replication, check [the corresponding section](https://dev.mysql.com/doc/refman/8.3/en/replication-upgrade.html)
- Take a look at [the upgrade best practices](https://dev.mysql.com/doc/refman/8.3/en/upgrade-best-practices.html); you might find some of them useful
- Last but by no means least, perform your upgrade on a test environment first to verify that everything works correctly, and only then run the upgrade against your production server

For your convenience, here is [the detailed official guide on upgrading your MySQL Server](https://dev.mysql.com/doc/refman/8.3/en/upgrading.html).

Level up your MySQL database management with dbForge tools

Finally, after you upgrade your MySQL Server, it might be a good time to upgrade your toolset for database development and management. We’ve got something to suggest in this respect: [dbForge for MySQL](https://www.devart.com/dbforge/mysql/), a collection of high-end tools that help you be most efficient at your daily database-related tasks. The most all-encompassing of these is inarguably [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/), a flagship IDE that has every feature you might need, including database design and version control, SQL development, query optimization, visual query building, data management and migration, identification of differences in schemas and table data, administration, test data generation, and much more. It is also worth mentioning that the Studio offers broad compatibility options beyond MySQL proper.
The entire product line for MySQL looks as follows: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) [dbForge Compare Bundle for MySQL](https://www.devart.com/dbforge/mysql/compare-bundle/) [dbForge Schema Compare for MySQL](https://www.devart.com/dbforge/mysql/schemacompare/) [dbForge Data Compare for MySQL](https://www.devart.com/dbforge/mysql/datacompare/) [dbForge Query Builder for MySQL](https://www.devart.com/dbforge/mysql/querybuilder/) [dbForge Data Generator for MySQL](https://www.devart.com/dbforge/mysql/data-generator/) [dbForge Documenter for MySQL](https://www.devart.com/dbforge/mysql/documenter/) [dbForge Fusion for MySQL](https://www.devart.com/dbforge/mysql/fusion/) Each of these tools is available for a free 30-day trial , which is a nice chance to explore them and make sure they’ll be the best fit for your daily work. dbForge Studio for MySQL is probably the best place to get started. Other tools have a narrower focus, but you might want to check them out as well. We’d also love to remind you that if your daily work goes beyond MySQL, you might as well try our multidatabase solution called [dbForge Edge](https://www.devart.com/dbforge/edge/) , which delivers a wide coverage of database management systems and cloud services, including MariaDB, Microsoft SQL Server, Oracle Database, and PostgreSQL. Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words. 
"} {"url": "https://blog.devart.com/how-a-data-engineer-streamlined-his-teams-operations-using-dbforge-sql-tools.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) [Uncategorized](https://blog.devart.com/category/uncategorized) How a Data Engineer Streamlined His Team’s Operations Using dbForge SQL Tools By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) April 5, 2023 [0](https://blog.devart.com/how-a-data-engineer-streamlined-his-teams-operations-using-dbforge-sql-tools.html#respond) 2202 Recently, we’ve
[explored the case](https://blog.devart.com/how-dbforge-sql-complete-enhanced-the-collaboration-within-two-development-teams-at-a-finland-headquartered-tech-company.html) of dbForge SQL Complete being adopted by a group of SQL developers at a tech company who wanted to get their routine coding done faster. Today's case builds on the same idea, but this time we'll be dealing with a company of a much larger scale: a global provider of IT and business consulting services with more than 45 years on the market. Our client, a Senior Consultant and Data Engineer at that company, preferred not to disclose his name, yet agreed to share his story.

The client's team was just one of many that dealt with databases and actual data every day. In fact, there were around 60-80 employees, allocated to numerous unrelated projects, who had to interact with databases one way or another. Furthermore, the company had not adopted any database tools as a standard, to the extent that, in most cases, you didn't know what tools the team next door was using. Yet the client knew well what he was looking for. His teammates, who mostly worked with SQL Server databases, relied on the tried-and-true SQL Server Management Studio, which effectively covered most of their daily tasks. A critical moment came when the workload grew increasingly intense and brought an urgent need to streamline the team's daily activities:

- First of all, the client wanted a way to write SQL queries much faster; smart autocompletion would help.
- Another bottleneck lay in data import and export operations, which were rather slow and didn't cover the required data formats all that well.
- Finally, the team simply didn't have proper tools to quickly detect changes in table data.
Moreover, shifting to a completely different database toolset was not a welcome option. The best path was to find capable add-ins to enhance SSMS with, and [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/), which the client discovered during brief research, came across as a viable integrated solution. After a month-long free trial, the client was certain about his choice. Here is what he told us during an interview:

“I guess it’s very good when the company uses the same software, because it helps to standardize the processes in some way, and it facilitates the interaction with vendors. When you have just one vendor that supplies you with database tools, it’s easier to implement any database-related scenarios. It’s easy to run the budget, and it eases the procurement processes. As for your tools, we’ve currently got 10 licenses, and the professional package is just what we need. SQL Complete is the heart of it. It helps very much.”

After the trial, it took less than a month for the obstacles to vanish, and the client's team has now been successfully using SQL Tools for more than 3 years. Let's take a look at the tools that the client and his colleagues found the most valuable.

dbForge SQL Complete
[SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is an add-in for both SSMS and Visual Studio, designed to enhance SQL coding and make it error-free with context-aware keyword and object suggestions, phrase completion, formatting, refactoring, and a built-in T-SQL Debugger. Overall, SQL Complete can boost the user's coding speed by 200-400%.

dbForge Data Compare
[Data Compare](https://www.devart.com/dbforge/sql/datacompare/) helps detect and analyze table data discrepancies in live databases, backups, and script folders. With its help, you can synchronize databases and recover damaged or missing data with just a few clicks. All tasks can be scheduled and automated via CLI.
This tool is noted to save up to 70% of the time typically spent on manual data comparison and synchronization.

dbForge Schema Compare
[Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) is a similar solution with similar efficiency; it helps compare SQL Server database schemas, analyze differences, and synchronize them via auto-generated SQL scripts.

dbForge Data Pump
[Data Pump](https://www.devart.com/dbforge/sql/data-pump/) is a powerful add-in for data import and export. It supports the 14 most popular data formats, quickly populates SQL databases with external source data, streamlines data migration, and helps automate recurring scenarios using templates that can be run from the command line.

Get SQL Tools for a free 30-day trial today!
This is where you can start your own success story. If you're looking to equip yourself with the broadest set of tools for SQL Server databases, feel free to [download dbForge SQL Tools for a free trial](https://www.devart.com/dbforge/sql/sql-tools/download.html) and see them in action today.

Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge SQL Tools](https://blog.devart.com/tag/dbforge-sql-tools) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [success story](https://blog.devart.com/tag/success-story)
[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How ChatGPT Can Help Database Developers Write Unit Tests for SQL Server
By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) April 18, 2023

We have already explored the capabilities of the much-hyped ChatGPT in terms of [writing SQL JOIN queries](https://blog.devart.com/how-to-use-chatgpt-to-write-sql-join-queries.html) and [retrieving data from MySQL databases](https://blog.devart.com/power-up-your-mysql-queries-how-chatgpt-can-help-you-retrieve-mysql-data.html). Now the time has come to take one more step and scrutinize a narrower yet crucial topic for database developers: unit testing. We'll see how ChatGPT fares when it comes to writing unit tests for various database objects, and then we'll run its output using one of the integrated features of [dbForge Studio for SQL Server](http://devart.com/dbforge/sql/studio/) called Unit Test (which is also [available as an SSMS add-in](https://www.devart.com/dbforge/sql/unit-test/)). So fasten your seatbelts; it's going to be a curious ride.

CONTENTS
- Prerequisites
- Download and install dbForge Studio for SQL Server
- Download and install dbForge Unit Test
- Download and install AdventureWorks2019
- Run unit tests on a table
- Run unit tests on a view
- Run unit tests on a function
- Run unit tests on a stored procedure

Prerequisites
First and foremost, let's make sure that we've got all of our prerequisites at hand. To get started, we'll need the following:

- An active ChatGPT account – we're going to ask it to write unit tests and see how well it fares.
- dbForge Studio for SQL Server – an all-in-one IDE that covers nearly all aspects of work with SQL Server databases, where we'll run ChatGPT's output and see if it works.
Alternatively, you can use SQL Server Management Studio (a.k.a. SSMS), the basic IDE for SQL Server databases. In that case, you will also need to install dbForge Unit Test, an add-in that makes it possible to work with unit tests, since default SSMS lacks this functionality. Finally, you'll need the AdventureWorks2019 sample database, the go-to database for testing and demo purposes. These prerequisites will get you ready to take the same journey as the one we're about to begin. Now let us help you a bit with installing dbForge Studio or dbForge Unit Test (whichever you prefer) and AdventureWorks2019.

Download and install dbForge Studio for SQL Server
You can [download dbForge Studio here](https://www.devart.com/dbforge/sql/studio/download.html) for a free 30-day trial. Once you download the installation file, just open it and follow the instructions of the installation wizard; it's so easy that you won't even need a manual.

Download and install dbForge Unit Test
Alternatively, if you prefer using SSMS, you can [download dbForge Unit Test](https://www.devart.com/dbforge/sql/unit-test/download.html) as part of dbForge SQL Tools, a bundle of 15 SSMS add-ins and standalone apps for SQL Server development, management, and administration. As with the Studio, you only need to open the downloaded installation file and follow the wizard. There is one difference, though: during the installation, you can specify the individual products that you want to install, so you can select Unit Test only; or, if you wish, you can install the entire bundle, since there are a lot of useful goodies to be found there. The bundle is also available for a free 30-day trial, so the try-before-you-buy principle is firmly in place. After the installation, the Unit Test menu will appear on the SSMS menu bar.

Download and install AdventureWorks2019
Now it's time to install our sample database.
1. Download the AdventureWorks2019.bak file from the official website (you can use the [direct download here](https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorksDW2019.bak)).
2. Open dbForge Studio (or SSMS), connect to your SQL Server instance, and restore AdventureWorks2019 from the downloaded .bak file.
3. Make sure the restoration has been successful by connecting to the database, going to Database/Object Explorer, and navigating across its objects: tables, views, stored procedures, and so on.

Note: If you decide to give it a try, make sure you [install the tSQLt framework](https://docs.devart.com/unit-test/working-with-unit-test/adding-a-database-to-test.html) on your database beforehand. And if you would like more information about running your tests in Unit Test, refer to [Running SQL database unit tests](https://docs.devart.com/unit-test/working-with-unit-test/running-tests.html). This functionality is identical in dbForge Studio and dbForge Unit Test.

Now we're all set up to pay ChatGPT a visit and see it in action!

Run unit tests on a table
Let's start by asking ChatGPT to generate unit tests for the Person.Person table of the AdventureWorks2019 sample database. For your convenience, here's the code to take a closer look at.
```sql
-- Load the tSQLt framework
EXEC tSQLt.NewTestClass 'TestPerson';

-- Create a test procedure for checking if the Person.Person table exists
CREATE PROCEDURE TestPerson.[test if Person.Person table exists]
AS
BEGIN
    -- Assert that the Person.Person table exists
    EXEC tSQLt.AssertObjectExists 'Person.Person';
END;
GO

-- Create a test procedure for checking if the Person.Person table has the correct columns
CREATE PROCEDURE TestPerson.[test Person.Person columns]
AS
BEGIN
    -- Define the expected columns of the table
    DECLARE @ExpectedColumns TABLE (ColumnName NVARCHAR(100));

    INSERT INTO @ExpectedColumns (ColumnName)
    VALUES ('BusinessEntityID'), ('PersonType'), ('NameStyle'), ('Title'), ('FirstName'), ('MiddleName'), ('LastName'), ('Suffix'), ('EmailPromotion'), ('AdditionalContactInfo'), ('Demographics'), ('rowguid'), ('ModifiedDate');

    -- Assert that the Person.Person table has the expected columns
    EXEC tSQLt.AssertEqualsTable 'Person.Person', @ExpectedColumns;
END;
GO

-- Create a test procedure for checking if the Person.Person table has any records
CREATE PROCEDURE TestPerson.[test Person.Person has records]
AS
BEGIN
    -- Assert that the Person.Person table has at least one record
    EXEC tSQLt.AssertRowCountGreaterThan 'Person.Person', 0;
END;
GO

-- Create a test procedure for checking if the Person.Person table has any duplicate records
CREATE PROCEDURE TestPerson.[test Person.Person has no duplicate records]
AS
BEGIN
    -- Assert that the Person.Person table has no duplicate records
    EXEC tSQLt.AssertNoDuplicateRows 'Person.Person';
END;
GO

-- Run all the tests in the TestPerson test class
EXEC tSQLt.RunTestClass 'TestPerson';
```

Now let's go to dbForge Studio and see how it works via Unit Test. Note that the line EXEC tSQLt.NewTestClass 'TestPerson'; is executed automatically once a directory for unit tests is created. That said, we create a test that shows whether our table actually exists.
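As a side note, the whole class doesn't have to be run in one go: stock tSQLt also lets you execute a single test by its fully qualified name, which is handy while iterating on one failing test. A minimal sketch using standard tSQLt procedures (the test names follow the TestPerson class created above):

```sql
-- Run a single test case by its fully qualified name
EXEC tSQLt.Run 'TestPerson.[test if Person.Person table exists]';

-- Passing just the class name runs every test in that class
EXEC tSQLt.Run 'TestPerson';

-- Run all test classes registered in the current database
EXEC tSQLt.RunAll;
```

In dbForge Unit Test, the same granularity is available from the GUI, so these calls are mainly useful for scripting.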
Then we insert the code suggested by ChatGPT, and now we run the test. As you can see, it's successful.

Let's try another case: say, we want to check whether our Person.Person table has the expected columns. We insert the code again. This time we can see that ChatGPT's suggestion wasn't all that flawless, and the test returns an error. Let's take a look at the code and fix it as follows.

```sql
CREATE PROCEDURE TestPerson.[test Person.Person columns]
AS
BEGIN
    -- Define the expected columns of the table
    DECLARE @ExpectedColumns VARCHAR(8000) = 'BusinessEntityID,PersonType,NameStyle,Title,FirstName,MiddleName,LastName,Suffix,EmailPromotion,AdditionalContactInfo,Demographics,rowguid,ModifiedDate';

    -- Capture the actual columns of the Person.Person table
    DECLARE @ActualColumns VARCHAR(8000);

    SELECT @ActualColumns = STRING_AGG(c.COLUMN_NAME, ',') WITHIN GROUP (ORDER BY c.ORDINAL_POSITION)
    FROM INFORMATION_SCHEMA.COLUMNS c
    WHERE c.TABLE_NAME = 'Person' AND c.TABLE_SCHEMA = 'Person';

    -- Assert that the Person.Person table has the expected columns
    EXEC tSQLt.AssertEquals @ExpectedColumns, @ActualColumns, 'The Person.Person table does not have the expected columns.';
END;
GO
```

Now let's go back to dbForge Studio, insert our fixed code, and run it. Success!

How about one more case? Let's see whether the Person.Person table is empty or contains any records. We insert the code and get an error: the suggested procedure does not exist. ChatGPT assumes this procedure is available by default, yet it isn't. To make it work, we tweak the code.
```sql
CREATE PROCEDURE TestPerson.[test Person.Person has records]
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        -- Check if any data exists in the Person.Person table
        IF NOT EXISTS (SELECT 1 FROM Person.Person)
        BEGIN
            -- No data exists, throw a custom exception
            THROW 50001, 'No data found in Person.Person table', 1;
        END
    END TRY
    BEGIN CATCH
        -- Handle the exception
        DECLARE @ErrorMessage NVARCHAR(4000) = ERROR_MESSAGE();
        RAISERROR(@ErrorMessage, 16, 1);
        RETURN;
    END CATCH
END;
GO
```

Then we go to the Studio, insert it, and run it – now it works!

Okay, since ChatGPT provided us with four unit test examples, let's have a look at the final one and see whether our Person.Person table has any duplicate records. We insert the suggested code, and once again we've got a problem: ChatGPT has suggested a procedure that does not exist. Let's make corrections accordingly.

```sql
CREATE PROCEDURE TestPerson.[test Person.Person has no duplicate records]
AS
BEGIN
    SET NOCOUNT ON;

    IF EXISTS (
        SELECT FirstName, LastName, MiddleName, COUNT(*) AS NumDuplicates
        FROM Person.Person
        GROUP BY FirstName, LastName, MiddleName
        HAVING COUNT(*) > 1
    )
    BEGIN
        -- Duplicates exist, show an error message
        RAISERROR('Duplicate rows found in Person.Person table', 16, 1);
        RETURN;
    END
    ELSE
    BEGIN
        -- No duplicates exist, return a message
        SELECT 'No duplicates found in Person.Person table' AS Result;
    END
END
GO
```

Back in the Studio, we insert the code and get an error. This wasn't unexpected, because the table contains some identical data, and we don't really know whether each match is an actual duplicate. To improve the table structure, we could suggest adding another field, for instance, date of birth; grouping by it would give us improved results and help us clearly understand whether we're dealing with the same person. As it is, we can identify duplicates in the Person.Person table using the following SELECT query.
```sql
SELECT FirstName, LastName, MiddleName, COUNT(*) AS NumDuplicates
FROM Person.Person
GROUP BY FirstName, LastName, MiddleName
HAVING COUNT(*) > 1
ORDER BY NumDuplicates DESC, LastName, FirstName;
```

In our case, we can find up to four different people with the same first and last names.

Run unit tests on a view
That was a curious experiment, but let's move on from tables to views. Now we'll ask ChatGPT to write unit tests for the Sales.vSalesPerson view from the same AdventureWorks2019 sample database. Back in the Studio, we'll create a new test class, and first of all, we'll check whether our view actually exists. We insert the code suggested by ChatGPT and run it. Success! This time the code needs no tweaking.

Now let's write another test that will verify that our view returns the expected columns. We insert the code… and here we have an error: the expected column list is incorrect, and so is the data type. To make it work, we tweak the code as follows.

```sql
CREATE PROCEDURE TestView.[test SalesTest.vSalesPerson.testViewColumns]
AS
BEGIN
    -- Arrange
    DECLARE @ExpectedColumns VARCHAR(8000) = 'BusinessEntityID,Title,FirstName,MiddleName,LastName,Suffix,JobTitle,PhoneNumber,PhoneNumberType,EmailAddress,EmailPromotion,AddressLine1,AddressLine2,City,StateProvinceName,PostalCode,CountryRegionName,TerritoryName,TerritoryGroup,SalesQuota,SalesYTD,SalesLastYear';

    -- Act
    DECLARE @ActualColumns VARCHAR(8000);
    SELECT @ActualColumns = STRING_AGG(c.name, ',') WITHIN GROUP (ORDER BY c.column_id)
    FROM sys.views AS v
    INNER JOIN sys.columns AS c ON v.object_id = c.object_id
    WHERE v.name = 'vSalesPerson';

    -- Assert
    EXEC tSQLt.AssertEquals @ExpectedColumns, @ActualColumns, 'The Sales.vSalesPerson view does not have the expected columns.';
END;
GO
```

Then we take it to the Studio and run it again. Success!

Run unit tests on a function
Now let's get ChatGPT to write unit tests for a function.
Again, we create a new test class along with a new test that will help us check the returned product price. Then we insert the code suggested by ChatGPT and run it, only to get another error: ChatGPT has missed one function parameter and thus hasn't declared it. The expected price is also incorrect; the product with an ID of 1 doesn't even exist in the table, so it can't have a price of 3578.2700. Okay, time to tweak the code.

```sql
CREATE PROCEDURE CheckFunction.[test MyUnitTest_1]
AS
BEGIN
    -- Arrange
    DECLARE @ProductID INT = 707;
    DECLARE @OrderDate DATE = CURRENT_TIMESTAMP;
    DECLARE @ExpectedPrice MONEY = 0.00;

    -- Act
    SELECT @ExpectedPrice = [dbo].[ufnGetProductListPrice](@ProductID, @OrderDate);

    -- Assert
    EXEC tSQLt.AssertEquals @Expected = 34.99, @Actual = @ExpectedPrice, @Message = 'Price for product ID 707 is incorrect';
END;
GO
```

Let's insert it into the Studio and see what happens when we run it. Success!

Now let's take a look at another case and check whether any of our product prices have NULL values. Here we create a new test. After we insert and run the code, we've got yet another error: an undeclared variable and a missed parameter. To fix that, we tweak the code.

```sql
CREATE PROCEDURE CheckFunction.[test CheckPriceOnNULL]
AS
BEGIN
    -- Arrange
    DECLARE @ProductID INT = 99999;
    DECLARE @OrderDate DATE = CURRENT_TIMESTAMP;
    DECLARE @ExpectedPrice MONEY = NULL;

    -- Act
    SELECT @ExpectedPrice = [dbo].[ufnGetProductListPrice](@ProductID, @OrderDate);

    -- Assert
    EXEC tSQLt.AssertEquals @Expected = @ExpectedPrice, @Actual = NULL, @Message = 'Price for invalid product ID is not NULL';
END;
GO
```

Then we insert it back and run our test. As you can see, it's successful. Finally, let's run both tests at once to make sure everything works well.

Run unit tests on a stored procedure
Let's finish our marathon by asking ChatGPT to write us a few unit tests for a stored procedure.
We create yet another test class alongside a unit test that takes the result returned by the procedure and checks whether it matches the expected result; a mismatch, naturally, will return an error. We insert the code suggested by ChatGPT… Here we can already notice that the input parameter (DECLARE @CheckDate DATETIME = '2023-04-05 18:40:28.215') is missing. Other missing things include the variable and the structure of the table that we'll use to verify the result. Let's make the corresponding adjustments in the code.

```sql
CREATE PROCEDURE CheckProcedure.[test Procedure returns any rows]
AS
BEGIN
    -- Assemble
    DECLARE @ProductAssemblyID INT = 800;
    DECLARE @CheckDate DATETIME = '2023-04-05 18:40:28.215';

    -- Act
    CREATE TABLE #actual_result (
        [ProductAssemblyID] [int] NOT NULL,
        [ComponentID] [int] NOT NULL,
        [ComponentDesc] [nvarchar](50) NOT NULL,
        [TotalQuantity] [decimal](8, 2) NOT NULL,
        [StandardCost] [money] NOT NULL,
        [ListPrice] [money] NOT NULL,
        [BOMLevel] [smallint] NOT NULL,
        [RecursionLevel] [int] NOT NULL
    );
    INSERT INTO #actual_result
    EXEC dbo.uspGetBillOfMaterials @StartProductID = @ProductAssemblyID, @CheckDate = @CheckDate;

    -- Assert
    DECLARE @ExpectedRowCount INT = 87; -- Update the expected row count
    DECLARE @ActualRowCount INT = (SELECT COUNT(*) FROM #actual_result);

    EXEC tSQLt.AssertRowCount @Expected = @ExpectedRowCount, @Actual = @ActualRowCount;
END;
GO
```

Now let's take it to the Studio. Finally, we run the code… and it works!

Okay, you must be a bit tired already, so we'll stop here. As you can see, ChatGPT never guarantees completely error-free code, which is only natural, so make sure you inspect its output and make amendments where necessary. Other than that, it's a rather promising assistant when it comes to giving you something to start with.

Download dbForge tools for SQL Server for a free 30-day trial today!
Now you know that ChatGPT can be helpful with yet another typical task of a database developer, and that our tools can run ChatGPT's output (properly reviewed and adjusted, of course) and yield results in no time. With that, let us invite you once more to [download dbForge Studio](https://www.devart.com/dbforge/sql/studio/download.html), an IDE that has virtually everything you need to facilitate effective development, management, and administration of SQL Server databases. Alternatively, you can [download dbForge Unit Test](https://www.devart.com/dbforge/sql/unit-test/download.html) (alongside the entire SQL Tools bundle) and put it to good use in your daily work. Besides Unit Test, you'll get other tools and SSMS add-ins that will help you do the following:

- Write SQL code faster with context-aware completion, formatting, and refactoring
- Build queries on diagrams with no coding
- Debug SQL scripts, functions, triggers, and stored procedures
- Compare and synchronize database schemas and table data
- Generate realistic test data
- Link your databases to the most widely used version control systems
- Monitor server performance
- Fix index fragmentation
- Automate recurring tasks from the command line
- Build a CI/CD cycle to streamline your database development

Download our tools for SQL Server and see them in action; we bet your routine work will never be quite the same afterwards!

Tags [ChatGPT](https://blog.devart.com/tag/chatgpt) [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [dbForge Unit Test](https://blog.devart.com/tag/dbforge-unit-test) [SQL Server](https://blog.devart.com/tag/sql-server) [sql tools](https://blog.devart.com/tag/sql-tools) [unit test](https://blog.devart.com/tag/unit-test)
[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How dbForge Schema Compare Helped the DevOps Engineers Resolve Their Work Challenges in Complex Environments
By [Julia Lutsenko](https://blog.devart.com/author/jane-williams) June 20, 2023

Software tools are only as effective as their ability to address real issues and professional challenges. In our previous articles, we discussed [use cases where our tools helped our clients enhance workflows](https://blog.devart.com/how-a-data-engineer-streamlined-his-teams-operations-using-dbforge-sql-tools.html) and increase productivity; those tools have proved their value. This article further extends our success stories series.

Our current guest is a Principal DevOps Engineer at an enterprise company with over 1300 employees and a remarkable history of nearly 90 years on the market. While treating its profound traditions with all due respect, the company strives to ride the waves of progress and use all the possibilities that digital technologies offer. While the client has chosen to remain anonymous, they have allowed us to share their experience.

The Problem
The company has an in-house development department focused on creating software for business use as swiftly and securely as possible. Consequently, the engineers adopted the DevOps approach as the most efficient way to facilitate continuous development and delivery of their software. However, this approach came with its own challenges: the company's CI/CD pipeline was complex and demanding. It encompassed Development, Test, UAT, and Production environments and required the integration of GitLab, TeamCity, and Octopus to streamline the three-stage development process. The source database (on Microsoft SQL Server) is housed in a local GitLab repository, where the Master branch must mirror the Production environment. Regular nightly checks are conducted to alert the relevant developers of any discrepancies detected between Master and Production.
The developers pushed changes through GitLab; TeamCity executed CI/CD builds and created the deploy and rollback scripts; Octopus was then used to deploy changes across the necessary environments. While the team managed to address these challenges with the tools at their disposal, they eventually reached a point where those tools were insufficient. The team sought new solutions for three demanding tasks:

- Matching the existing requirements for pushing and deploying changes
- Maintaining complete synchronization between the Master and Production branches
- Eliminating the need to generate complex SQL deployment scripts manually

The client conducted extensive research and experimented with several third-party tools before settling on Devart's [Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/features.html), part of the comprehensive [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) package (solutions designed to overcome nearly all database development and deployment challenges). According to our client, the decision to choose Schema Compare came down to its broad functionality and exceptional support.

The Results
It did not take long to validate that choice. The tools integrated seamlessly into the existing environment thanks to the robust compatibility features of the DevOps module. In particular, Schema Compare took over the task of generating both the deploy and rollback scripts for the team, allowing the developers to concentrate on more innovative and crucial features. Following the trial period, our client was able to confidently state:

“Schema Compare is the lynchpin in our in-house developed completely automated CI/CD process for SQL Server database changes.
It allows us to automatically generate deployments and rollbacks for releases, and keeps our Master branches [equal] to Production… It is an excellent product that meets and exceeds our needs in the SQL Server CI/CD space. It is as fast, safe, and reliable…”

One more significant factor in favor of the Devart tools was the quality of technical support provided by our company. The client emphasized the importance of quick and effective responses:

“…quicker support turnaround is an obvious win. To add to that, I feel like your support team knows me a little bit and doesn’t talk to me like I’m a new user. This offers more efficiency gains in that we can often get straight into the nuts and bolts of a problem without having to go through ‘Did you try the obvious things first?’”

Now our client is completely satisfied with all the workflows supported by Schema Compare, and they strongly recommend Devart solutions to all specialists in need of reliable tools for database development.

The Tools
Let's explore the features that impressed our client the most.

Schema Compare – visualize differences
The Schema Compare tool presents comparison results in a clean graphical user interface (GUI). Users can apply various grouping and sorting options, view all discrepancies between object pairs, and select individual or multiple objects for synchronization. In addition, the tool previews deployment scripts for any object in the bottom window of the GUI. After [reviewing the comparison results](https://www.devart.com/dbforge/sql/schemacompare/features.html#analysis), users can generate a comparison report and export it in multiple formats (HTML, XLS, and XML). For instance, the HTML report contains the details of the source and target databases, a list of all comparison results, comparison and synchronization options, deployment scripts, and warnings. These reports can be saved for future reference.
Schema Compare – support for .scomp files

The Schema Compare tool supports .scomp files containing the results of schema comparisons, and it is possible to [manage those results directly](https://www.devart.com/dbforge/sql/studio/sql-server-schema-compare.html#scomp-document) in the document. The tool allows filtering, sorting, and grouping objects, as well as including and excluding them from the file. It also displays the differences in data definition language (DDL) between corresponding objects. Working with such files in the tool is much simpler than writing complex commands. Furthermore, these .scomp files can be saved and reused. In particular, users can load settings from these files to automate deployment routines through the command-line interface.

DevOps Automation – a versatile module

The SQL Tools package includes the [DevOps module](https://www.devart.com/dbforge/sql/database-devops/), which enables users to combine several dbForge tools into a single toolchain. This toolchain is designed to build, test, and deploy databases safely and effectively. The Schema Compare tool is just one component of that toolchain, while the functionality of the DevOps Automation module covers all tasks related to building CI/CD pipelines.

Try SQL Tools for free!

If you are seeking powerful and reliable tools to enhance your DevOps workflows and all other aspects of your work, the dbForge tools developed for SQL Server experts are the ideal choice. Besides Schema Compare, which our client rated highly, the package includes solutions for effective database development, management, analysis, and administration. The entire toolset is available as a [fully functional Free Trial](https://www.devart.com/dbforge/sql/sql-tools/download.html), allowing you to evaluate its performance and results under full workload conditions for 30 days.
Tags: [dbForge SQL Tools](https://blog.devart.com/tag/dbforge-sql-tools) [SQL Server](https://blog.devart.com/tag/sql-server) [success story](https://blog.devart.com/tag/success-story)

By [Julia Lutsenko](https://blog.devart.com/author/jane-williams), a technical writer with a strong background in Linguistics who specializes in creating clear and well-researched technical content.

How dbForge SQL Complete Enhanced the Collaboration Within Two Development Teams at a Finland-Headquartered Tech Company

[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) | By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters), March 29, 2023

It is always interesting to explore a case where dbForge tools are used in a collaborative process. Take [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), for instance. It was created to make individual SQL developers exceptionally productive and help them write error-free code much faster with integrated autocompletion, debugging, and refactoring. Yet it was also designed with teamwork in mind—shareable templates, code snippets, and formatting profiles are vital when it comes to building up team productivity. One of our clients (who preferred to stay anonymous), a seasoned Lead Developer at a Finnish tech company, quickly noted that collaborative potential. The primary database toolset adopted by the client and his colleagues was SQL Server Management Studio—a proven solution, to be sure, but with room for improvement. A few obstacles became evident over time:

- It became necessary to boost the overall performance of the development teams by speeding up their routine SQL coding.
- At the same time, they wanted to do less manual work and keep a sharp focus on what they were doing.
- The default functionality of SSMS wasn’t quite enough to expand the developers’ capabilities and make them more flexible in daily work.
- There was a need to establish unified code formatting standards for all projects, so that any developer switching to another project could delve into its code as fast as possible.

It is also worth noting that the client didn’t want to adopt an alternative toolset; instead, he was looking for something to properly enhance SSMS with. After some research, he found the solution in our flagship SSMS add-in, [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/). This is what he told us during an interview:

“I act as a lead developer on multiple projects, and I’ve been working with databases quite a bit. And at first, I just looked for a plugin to use myself. I pretty much just googled the best plugin for SSMS and went through the results. I went through people’s opinions on different plugins, compared them, and saw what could be achieved with their features. Then I decided that you had the best match, exactly what I was looking for. And when I tested SQL Complete, I pretty much figured out that it could be used by a much broader team at our company.”

Eventually, the client presented the solution to his colleagues, and it didn’t take long until SQL Complete was employed by two different development teams, comprising 10 people in total. Needless to say, the expected performance growth didn’t keep them waiting. The client noted the following features as especially valuable.
Context-aware code completion and debugging

Probably the first thing that comes to mind when you think of SQL Complete is its selection of coding assistance tools: IntelliSense-like code completion; an integrated Debugger for scripts, functions, triggers, and procedures; and, of course, smart refactoring with auto-correction of references to renamed objects.

Customizable SQL formatting

The built-in SQL formatting features help beautify the code and improve its readability. A rich set of configurations lets users effectively set up and apply formatting standards, edit the available templates, or create custom ones. It is also possible to format an entire script or only particular parts of it.

Predefined and custom SQL snippets

Another feature that often proves helpful for teamwork is the collection of predefined snippets that can be quickly modified, applied, and shared with fellow developers. And of course, it can be further expanded with custom snippets—the user is in full control here.

Get dbForge SQL Complete for a free 14-day trial today!

This is where our story ends—yet this is where yours is about to begin. You can easily try it all out yourself; simply [download SQL Complete for a free trial](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and see all of its capabilities in action.

Tags: [dbforge](https://blog.devart.com/tag/dbforge) [dbforge sql complete](https://blog.devart.com/tag/dbforge-sql-complete) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server)

By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters). Writer, translator, editor, coffee-loving wordsmith. Explaining complex things in simple words.
{"url": "https://blog.devart.com/how-dbforge-studio-for-mysql-is-involved-in-the-devops-process.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How dbForge Studio for MySQL is Involved in the DevOps Process By [dbForge Team](https://blog.devart.com/author/dbforge) September 21, 2020 [0](https://blog.devart.com/how-dbforge-studio-for-mysql-is-involved-in-the-devops-process.html#respond) 2656 When developing a software application based upon complex database structure, it is often necessary to run diverse business logic test scenarios on the same input data. That’s where the problem of repeated recovery of test data arises. This article offers the simplest way to automate the process using only one tool – [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) . Test data recovery is a time-consuming and labor-intensive process usually done manually. Therefore, its automation is essential in assuring a fast, reliable, and cost-effective delivery. But before embarking on a detailed walkthrough on how to automate the test data recovery process with the help of dbForge Studio for MySQL, let’s first gloss over some terms. What is DevOps? DevOps is quite new and rapidly spreading concept that describes a set of practices aimed at automating software development and delivery processes so that development and IT teams could shorten the entire service lifecycle and provide continuous delivery of high-quality software. Why DevOps for databases? Database changes are the main reason for delay when performing application deployments. Thus, database development automation aimed at increasing the speed of delivery of database changes leads to shorter iterations and definitely has a positive impact on the continuous delivery of software applications. 
How to automate test data recovery with dbForge Studio for MySQL

Background: Suppose that our database source code is located in a remote repository, and the test data for it has to be generated or taken from other sources such as a script folder, another database, or a file.

Prerequisites:

- dbForge Studio for MySQL installed on the machine
- The necessary template files previously configured with dbForge Studio for MySQL (.scomp, .dcomp, .dit, and/or .dgen)

Step 1. Clone a repository

Suppose that a remote Git repository contains a scripts folder we need to use as a data source for creating our database. First of all, we need to clone this remote repository to a local folder. In this worked example, we will clone the repository to D:\Temp\DevOps_MySQL\. Below is the CMD command for this operation:

```
git clone https://github.com/repository-name/sakila.git D:\Temp\DevOps_MySQL
```

Where:

- repository-name – the name of the repository you want to clone
- sakila – the name of the database containing the test data you want to use

Note: Before running the above command, make sure that the Git for Windows client is installed on your machine.

Step 2. Create or recreate the database on the server

Before deploying the database on the server, you need to create or recreate the database (drop the old database and create a new one). That can be easily done with a database creation script, CMD, and dbForge Studio for MySQL. We are not going to labor the point in this article, as these operations are pretty basic. To create the sakila database on the server, we’ll use the following command-line script:

```
dbforgemysql.com /execute /connection:"User Id=%user-name%;password=%your-password%;Host=%your-host%" /inputfile "D:\Temp\DevOps_MySQL\Create_sakila2.sql"
```

Please note that you need to enter your own credentials to make this script work.
For those who prefer the PowerShell environment, we provide the PS script as well:

```
.\dbforgemysql.com /execute /connection:"User Id=%user-name%;password=%your-password%;Host=%your-host%" /inputfile "D:\Temp\DevOps_MySQL\Create_Sakila2.sql"
if ($? -eq $true) { Write-host "Database sakila2 created" -ForegroundColor Cyan } else { Write-host "Error" -ForegroundColor Yellow }
```

Again, don’t forget to add your own credentials to the script.

Step 3. Synchronize schemas between your local and remote databases

Below are the CMD and PS scripts to synchronize the schema of your empty local sakila2 database with the remote sakila database. As a result of synchronization, all the tables, views, procedures, functions, and triggers that you selected when configuring the template will be created.

CMD

```
dbforgemysql.com /schemacompare /compfile:"D:\Temp\DevOps_MySQL\sakila vs sakila2.scomp" /sync
@echo %ERRORLEVEL%
```

PS

```
.\dbforgemysql.com /schemacompare /compfile:"D:\Temp\DevOps_MySQL\sakila vs sakila2.scomp" /sync
if ($? -eq $true) { Write-host "Objects for sakila2 created" -ForegroundColor Cyan } else { Write-host "Error" -ForegroundColor Yellow }
```

Note: The scripts provided imply that you’ve previously configured and saved the schema comparison template file (.scomp) to synchronize the specified databases. Please don’t forget to use the [Schema Comparison functionality](https://www.devart.com/dbforge/mysql/studio/database-synchronization.html) of dbForge Studio for MySQL to prepare the file in question.

Step 4. Fill the database with data

In the previous step, we restored the database structure. All we need now is to populate our database with test data. dbForge Studio for MySQL offers three different tools you can use to fill the database with data. Since all three can be invoked from the command line and, thus, can be used in the database automation process, we will consider each of them separately.
Populate the database using the Data Comparison functionality

You can easily restore the reference data located in your script folder or in another database using the Data Comparison functionality built into dbForge Studio for MySQL. Use the following scripts to insert data into the country table:

CMD

```
dbforgemysql.com /datacompare /compfile:"D:\Temp\DevOps_MySQL\country.dcomp" /sync
@echo %ERRORLEVEL%
```

PS

```
.\dbforgemysql.com /datacompare /compfile:"D:\Temp\DevOps_MySQL\country.dcomp" /sync
if ($? -eq $true) { Write-host "Data inserted in sakila2" -ForegroundColor Cyan } else { Write-host "Error" -ForegroundColor Yellow }
```

Note: The scripts above imply that you’ve previously configured and saved the data comparison template file (.dcomp) to synchronize data in the specified databases. Please don’t forget to prepare the file in question before running the script.

Populate the database using the Data Import/Export functionality

Data Import is the best option to fill the database with data if the reference data is stored not in the database itself but in separate files. dbForge Studio for MySQL provides a simple and effective way to import this data directly into the tables you need, using the [Data Import/Export functionality](https://www.devart.com/dbforge/mysql/studio/data-export-import.html). Use the following scripts to insert data into the address table:

CMD

```
dbforgemysql.com /dataimport /templatefile:"D:\Temp\DevOps_MySQL\address.dit"
@echo %ERRORLEVEL%
```

PS

```
.\dbforgemysql.com /dataimport /templatefile:"D:\Temp\DevOps_MySQL\address.dit"
if ($? -eq $true) { Write-host "Data inserted in sakila2" -ForegroundColor Cyan } else { Write-host "Error" -ForegroundColor Yellow }
```

Note: The scripts above imply that you’ve previously configured and saved the data import template file (.dit) to import data into the specified tables.
Please don’t forget to prepare the file in question before running the script.

Populate the database using the Data Generator functionality

dbForge Studio for MySQL offers powerful [functionality for generating data](https://www.devart.com/dbforge/mysql/studio/data-generator.html). It is most suitable in situations where you need to populate the database with large amounts of realistic data but cannot or don’t want to store this huge array of data on your disk. In this case, again, having previously configured the data generator template file (.dgen), you can use the following scripts to insert data into the actor table:

CMD

```
dbforgemysql.com /generatedata /projectfile:"D:\Temp\DevOps_MySQL\actor.dgen"
@echo %ERRORLEVEL%
```

PS

```
.\dbforgemysql.com /generatedata /projectfile:"D:\Temp\DevOps_MySQL\actor.dgen"
if ($? -eq $true) { Write-host "Data inserted in sakila2" -ForegroundColor Cyan } else { Write-host "Error" -ForegroundColor Yellow }
```

Note: It is highly important to configure and save the data generator template file (.dgen) before running the scripts provided above.

All the scripts provided in the article return an exit status. As a result, if you combine the three scripts – for restoring the database structure, synchronizing database schemas, and populating the database – you’ll get a MySQL database automation script for restoring test data that lets you monitor and control the execution stages.

Conclusion

dbForge Studio for MySQL is a feature-rich IDE that lets you extend the DevOps approach to your MySQL and MariaDB database development and deployment without much effort. [Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) and watch your app development soar.
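The stage-by-stage exit-status checks described in this walkthrough can be wired into a single driver. The Python sketch below is an illustration only (the dbForge commands, credentials, and paths are the same placeholders used in the article); it runs each stage in order and stops at the first non-zero exit status, mirroring the `@echo %ERRORLEVEL%` and `$?` checks in the CMD and PS scripts:

```python
import subprocess

def run_stage(name, cmd):
    """Run one pipeline stage; return True when it exits with status 0."""
    result = subprocess.run(cmd, capture_output=True)
    ok = result.returncode == 0
    print(f"{name}: {'OK' if ok else 'FAILED'}")
    return ok

def run_pipeline(stages):
    """Execute (name, command) stages in order; return the name of the
    first failed stage, or None if every stage succeeded."""
    for name, cmd in stages:
        if not run_stage(name, cmd):
            return name
    return None

# Placeholder commands from the article; adjust credentials, paths,
# and the repository URL to your environment before running.
STAGES = [
    ("clone", ["git", "clone",
               "https://github.com/repository-name/sakila.git",
               r"D:\Temp\DevOps_MySQL"]),
    ("create db", ["dbforgemysql.com", "/execute",
                   "/connection:User Id=%user-name%;password=%your-password%;Host=%your-host%",
                   "/inputfile", r"D:\Temp\DevOps_MySQL\Create_sakila2.sql"]),
    ("sync schema", ["dbforgemysql.com", "/schemacompare",
                     r"/compfile:D:\Temp\DevOps_MySQL\sakila vs sakila2.scomp", "/sync"]),
    ("load data", ["dbforgemysql.com", "/datacompare",
                   r"/compfile:D:\Temp\DevOps_MySQL\country.dcomp", "/sync"]),
]

if __name__ == "__main__":
    failed = run_pipeline(STAGES)
    print("pipeline OK" if failed is None else f"stopped at: {failed}")
```

Running the stages from one script keeps the whole test-data recovery process observable: the return value tells you exactly which stage to rerun after a failure.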
Tags: [Manuals](https://blog.devart.com/tag/manuals) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial)

How Do SSIS Expressions Work?

[SSIS Components](https://blog.devart.com/category/products/ssis-components)
By [Victoria Shyrokova](https://blog.devart.com/author/victorias), September 24, 2024

Using SQL Server Integration Services, you can perform all kinds of operations on your company’s data. Through a user-friendly environment with a relatively easy learning curve, you can create a vast range of complex and scalable data-rich solutions. Moreover, all of this is done within an easy-to-use IDE, where coding is often unnecessary for most common tasks. Fitting naturally into Microsoft environments, [SSIS](https://www.devart.com/ssis/what-is-ssis.html) is also capable of connecting with diverse data sources and destinations. These integration services work seamlessly for highly heterogeneous solutions. For instance, you can load data from flat files, Excel, or other databases into a SQL Server database and then set an output for equally diverse destinations. The scalability potential of SSIS makes this connectivity solution a perfect choice for third-party development and flexible integration into the original ecosystem. This flexibility stems mainly from the ability to use SSIS expressions, which you can write to return specific values, calculate values on the fly while executing the [SSIS package](https://www.devart.com/ssis/ssis-packages.html), and perform other advanced data manipulations. As a result, the execution of the data flow is directed dynamically according to the incoming data or environment variables. In this article, we’ll overview SSIS expressions, which are commonly used to parameterize connection strings. We’ll also provide a list of operators and functions you can use to streamline your SSIS connectivity, and show how to use SSIS expression syntax in a Data Flow.
Table of contents

- SSIS Expression Syntax
- SSIS Examples
- How to Use SSIS Expressions
- How Devart Components Extend the Capabilities of Standard SSIS Tasks
- Conclusion

SSIS Expression Syntax

An expression is a user-defined code snippet that returns a value. Expressions often include functions, operators, identifiers, symbols, and literals that are then used as parameters for a connection. Let’s overview some of the most common syntax elements you can use to write SSIS expressions.

Operators

You can use these operators to define what action you want to perform with the data. Check the list below to make sure you are aware of all the options.

| Operator | Description |
| --- | --- |
| + | Adds two numeric expressions. |
| + | Concatenates two expressions. |
| – | Subtracts the second numeric expression from the first one. |
| – | Negates a numeric expression. |
| * | Multiplies two numeric expressions. |
| / | Divides the first numeric expression by the second one. |
| % | Provides the integer remainder after dividing the first numeric expression by the second one. |
| () | Identifies the evaluation order of expressions. |
| == | Performs a comparison to determine if two expressions are equal. |
| != | Performs a comparison to determine if two expressions are not equal. |
| > | Performs a comparison to determine if the first expression is greater than the second one. |
| < | Performs a comparison to determine if the first expression is less than the second one. |
| >= | Performs a comparison to determine if the first expression is greater than or equal to the second one. |
| <= | Performs a comparison to determine if the first expression is less than or equal to the second one. |
| && | Performs a logical AND operation. |
| \|\| | Performs a logical OR operation. |
| ?: | Returns one of two expressions based on the evaluation of a Boolean expression. |
| & | Performs a bitwise AND operation of two integer values. |
| \| | Performs a bitwise OR operation of two integer values. |
| ^ | Performs a bitwise exclusive OR operation of two integer values. |
| ~ | Performs a bitwise negation of an integer. |
| ! | Negates a Boolean operand. |

Functions

There is a wide range of functions available in SSIS, separated into different types depending on the type of data they handle. Familiarize yourself with the most common ones you can use to build an expression.

Math functions

| Function | Parameters | Description |
| --- | --- | --- |
| ABS | (NumericExpression) | Returns the absolute, positive value of a numeric expression. |
| CEILING | (NumericExpression) | Returns the smallest integer greater than or equal to a numeric expression. |
| EXP | (NumericExpression) | Returns the exponential of the specified expression. |
| FLOOR | (NumericExpression) | Returns the largest integer that is less than or equal to a numeric expression. |
| LN | (NumericExpression) | Returns the natural logarithm of a numeric expression. |
| LOG | (NumericExpression) | Returns the base-10 logarithm of a numeric expression. |
| POWER | (NumericExpression, Power) | Returns the result of raising a numeric expression to a power. The power parameter must evaluate to an integer. |
| ROUND | (NumericExpression, Length) | Returns the integer that is closest to a given value. The length parameter must evaluate to an integer. |
| SIGN | (NumericExpression) | Returns the positive (+1), negative (-1), or zero (0) sign of a numeric expression. |
| SQUARE | (NumericExpression) | Returns the square of a numeric expression. |
| SQRT | (NumericExpression) | Returns the square root of a numeric expression. |

String functions

| Function | Parameters | Description |
| --- | --- | --- |
| CODEPOINT | (CharacterExpression) | Returns the Unicode code value of the leftmost character of a character expression. |
| FINDSTRING | (CharacterExpression, String, Occurrence) | Returns the location of the specified occurrence of a string within a character expression. |
| HEX | (CharacterExpression) | Returns a string representing the hexadecimal value of an integer. |
| LEFT | (CharacterExpression, Number) | Returns the left part of a character expression with the specified number of characters. |
| LEN | (CharacterExpression) | Returns the number of characters in a character expression. |
| LOWER | (CharacterExpression) | Returns a character expression after converting uppercase characters to lowercase characters. |
| LTRIM | (CharacterExpression) | Returns a character expression after removing leading spaces. |
| REPLACE | (CharacterExpression, SearchExpression, ReplaceExpression) | Returns a character expression after replacing a character string within the expression with either a different character string or an empty string. |
| REPLICATE | (CharacterExpression, Times) | Returns a character expression that is replicated a number of times. |
| REVERSE | (CharacterExpression) | Returns a character expression in reverse order. |
| RIGHT | (CharacterExpression, Number) | Returns the right part of a character expression with the specified number of characters. |
| RTRIM | (CharacterExpression) | Returns a character expression after removing trailing spaces. |
| SUBSTRING | (CharacterExpression, Start, Length) | Returns the part of a character expression that starts at the specified position and has the specified length. |
| TOKEN | (CharacterExpression, DelimiterExpression, Occurrence) | Returns the specified occurrence of a token in a string. A token may be marked by a delimiter in a specified set of delimiters. Returns an empty string if not found. |
| TOKENCOUNT | (CharacterExpression, DelimiterExpression) | Returns the number of tokens in a string. A token may be marked by a delimiter in a specified set of delimiters. |
| TRIM | (CharacterExpression) | Returns a character expression after removing leading and trailing blanks. |
| UPPER | (CharacterExpression) | Returns a character expression after converting lowercase characters to uppercase characters. |

Time functions

| Function | Parameters | Description |
| --- | --- | --- |
| DATEADD | (DatePart, Number, Date) | Returns a new DT_DBTIMESTAMP value after adding a number that represents a date or time interval to the specified datepart in a date. |
| DATEDIFF | (DatePart, StartDate, EndDate) | Returns the number of date and time boundaries crossed between two specified dates. The datepart parameter identifies which date and time boundaries to compare. |
| DATEPART | (DatePart, Date) | Returns an integer representing a datepart of a date. |
| DAY | (Date) | Returns an integer that represents the day datepart of a date. |
| GETDATE | – | Returns the current date of the system. |
| GETUTCDATE | – | Returns the current date of the system in UTC (Coordinated Universal Time, or Greenwich Mean Time). |
| MONTH | (Date) | Returns an integer that represents the month datepart of a date. |
| YEAR | (Date) | Returns an integer that represents the year datepart of a date. |

NULL functions

| Function | Parameters | Description |
| --- | --- | --- |
| ISNULL | (Expression) | Returns a Boolean result based on whether an expression is null. |
| REPLACENULL | (Expression, Expression) | Returns the value of the second expression parameter if the first expression parameter is null. |
| NULL | (DataType) | Returns a NULL value of the data type passed as a parameter. |

Type casts

| Function | Parameters | Description |
| --- | --- | --- |
| (DataType) | Expression to convert | Returns the expression cast to the type passed as a parameter. You can convert to any data type supported by SSIS to properly manage the data types inside a solution. |

SSIS Examples

To better understand how SSIS expressions are built, here are several examples to explore in detail.

Concatenate FirstName and LastName to obtain the complete name

Since we have two separate strings and an operator that concatenates the values, we can use it to get the desired output.
Expression: [First Name] + " " + [Last Name]
FirstName: "Giancarlo"
LastName: "Milano"
Output: "Giancarlo Milano"

Check if the LastName is empty, returning False or True

In this example, we determine whether two expressions are equal by looking for an empty string. Since the LastName isn't empty, we get a False output.

Expression: [Last Name] == ""
LastName: "Milano"
Output: False

If the LastName is empty, the output will return True.

Expression: [Last Name] == ""
LastName: ""
Output: True

Check if the Age value is over 18 years

For age validation, you can easily use the following comparison operator.

Expression: Age >= 18
Age: 35
Output: True

Check if the Age value is between 18 and 28 years

Alternatively, you can check if the value falls within a certain age range.

Expression: Age >= 18 && Age <= 28
Age: 25
Output: True

These are only a few examples of using SSIS expressions to perform operations on the data you pull. Note that there are more operators you can explore yourself using the same syntax.

How to Use SSIS Expressions

Let's create a practical example using several expressions we've already discussed to understand how to apply them in real-world scenarios and [SSIS tasks](https://www.devart.com/ssis/ssis-tasks.html). Follow the guide below to see how the expressions are used.

1. Create a new SSIS Project and add a Sequence Container. Within the Sequence Container, place a Data Flow Task. This will establish the fundamental environment for trying our examples in action.
2. Open the Data Flow Task and construct the following layout: a Data Source, a Data Output, and a Derived Column element between them to test SSIS expressions. You can use a simple file input and output to streamline the demonstration, as shown in the picture below, but any other options are also valid. Here's a prepared dataset to serve as input.
3. Double-click the Derived Column to enter it.
Let's create some expressions to get different results from the source data. Highlight the columns that you want to modify. The rest will remain as the default values suggested by SSIS.

| Derived Column Name | Expression |
| --- | --- |
| FullName | [First Name] + " " + [Last Name] |
| Email | [Last Name] + [First Name] + "_" + Country + "@" + "school.org" |
| Age | DATEDIFF("yy",(DT_DBTIMESTAMP)BirthDate,GETDATE()) |
| IsOver18Years | DATEDIFF("yy",(DT_DBTIMESTAMP)BirthDate,GETDATE()) > 18 |

Press OK to complete the configuration. The final package will generate a flat file output with the data generated based on the expressions you have created.

How Devart Components Extend the Capabilities of Standard SSIS Tasks

Using SSIS expressions can save you significant time, as you do not have to pull all the data, accelerating integration. However, there are even more options to facilitate the process. Check out the SSIS Data Flow Components from Devart to simplify your work with sources and destinations, and to perform lookups without actually pulling the data to see what results you'll be getting, significantly reducing server round-trips. Try [SSIS Data Flow Components](https://www.devart.com/ssis/) from Devart for accelerated integration with cloud applications and databases!

Conclusion

With SSIS expressions, you can easily handle many possible operations with data. In certain scenarios and tools, writing a simple expression can help you avoid complex constructions and data manipulations, reduce the workload, and accelerate operations. Moreover, you can also use SSIS expressions to automate various tasks: generate new data from existing data, perform data validation, and direct data flow based on the result of expressions. Combined with the [SSIS Data Flow Components](https://www.devart.com/ssis/), this can significantly enhance your data integration speed.
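As a quick sanity check outside SSIS, the derived columns from the walkthrough above can be approximated in Python. This is illustrative only: the sample names come from the article's examples, and SSIS DATEDIFF("yy") is modeled here as a simple difference of year parts (year boundaries crossed).

```python
# Rough Python re-creation of the Derived Column expressions from the
# walkthrough above (illustrative; not SSIS code).
from datetime import date

def derive(first_name, last_name, country, birth_date, today=date(2024, 1, 1)):
    full_name = first_name + " " + last_name
    email = last_name + first_name + "_" + country + "@" + "school.org"
    age = today.year - birth_date.year  # approximates DATEDIFF("yy", BirthDate, GETDATE())
    return {
        "FullName": full_name,
        "Email": email,
        "Age": age,
        "IsOver18Years": age > 18,
    }

row = derive("Giancarlo", "Milano", "Italy", date(1990, 5, 2))
```

Running a row through a sketch like this makes it easy to predict what the Derived Column transformation will emit before executing the package.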
Tags [SSIS](https://blog.devart.com/tag/ssis) [SSIS Data Flow Components](https://blog.devart.com/tag/ssis-data-flow-components) [SSIS Integration](https://blog.devart.com/tag/ssis-integration) [SSIS Tutorial](https://blog.devart.com/tag/ssis-tutorial) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow."} {"url": 
"https://blog.devart.com/how-the-integration-of-hubspot-and-excel-powers-business-growth.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Products](https://blog.devart.com/category/products) [Excel Add-ins](https://blog.devart.com/category/products/excel-addins) How the Integration of HubSpot and Excel Powers Business Growth By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) March 26, 2024 [0](https://blog.devart.com/how-the-integration-of-hubspot-and-excel-powers-business-growth.html#respond) 1456 In today's business environment, CRM systems and data analysis tools are essential for achieving growth. HubSpot and Microsoft Excel are two such solutions actively utilized by businesses. HubSpot is a comprehensive CRM platform for managing marketing, sales, and customer service, and the importance of Excel is just as indisputable. This blog post explores the integration of HubSpot and Excel and how it can help businesses improve their operations. We'll show you the easiest way to achieve this integration using the Excel Add-in for HubSpot by Devart. This feature-rich add-in lets you extract and analyze data from HubSpot without leaving the Excel environment, making data management and analysis more efficient and streamlined. [Listen to the Devart podcast](https://blog.devart.com/wp-content/uploads/2024/07/ElevenLabs_2024-07-04T07_54_31_How-the-Integration-of-HubSpot-and-Excel-Powers-Business-Growth.mp3) to learn how the integration of HubSpot and Excel powers business growth. 
Contents

- About HubSpot
- About Microsoft Excel
- Benefits of integration between HubSpot and Excel
- Best solution for integration: Devart Excel Add-in for HubSpot
- How to install and connect the Excel Add-in for HubSpot
- How to preview and import data from HubSpot to Excel
- How to bulk edit HubSpot data in Excel Online
- Conclusion: The advantages of Devart Excel Add-ins

About HubSpot

[HubSpot](https://hubspot.com/) is a powerful and comprehensive CRM system designed to help businesses streamline their marketing, sales, and customer service processes. It is a cloud-based software product that offers a wide range of features and capabilities. With HubSpot, businesses can create and execute marketing campaigns, automate sales processes, and provide excellent customer service. The platform offers a range of tools for inbound marketing, including social media management, email marketing, and content creation. It also includes sales automation features such as lead management, deal tracking, and sales analytics. The customer service module of HubSpot allows businesses to track interactions, manage customer inquiries, and provide personalized support. The platform's unified database allows businesses to have a complete view of their customers, from their contact information to their purchase history.

About Microsoft Excel

[Microsoft Excel](https://www.microsoft.com/en-us/microsoft-365/excel) is a well-known spreadsheet application, everyone's go-to solution for creating spreadsheets, performing calculations, and visualizing data using charts and graphs. One of the most significant features of Excel is its ability to handle large amounts of data. Users can manage and organize data using filters, sorting, and formatting. Excel also offers a range of functions and formulas for data analysis, making it an ideal tool for financial modeling, forecasting, and trend analysis. Furthermore, Excel helps users create dynamic visualizations of data, including charts, graphs, and pivot tables. 
All of these features make it easier to analyze data and identify patterns and trends, providing valuable insights for decision-making. The capabilities of Excel make it an ideal tool for businesses of all sizes and industries.

Benefits of integration between HubSpot and Excel

The integration between HubSpot and Excel creates a synergy that can significantly enhance data analysis, reporting, and decision-making processes. One of the significant benefits of integrating HubSpot and Excel is the ability to create detailed reports that help businesses identify trends, patterns, and opportunities. Excel's advanced data manipulation and visualization tools allow businesses to analyze data from multiple sources, including HubSpot, and create custom reports and dashboards that provide an in-depth view of their operations.

Another benefit of the integration is the ability to streamline business processes. For example, using the [Excel Add-in for HubSpot](https://www.devart.com/excel-addins/hubspot/), businesses can easily make use of the following data:

- Sales data: import sales performance data to track trends, identify top-performing products, and calculate sales forecasts.
- Marketing analytics: analyze marketing campaign effectiveness, email open rates, and lead conversion metrics.
- Customer service metrics: review customer satisfaction scores, ticket volumes, and resolution times to improve service strategies.

Additionally, the integration between HubSpot and Excel can help with budgeting and forecasting. By importing financial data from HubSpot and combining it with other data sources, businesses can create accurate and detailed financial models that enable them to make informed decisions about their operations. Overall, the integration between HubSpot and Excel offers a range of benefits for businesses looking to improve their operations and decision-making processes, ultimately resulting in increased productivity and profitability. 
Best solution for integration: Devart Excel Add-in for HubSpot

The Excel Add-in for HubSpot by Devart is a useful tool that helps manage HubSpot data directly from Microsoft Excel, facilitating data management and analysis. It provides real-time data synchronization, ensuring that users always have the latest information at hand. It also supports custom fields and advanced filtering and sorting. All in all, this add-in is an ideal solution for businesses of all sizes and industries looking to streamline the management and analysis of their HubSpot data.

How to install and connect the Excel Add-in for HubSpot

Here's an illustrated step-by-step guide on how to install and configure the Devart Excel Add-in for HubSpot.

Installation

Devart Excel Add-ins are available for a variety of databases and cloud data sources. First, go to the corresponding [download page](https://www.devart.com/excel-addins/hubspot/download.html) to get the installer. Then run the downloaded installer and follow the wizard's instructions.

Configuration

Once you run the newly installed Add-in for HubSpot, you'll need to choose an authentication type. For this guide, we will proceed with private app authorization, which requires an Access Token for connection.

Private app authorization

If you require access to specific data from your HubSpot account via an API, consider creating a private app. This involves setting up the app in HubSpot, configuring permissions, and generating an Access Token, which will be used to create a private app connection in the Excel Add-in.

1. Log in to your HubSpot account and go to Settings.
2. Go to Integrations > Private apps and click Create a private app.
3. Fill in your app's basic info, upload a logo, and provide a description.
4. On the Scopes tab, select the required permissions and create the app.
5. Use the Show token button to get your Access Token for the connection. 
Now you can use this Access Token to establish a connection specifically for the newly created private app. To do that, go to Excel, proceed to the settings of the add-in, and enter the Access Token in the corresponding field. To ensure that the connection has been successfully set up and is functioning correctly, it's recommended to perform a test. You can easily do this by clicking the Test button in the same settings. This action will verify the integrity of the connection between Excel and your HubSpot account using the private app credentials you've provided.

How to import data from HubSpot to Excel

Now let's see how to import data from HubSpot into an Excel worksheet.

1. After setting all the necessary connection parameters, click Next, which will open the Select data source object and set filter window. In this window, you can choose the table to be imported, customize conditions and filters for data import, and specify the sorting order. You can use the visual Query Builder to create a query or switch to the SQL Query tab to get the required data by writing a SELECT query. You can customize your query in the following ways:
- Conditions & Filters: specify any conditions and filters to tailor the import. This could include date ranges, specific properties, or customer segments.
- Sort Order: determine the sorting order of your data, such as ascending or descending by date, name, or any other relevant field.
2. After selecting a table or crafting a SQL query, click Next.
3. You will proceed to the Preview window, where you can check your data and choose whether you want to import it to a new or existing worksheet.
4. Click Finish to complete the data import process. The data will be retrieved and displayed in the indicated Excel worksheet.

How to bulk edit HubSpot data in Excel Online

To reduce manual work, you can easily bulk edit HubSpot data in Excel Online. Let's show you how it's done with contacts and deals. 
Bulk editing of contacts

Bulk editing contacts in Excel Online allows you to quickly update multiple records simultaneously. Follow these steps to modify contact information such as first names, cities, and job titles.

1. To begin editing, click Edit Mode.
2. To modify contact data, locate the required columns and edit the desired entries. For example, let's change a contact's first name to "Maya". Now let's find the column that lists the cities of our contacts and update a few city names. We might just as well scroll to job titles and edit them to reflect any new positions. Modified cells are highlighted in olive by default, helping you track which data has been updated. If preferred, you can customize the highlighting colors via the Appearance section in the Options menu.
3. Once you have finished editing, click Commit.

Bulk editing of deals

Modifying deal information directly in Excel and syncing changes back to HubSpot is just as easy. For example, we will locate a deal named "EcoGuardian Tech" and change its name to "EcoGuardian Tech New". Additionally, we will update the deal stage to "Qualified to Buy" by editing the respective column. To get our edits accurately reflected in HubSpot, we click Commit. Any other deal can be edited in much the same way.

Once you commit your changes, wait a moment, then go to HubSpot to make sure your edits are all there. In our case, everything has been updated as intended. Here is an edited contact. And here is an edited deal.

Conclusion: The advantages of Devart Excel Add-ins

Devart Excel Add-ins are powerful tools that provide businesses of all sizes with simplified data management, analysis, and reporting. HubSpot is just one of the possible options; there is a huge variety of supported data sources. 
Let's briefly recap some key advantages of utilizing Devart Excel Add-ins to manage your data:

- Streamlined data management: Excel Add-ins simplify repetitive tasks, reducing manual data entry into various sources to something as quick and simple as typing in a spreadsheet.
- Enhanced collaboration: Excel Add-ins facilitate collaboration among team members. Real-time data sharing and synchronization ensure that everyone has access to the latest information.
- Multitude of supported data sources: currently, Excel Add-ins cover 25+ data sources, including the most popular database systems, major CRM platforms, and other services.

Now you can see that managing your CRM data is as easy as editing a spreadsheet. We encourage you to explore the rich selection of [Devart Excel Add-ins](https://www.devart.com/excel-addins/) and see them in action firsthand. Tags [hubspot](https://blog.devart.com/tag/hubspot) [Hubspot Excel integration](https://blog.devart.com/tag/hubspot-excel-integration) [Hubspot integration](https://blog.devart.com/tag/hubspot-integration) [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations. 
"} {"url": "https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) How to Access HubSpot Data Source from Power BI, Tableau, and Excel Using the ODBC Driver for HubSpot By [DAC 
Team](https://blog.devart.com/author/dac) September 29, 2022 [0](https://blog.devart.com/how-to-access-hubspot-data-source-from-powerbi-tableau-excel.html#respond) 2858

This article explains how to connect to HubSpot and retrieve its data using Tableau, Excel, and Power BI together with the ODBC Driver for HubSpot. Devart ODBC Driver for HubSpot is a high-performance connectivity solution with enterprise-level functionality. With it, you can connect to HubSpot from ODBC-compliant reporting, analytics, BI, and ETL applications on 32-bit and 64-bit Windows. The driver offers simple and safe access to real-time HubSpot data from any location and fully supports the standard ODBC API functions and data types.

Key Features

Extended SQL Syntax. Thanks to the ODBC driver, working with HubSpot objects is as simple as working with SQL tables. With the enhanced SQL syntax, you may use everything SQL offers in SELECT statements while remaining compatible with SQL-92.

DML Operations. In HubSpot, data may be updated in much the same manner as in SQL databases, thanks to the DML (INSERT, UPDATE, DELETE) support provided by the Devart ODBC Driver for HubSpot.

Bulk Updates. With the driver, changing large amounts of data in HubSpot becomes more streamlined and faster, and you can perform bulk changes using SQL statements.

ODBC Conformance. The driver offers complete support for ODBC data types and ODBC API functions, and accommodates advanced connection string options. As a result, HubSpot may be accessed from any ODBC-compatible environment, including desktop and web apps.

Compatibility with HubSpot. The driver fully supports the HubSpot API data types and communicates directly with the HubSpot application programming interface. Further on, we will provide detailed instructions on how to connect to HubSpot via the ODBC driver. 
Advanced Data Conversion. We have developed data conversion mechanisms that allow bidirectional mapping between ODBC and HubSpot data types.

Integration. The driver works with popular integrated development environments (IDEs) and various third-party tools such as Microsoft Excel, Visual Studio, and Power BI. See [Compatibility](https://docs.devart.com/odbc/hubspot/compatibility.htm) for a full rundown of supported platforms and tools.

Wide Range of Platforms. There's no need to tweak the driver, software, or settings to use the Devart ODBC Driver for HubSpot: it works on both 32-bit and 64-bit systems.

Fully Unicode Driver. Our ODBC driver is fully Unicode, which allows you to retrieve and work with data from HubSpot databases that include various languages and scripts, regardless of the environment's default charset.

Excellent Performance. ODBC driver features such as local data caching, connection pooling, and query optimization speed up any interaction with HubSpot significantly.

Connecting to HubSpot Using ODBC Driver

To get data from HubSpot, you first need to create a Data Source Name (DSN) using the ODBC Data Source Administrator.

1. Start the ODBC Data Source Administrator. You can find ODBC Data Sources under Control Panel > Administrative Tools; in previous versions of Windows, the icon was called Data Sources (ODBC). Pick the administrator whose bitness matches the external program (32-bit or 64-bit). On 64-bit Windows, the 64-bit administrator is C:\Windows\System32\odbcad32.exe, and the 32-bit administrator is C:\Windows\SysWOW64\odbcad32.exe.
2. Decide between the User DSN and the System DSN options. Most software supports both DSN types, though there are a few exceptions.
3. Press the Add button. The Create New Data Source window will appear.
4. Select the Devart ODBC Driver for HubSpot and click Finish. The driver configuration dialog will open. 
5. Enter your network connection settings into the corresponding fields.
6. To check whether your connection is successful, press the Test Connection button.
7. Click OK to save the DSN.

How to Generate an ODBC Trace Log

Tracing is synchronized between the 64-bit and 32-bit ODBC Administrators: enabling or disabling it in one affects the other. If the ODBC client program you want to monitor runs under the Local System account or a user login other than yours, enable Machine-Wide tracing for all user identities.

ODBC Trace Log Generation on Windows

Follow these steps to create a trace file on Windows using the ODBC Data Source Administrator.

1. In Windows 10, search for ODBC Data Sources; in older versions of Windows, go to Control Panel > Administrative Tools and choose the application with the appropriate bitness.
2. Choose the Tracing tab.
3. If required, modify the location of the log files in the Log File Path field and hit the Apply button.
4. Click the Start Tracing Now button.
5. Relaunch all affected programs.
6. To verify the driver connectivity, go to the DSN settings and click the Test Connection button.
7. Reproduce the issue.
8. On the Tracing tab, click the Stop Tracing Now button.
9. Provide the collected log file to us (for example, devart.log).

ODBC Trace Log Generation on macOS

On macOS, open ODBC Administrator and go to the Tracing tab to activate the trace feature.

1. Open ODBC Administrator.
2. Click on the Tracing tab.
3. If required, modify the location of the log file.
4. In the When to trace option, choose All the time.

ODBC Trace Log Generation on Linux

On Linux, to trace ODBC calls, set the Trace and TraceFile keyword/value pairs in the [ODBC] section of the /etc/odbcinst.ini file:

```
[ODBC]
Trace=Yes
TraceFile=/home/test/devart.log
```

After acquiring a log file, you should turn off logging, since it slows down read and write operations. 
Using ODBC Driver for HubSpot with Third-Party Tools

ODBC Driver for HubSpot can work with a wide range of ODBC-compliant tools:

- DBeaver
- Oracle Database Link
- Microsoft Access
- Microsoft Excel
- OpenOffice and LibreOffice
- PHP
- Power BI
- Python
- QlikView
- SQL Server Management Studio
- SSIS
- Tableau

You can find detailed instructions on how to connect ODBC-compliant tools to HubSpot via the ODBC driver [here](https://docs.devart.com/odbc/hubspot/using_in_third_party_tools.htm).

Connecting to HubSpot with ODBC Driver in Power BI

Power BI is a popular business intelligence solution consisting of services, applications, and connectors that enable you to gather raw data from diverse sources and produce relevant insights. With our ODBC driver, you can easily link Power BI with the HubSpot data source. Let's look at how it works. A DSN for the ODBC Driver for HubSpot is expected to be already installed and set up.

1. Start Power BI Desktop, then choose Get Data.
2. In the Get Data dialog box, navigate to the Other section and choose ODBC. Then click Connect to confirm the choice.
3. Expand the Data Source Name (DSN) drop-down menu in the From ODBC dialog box, then choose the DSN you set up for HubSpot earlier.
4. If you wish to enter a SQL statement to narrow down the returned results, click the Advanced options arrow, which expands the dialog box, and type or paste your SQL statement.
5. Hit the OK button. If your data source requires authentication, Power BI will request the appropriate credentials. Fill in your Username and Password, then press Connect.
6. Now you should see the data structures in your data source. Clicking on a database item will provide a summary of its contents.
7. Select the table you need to import into Power BI from HubSpot, and click Load. 
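Python appears in the list of supported tools above. As a minimal sketch, the same DSN can be queried via pyodbc; the DSN name follows the earlier setup steps, while the Contact table name is an assumption for illustration:

```python
# Minimal sketch of querying the HubSpot DSN from Python via pyodbc.
# Assumptions: a DSN named "Devart ODBC HubSpot" exists (created as shown
# above), pyodbc is installed, and "Contact" is an object exposed as a table.

def hubspot_conn_str(dsn: str = "Devart ODBC HubSpot") -> str:
    """Build an ODBC connection string that refers to an existing DSN."""
    return f"DSN={dsn}"

def fetch_rows(sql: str = "SELECT * FROM Contact", limit: int = 10):
    import pyodbc  # pip install pyodbc; imported lazily inside the function
    with pyodbc.connect(hubspot_conn_str()) as conn:
        cur = conn.cursor()
        cur.execute(sql)
        return cur.fetchmany(limit)
```

Because the driver exposes HubSpot objects as SQL tables, any tool that can open an ODBC DSN, including a short script like this, can reuse the same connection you configured for Power BI or Tableau.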
Connecting to HubSpot with ODBC Driver in Tableau

You can import raw data, run analyses, and generate insightful reports using Tableau, a data visualization tool. With Tableau Desktop and our ODBC drivers, you can link to various cloud and on-premise relational and non-relational databases.

1. Launch Tableau Desktop.
2. On the homepage, click More in the Connect pane.
3. Select Other Databases (ODBC).
4. Expand the DSN drop-down list and select the DSN you set up for HubSpot. Alternatively, you may pick the Driver option and select the Devart ODBC Driver for HubSpot from the drop-down menu.
5. Hit the Connect button.
6. Click Sign In once your connection has been established successfully.
7. Choose the appropriate HubSpot database and database schema. Tables in the linked data source will be shown here.
8. Drag the necessary tables into the working area to retrieve the data, or click the corresponding button to create a new custom SQL query.
9. To see the most recent information, choose Update Now.

Connecting to HubSpot with ODBC Driver in Excel

Through an ODBC connection, you may import HubSpot data into a Microsoft Excel workbook and format it into a table. Use the version of Excel with the same bitness as the ODBC driver you're working with; for example, if you installed a 64-bit ODBC driver, you should use the 64-bit version of Excel. 
Using ODBC drivers, you may access data from various sources inside Microsoft Excel:

- Connecting Excel to HubSpot using Get & Transform (Power Query)
- Connecting Excel to HubSpot with the Data Connection Wizard (Legacy Wizard)
- Using Excel's Query Wizard to connect to HubSpot
- Connecting Excel to HubSpot with Microsoft Query
- Using PowerPivot to link Excel and HubSpot

Connecting Excel to HubSpot Using Get & Transform (Power Query)

Connecting Excel to HubSpot through ODBC is possible using Get & Transform (Power Query). The use of an ODBC driver for HubSpot is assumed here.

1. In Excel, click Data, select Get Data from the drop-down menu, then select From ODBC under From Other Sources.
2. Pick your data source (DSN) in the From ODBC dialog. If you haven't defined one, you may enter the connection string for your data source (without credentials, which are defined in the credentials dialog box in the next step). If you choose, you may also provide a SQL query that will run once the connection has been made. Hit the OK button.
3. If your database requires credentials, select Database, enter your username and password, and click Connect.
4. If your database is not password-protected or you have already specified your credentials in the ODBC data source settings, select Default or Custom and hit Connect.
5. In the resulting box, select the table from which you wish to load data and click Load.
6. The table's data will open in an Excel spreadsheet.

Connecting Excel to HubSpot with Data Connection Wizard (Legacy Wizard)

This option allows you to access a predefined OLE DB or ODBC external data source.

1. In Excel, click the Data tab. Select Other Sources from the menu, then use the Data Connection Wizard to import your data.
2. In the new window, choose ODBC DSN and click Next to proceed.
3. Choose the data source you want to link to and click Next.
4. 
When you've located the table holding the necessary data, click its name and then Next to fill out the details, or Finish to save your changes.
5. In the Import Data dialog, choose how you want your data displayed in Excel and where you want it placed in the worksheet, then click OK.
6. The necessary information has been added to the current Excel sheet.

Using Excel Query Wizard to Connect to HubSpot

You may use this option to write a basic query for getting data from HubSpot to Excel through the ODBC driver.

1. Click the Data button in Excel's main menu to get started.
2. Open the From Other Sources drop-down and choose From Microsoft Query from the list of available sources.
3. Once the dialog box is displayed, choose the desired data source.
4. After establishing a connection, choose the information you'd like to see in Excel and go to the next step.
5. Data filtering and sorting are the subsequent two steps. To go on without completing them, click Next. The Save option to the right allows you to save the query for later use.
7. Click Return Data To Microsoft Excel and hit Finish.
8. In the Import Data window, select how you want your data displayed in Excel and where you want it placed in the worksheet, then click OK.
9. Excel now has the necessary information.

Connecting to HubSpot in Excel with Microsoft Query

This feature allows you to construct a more complex query for exporting data from HubSpot to Excel using the ODBC driver.

1. Launch Excel and go to the Data menu.
2. Choose From Other Sources and then From Microsoft Query on the new ribbon.
3. In the following window, select the data source you want to link to (e.g., using the data source name, Devart ODBC HubSpot). Uncheck Use the Query Wizard to create/edit queries and choose OK.
4. You may now choose which tables to include in your query. When you're ready, select the plus sign.
5. 
Using the visual editor, you may perform various data manipulation tasks, such as filtering rows or columns, sorting data, joining tables, creating a parameter query, etc. Connecting Excel to HubSpot Using PowerPivot PowerPivot is an Excel add-in that can be used to conduct in-depth analyses of data and build sophisticated data models. Follow these steps to begin importing the necessary information: 1. Navigate to the PowerPivot window by clicking the PowerPivot tab in Excel and selecting Manage. 2. Select From Other Sources in the new window that opens. 3. In the Table Import Wizard, select Others (OLEDB/ODBC) and click Next. 4. Click the Build button within the Specify a Connection String box. 5. In the Data Link Properties pane, select the desired data source (for instance, Devart ODBC HubSpot), and then click Next. 6. Now select a method of data importation (either select a table from the list or write a query to specify the data to be imported). 7. If the import was successful, you may exit the window by selecting Close. The retrieved data is added to the current worksheet.
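Step 2 of the Power Query walkthrough above lets you type a raw ODBC connection string instead of picking a DSN. As an illustration, here is a small Python sketch that assembles such a string; the DSN and driver names below are placeholders, not the exact names your installation will use:

```python
def build_odbc_conn_str(dsn=None, driver=None, **params):
    """Assemble an ODBC connection string from keyword parameters.

    Either a DSN name or a driver name is given; credentials are
    deliberately left out, since Power Query asks for them separately.
    """
    parts = []
    if dsn:
        parts.append(f"DSN={dsn}")
    elif driver:
        # Driver names are wrapped in braces in ODBC connection strings
        parts.append(f"Driver={{{driver}}}")
    parts += [f"{key}={value}" for key, value in params.items()]
    return ";".join(parts)

# Hypothetical DSN and driver names, for illustration only
print(build_odbc_conn_str(dsn="Devart ODBC HubSpot"))
print(build_odbc_conn_str(driver="Devart ODBC Driver for HubSpot"))
```

The same string can then be pasted into the From ODBC dialog or passed to any ODBC-aware library.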
Tags [data import](https://blog.devart.com/tag/data-import) [odbc](https://blog.devart.com/tag/odbc) [odbc driver for hubspot](https://blog.devart.com/tag/odbc-driver-for-hubspot) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/how-to-access-snowflake-using-odbc-driver-by-devart.html", "product_name": "Unknown",
"content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) How to Access Snowflake Data Warehouse Easily from Power BI or Tableau using ODBC Driver for Snowflake By [DAC Team](https://blog.devart.com/author/dac) October 20, 2022 [0](https://blog.devart.com/how-to-access-snowflake-using-odbc-driver-by-devart.html#respond) 2884 This article explains how to connect to Snowflake using the ODBC Driver for Snowflake and how to create a trace log on Windows, macOS, and Linux. You will also learn how to access and view Snowflake data from third-party tools, using the BI and analytics applications Power BI and Tableau as examples. Devart ODBC Driver for Snowflake is a high-performance solution with enterprise-level functionality for connecting to Snowflake easily. Using the driver, you can connect to Snowflake quickly from any ODBC-compliant analytics, reporting, BI, or ETL application, on both 32-bit and 64-bit Windows. The driver fully supports standard ODBC API methods and data types and offers safe, simple access to real-time Snowflake data from any location. Let's first take a look at the driver's features. Key Features Connection to Snowflake ODBC-enabled programs may establish secure connections to Snowflake over the Internet using our connectivity solution. Connecting through a proxy server is available as an alternative to connecting directly to Snowflake. Extended SQL Syntax Using our ODBC driver, you may integrate Snowflake objects into your data infrastructure like regular SQL tables. Thanks to the extended SQL syntax, SQL-92-compatible SELECT queries can use all the advantages of SQL: complex JOINs; WHERE conditions; subqueries; GROUP BY statements; aggregate functions; ORDER BY statements; and more.
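To illustrate the SQL-92 constructs listed above, here is a self-contained sketch that runs the same shape of SELECT (JOIN, WHERE, GROUP BY, an aggregate, ORDER BY) against an in-memory SQLite database as a stand-in; against Snowflake, the driver accepts the same syntax:

```python
import sqlite3

# In-memory SQLite stands in for Snowflake; the SELECT below uses the
# SQL-92 features the driver's extended syntax supports.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (1, 10.0), (1, 15.0), (2, 7.5);
""")
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS n, SUM(o.amount) AS total
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    WHERE o.amount > 5
    GROUP BY c.region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('EU', 2, 25.0), ('US', 1, 7.5)]
```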
Bulk Updates The driver enables bulk updates to Snowflake by combining SQL queries into batches, which streamlines and speeds up updating massive amounts of data in Snowflake. Compliance with ODBC The driver fully supports the standard ODBC interface: ODBC API functions, ODBC data types, and Advanced Connection String options. This means Snowflake may be accessed from any ODBC-compatible environment, including desktop and web applications. Snowflake Compatibility The ODBC driver supports all of the data formats provided by the Snowflake API and works well with the official Snowflake API. Superior Data Conversion Our sophisticated data conversion methods allow reversible mapping between any Snowflake and ODBC data types. Integration The driver works with third-party data analysis tools such as Microsoft Excel, as well as with Visual Studio and other integrated development environments (IDEs). Refer to the [Compatibility](https://docs.devart.com/odbc/snowflake/compatibility.htm) page for a comprehensive rundown of supported software. Support for 32-bit and 64-bit Systems There is no need for additional setup of the driver, programs, or environment to use the Devart ODBC Driver for Snowflake with either 32-bit or 64-bit applications on x32 or x64 systems. Fully Unicode Driver The ODBC driver for Snowflake is fully Unicode: you can accurately extract and work with data from multilingual Snowflake databases, regardless of the charset used (Latin, Cyrillic, Hebrew, Chinese, etc.) or the localization of your working environment. Exceptional Performance Our driver's features, which include local data caching, connection pooling, query optimization, and more, significantly speed up any operation against Snowflake.
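The bulk-update idea described above (combining many statements into one batch) is the same pattern as DB-API executemany; a sketch with SQLite standing in for Snowflake:

```python
import sqlite3

# One batched call replaces many per-row round trips, which is the
# effect the driver's bulk-update mode achieves against Snowflake.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (sku TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("a", 1.0), ("b", 2.0), ("c", 3.0)])
# Three updates submitted as one batch instead of three separate calls
conn.executemany("UPDATE prices SET price = ? WHERE sku = ?",
                 [(1.5, "a"), (2.5, "b"), (3.5, "c")])
total = conn.execute("SELECT SUM(price) FROM prices").fetchone()[0]
print(total)  # 7.5
```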
Support For immediate assistance from trained experts, speedy issue resolution, and nightly builds with hotfixes, please visit the [Support](https://www.devart.com/odbc/snowflake/support.html) page. Connecting to Snowflake with the ODBC Driver After the driver has been installed, a Data Source Name (DSN) should be set up in the ODBC Data Source Administrator for use with Snowflake. 1. Open the ODBC Data Source Administrator. Note that before Windows 8, this icon was called Data Sources (ODBC). In Windows, type ODBC Data Sources in the search box, then choose the program whose bitness (32-bit or 64-bit) matches the application that will use the driver. ODBC Data Sources may also be accessed under Administrative Tools in the Control Panel. Alternatively, you can create a 64-bit DSN by running C:\Windows\System32\odbcad32.exe, or a 32-bit DSN by running C:\Windows\SysWOW64\odbcad32.exe. 2. Choose either the User DSN or the System DSN tab. While both kinds of DSNs are generally compatible, some programs may only work with one of them. 3. Click the Add button. A window prompting you to Create a New Data Source appears. 4. Choose the Devart ODBC Driver for Snowflake and press Finish. The Driver Setup dialog window will pop up. 5. To connect to Snowflake, fill in the fields with the relevant data and click OK. Making an ODBC Trace Log on Windows Tracing is synchronized across the 32-bit and 64-bit ODBC Administrators, so turning it on or off in one does the same in the other. Select Machine-Wide tracing for all user identities if the ODBC client program you want to monitor runs under the Local System account or under a user login other than your own; for SSMS, this may be required. To create a trace file using the Windows ODBC Data Source Administrator, follow the steps below. 1.
In Windows 10, type ODBC Data Sources into the search box (in older versions of Windows, choose Control Panel > Administrative Tools) and select the version with the appropriate bitness for your application. 2. Go to the Tracing tab. 3. Change the location of the log files if required. Verify that the application has write access to the selected path, then click Apply. 4. Press the Start Tracing Now button. 5. Restart all programs involved. 6. To verify the driver's connectivity, go to the DSN settings and click the Test Connection button. 7. Try to reproduce the issue. 8. On the Tracing tab, choose Stop Tracing Now. 9. Provide the collected log (for example, devart.log). Making an ODBC Trace Log on macOS Use the ODBC Administrator's Tracing tab to activate tracing on macOS. 1. Open the ODBC Administrator. 2. Go to the Tracing tab. 3. Modify the location of the log file if required. 4. In the When to trace drop-down, choose Always. Creating an ODBC Trace Log on Linux On Linux, you can trace ODBC calls by setting the Trace and TraceFile keyword/value pairs in the [ODBC] section of the /etc/odbcinst.ini file, as shown below:

```
[ODBC]
Trace=Yes
TraceFile=/home/test/devart.log
```

After collecting a log file, you should deactivate logging, since it slows down read/write operations. Using Third-Party Tools with ODBC Driver for Snowflake ODBC Driver for Snowflake works with the following tools: DBeaver, Oracle Database Link, Microsoft Access, Microsoft Excel, OpenOffice and LibreOffice, PHP, Power BI, Python, QlikView, SQL Server Management Studio, SSIS, and Tableau. For a detailed description of every tool, please follow this [link](https://docs.devart.com/odbc/snowflake/using_in_third_party_tools.htm) . Connecting Power BI to Snowflake using ODBC Driver Power BI is a popular solution that includes services, applications, and connectors for aggregating data from diverse sources and generating insights.
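The /etc/odbcinst.ini tracing settings shown in the Linux section above can also be toggled from a script. A sketch using Python's configparser; the file contents and log path here are examples only:

```python
import configparser
import io

# Example odbcinst.ini content; in practice this would be read from
# /etc/odbcinst.ini (root privileges are usually needed to write it).
INI_TEXT = """\
[ODBC]
Trace=No
TraceFile=/tmp/odbc.log
"""

def set_tracing(ini_text, enabled, log_path):
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # preserve key case, as unixODBC expects
    cfg.read_string(ini_text)
    cfg["ODBC"]["Trace"] = "Yes" if enabled else "No"
    cfg["ODBC"]["TraceFile"] = log_path
    out = io.StringIO()
    cfg.write(out, space_around_delimiters=False)
    return out.getvalue()

updated = set_tracing(INI_TEXT, True, "/home/test/devart.log")
print("Trace=Yes" in updated)  # True
```

Remember to flip Trace back to No once the log is collected, as the article advises.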
A compatible ODBC driver allows Power BI to communicate with various databases. This guide shows how to use an ODBC driver to connect to Snowflake and get your data into Power BI Desktop; it assumes you already have a Snowflake DSN set up for the ODBC driver. 1. Start Power BI Desktop and select Get Data . 2. In the Get Data dialog box, choose the Other tab, select ODBC, and confirm your selection by clicking the Connect button. 3. In the From ODBC dialog box, expand the drop-down menu and choose your Snowflake DSN from the preexisting DSN list. 4. Select the Advanced options arrow to expand the dialog box, where you may write or paste a SQL query to filter the results. 5. If the connection to your data source requires authentication, Power BI will ask you for those details. Fill in your Username and Password, then press the OK button. 6. Your data source's underlying structures should now be visible; clicking on a database object will show a preview of its contents. 7. Select the table you'll use for analysis, and then click Load to import the Snowflake data into Power BI. Using an ODBC Connection to Bring Snowflake Data into Tableau Tableau is a data visualization application that also allows importing raw data, conducting analyses, and generating reports. You can connect to a wide variety of cloud and on-premises relational and non-relational databases using Tableau Desktop and our ODBC drivers. Let's see how to connect Tableau Desktop to Snowflake using the ODBC driver for Snowflake. 1. Start Tableau Desktop. 2. Select More… in the Connect section of the home page to access additional options. 3. Pick Other Databases (ODBC) . 4. Select the Snowflake-specific DSN you set up earlier from the resulting drop-down menu.
Instead of creating a DSN, you may pick Driver and then select the Devart ODBC Driver for Snowflake from the drop-down menu. 5. Click the Connect button. 6. When your connection has been established successfully, choose Sign In . 7. Choose the appropriate Snowflake database and schema. 8. The connected data source should list all tables to which you have access. 9. Type a table's name into the corresponding field and drag tables to the canvas to access their contents, or use the New Custom SQL option to craft a query that retrieves only the necessary data. 10. To see the most recent information, choose Update Now . Integrating the ODBC Driver for Snowflake into an SSIS Data Flow Data migration is only one of the many tasks that SQL Server Integration Services (SSIS) can perform. The Devart ODBC Driver for Snowflake communicates with SSIS through Microsoft ODBC 3.x, which acts as a translation layer between the data source and SSIS. A problem may arise if you use the SQLExecDirect function to extract data from an ODBC data source, since SSIS expects ODBC 2.x behavior while the driver retrieves data using ODBC 3.x. To ensure that SQLExecDirect works correctly, open the DSN settings, go to the Advanced Settings tab, and pick Ver 2.x from the ODBC Behavior drop-down.
Tags [odbc](https://blog.devart.com/tag/odbc) [sql operators](https://blog.devart.com/tag/sql-operators) [SSIS](https://blog.devart.com/tag/ssis) [SSIS Data Flow](https://blog.devart.com/tag/ssis-data-flow) [ssms](https://blog.devart.com/tag/ssms) [DAC Team](https://blog.devart.com/author/dac) "} {"url": "https://blog.devart.com/how-to-activate-and-use-sql-complete-express.html", "product_name":
"Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to activate and use SQL Complete Express By [dbForge Team](https://blog.devart.com/author/dbforge) April 24, 2019 [0](https://blog.devart.com/how-to-activate-and-use-sql-complete-express.html#respond) 8439 [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) is a multifunctional add-in for SSMS that greatly simplifies development and administration. The tool is available in two editions: Standard and Express. The free Express edition becomes available after the 14-day Standard trial. Let's see what you need to do to start using dbForge SQL Complete Express. Installation We provide the opportunity to use the fully functional version of SQL Complete absolutely for free during the trial period. Therefore, you start with the trial and choose the option of switching to the free Express edition afterward.
To do this, go to our website and [create a personal account](https://id.devart.com/Account/Login?ReturnUrl=%2Fconnect%2Fauthorize%2Fcallback%3Fclient_id%3Dwebsite%26redirect_uri%3Dhttps%253A%252F%252Fwww.devart.com%252Fsignin-oidc%26response_type%3Dcode%2520id_token%26scope%3Dopenid%2520profile%2520devart-ordering-api%2520offline_access%26response_mode%3Dform_post%26nonce%3D636915227469158555.ZDRjYTExZDYtNWMwMy00MDhhLTg0NTMtZGIyMzc5MTM0ZjFjOTdmOTI2YjAtOWUzYy00ZjhkLTk1NTYtZjZlODZmMzE2YjE1%26acr_values%3Dsection%253Alogin%26state%3DCfDJ8Ma2WGTzuPtNkknzlAtRs1RikBgr083Vbw8rX3DZ1l3XzVhRpVhD8BpPosxY7v6Ids7IDpwapd0SMRD1ho2xBHZJaeYXaCAhdAkfBKo7dZ3-ejA_U_o3Kk3tUqd0bq5CXk_MMDz5sbd6_lGsIoAAfEY8OIb0veGdM1-KdajtLBa2VZig7mLlH9b8rx-VJ_az6LRSIHL7uKG1QsjhaCmXrUHZXtgq3PoNePfnZpPuMAmhCO-HiCaUEsu-5rXsWPcvXYtx1sHttRs7_WStgwIyvqtUbK-RjaFVLJQliv_ezWHOgZnScR_UCsbgNeQ8nVl961egW1BDkyhI95BiKYMqil1xLM6AZJdAbHE9KhqRr2ud%26x-client-SKU%3DID_NETSTANDARD2_0%26x-client-ver%3D5.3.0.0) via the registration form. After that, you will be able to download and use all our products without restrictions. Go to the [SQL Complete download page](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and choose the Standard Edition trial, as shown below: The installation process is intuitive and does not take much time. Once the add-in is installed, all its functionality is automatically added to the IDE and ready to use. Using the trial At the first launch after installation, you will see a notification window informing you that the trial period has started. Each day, the notification will update so you can track the remaining time. During the trial period, you can use the whole feature pack of SQL Complete Standard without any restrictions. Customize profiles, optimize development, and automate processes: the product will greatly facilitate all these tasks and save you enough time for an extra cup of coffee.
You can learn more about SQL Complete on our [Documentation page](https://docs.devart.com/sqlcomplete/) . Switching to Express Edition When the trial period expires, you will receive a notification that the full functionality is no longer available. After that, you need to either purchase a license or start using the free version. The Express edition has restricted functionality and provides fewer capabilities compared to Standard. More details about the differences between the editions can be found on the [Editions page](https://www.devart.com/dbforge/sql/sqlcomplete/editions.html) . If you prefer to use the Express edition, click the corresponding button in the notification window after the SSMS launch. The add-in will then reconfigure itself and be ready for use. Note that the notification window will not appear again after switching to the Express edition. If you later want to return to the Standard edition and purchase a license key, you will need to activate it manually. To do that, click the SQL Complete tab inside the IDE, select Help, and click Activate Product . In the opened dialog window, paste the purchased license key and click Activate .
Tags [sql complete](https://blog.devart.com/tag/sql-complete) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-address-column-value-dependencies.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Fixing Dependent Column Issues With dbForge Data Generator By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) September 3,
2024 [0](https://blog.devart.com/how-to-address-column-value-dependencies.html#respond) 918 Users should maintain data consistency to ensure accurate database testing and analysis, especially when generating data in dependent columns. For example, they can use formulas, data generators, or post-scripts to fill in columns based on other columns in the table. However, post-scripts may be complicated and can affect database performance. In this article, we'll explore how to generate values for referenced columns using dbForge Data Generator for SQL Server. We'll also discuss common troubleshooting issues and ways to resolve them. dbForge Data Generator is a tool that populates tables with realistic random data using built-in generators such as Python, RegExp, Lorem Ipsum, Text File, Files Folder, Weighted List, and others. It is part of dbForge SQL Tools, a suite that simplifies complex database tasks and fosters a more agile development environment through intuitive interfaces and advanced functionality. Contents Understanding the requirements Offering a solution using dbForge Data Generator for SQL Server Setting up the Data Generator Using Python for dependent column calculation Addressing NULL values in generated data Analyzing server-specific considerations Troubleshooting common issues Understanding the requirements For demonstration purposes, we create a sample database – Tax – and an empty table – dbo.PurchaseData – to populate with testing data. The table will contain the PurchasePrice and TaxAmount columns.
```sql
-- Create the database
CREATE DATABASE Tax;

-- Create the PurchaseData table in the dbo schema
CREATE TABLE dbo.PurchaseData (
    PurchaseID INT PRIMARY KEY IDENTITY(1,1),
    PurchasePrice MONEY NOT NULL,
    TaxAmount MONEY NOT NULL
);
```

Here is what we'll do to manage dependencies between columns using the Python generator available in the dbForge tool:

- Generate random values for the PurchasePrice column
- Calculate TaxAmount by multiplying PurchasePrice by 0.15
- Handle NULL values in the generated columns

Offering a solution using dbForge Data Generator for SQL Server Let us learn how to generate the referenced column values using [Data Generator for SQL Server](https://www.devart.com/dbforge/sql/data-generator/) . The workflow is as follows: setting up the Data Generator; using Python for dependent column calculation; addressing NULL values in generated data; analyzing server-specific considerations. Setting up the Data Generator To begin, [download](https://www.devart.com/dbforge/sql/data-generator/download.html) dbForge Data Generator from the Devart website and [install](https://docs.devart.com/data-generator-for-sql-server/getting-started/installing.html) it on your computer. Once done, open the tool and initiate data generation in one of the following ways: on the Start Page, click New Data Generation ; or on the standard toolbar, click New Data Generation . This opens the Data Generator Project Properties dialog, where you can set up generation options. On the Connection page, select the required connection and the database containing the table you want to populate. Note that if you have not established the server connection yet, you can create a new one by selecting Manage from the Connection list. On the Options page, set default data generation options and click Open . The Data Generation document appears, displaying the settings you can configure for the table.
Using Python for dependent column calculation In the Tables and columns to populate tree, select the TaxAmount column. In the Column generation settings for TaxAmount section, select Python from the Generator list. Clear the default text and enter the following expression: round(PurchasePrice * 0.15, 2) This expression calculates 15% of PurchasePrice , and the round function rounds the result to two decimal places (the 2 argument), which suits a monetary amount. The TaxAmount column will be automatically re-calculated. At the top of the document, click Populate data to the target database . This opens the Data Population Wizard , where you need to select the Execute the data population script against the database output option and click Generate . Then, click New SQL on the toolbar and execute a SELECT query to verify that the data has been generated. Done! Let us now consider some cases where you may encounter errors when generating data for SQL tables. Addressing NULL values in generated data If there is a NULL in the referenced columns, you may get an error. To prevent errors or incorrect results, add the following part to your Python code: if column_name is not None else NULL where column_name is the name of the column on which the result is based. If we rewrite the previously used code, we get the following result: Analyzing server-specific considerations Some servers require inserting 0 instead of NULL in the generated column; otherwise, an error may arise. In this case, we recommend modifying the Python code as follows: if column_name is not DBNull.Value else 0 In dbForge Data Generator, you must also clear the Include NULL values checkbox to turn off the generation of NULL values.
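The generator expressions above are ordinary Python, so their NULL-handling behavior can be checked outside the tool. A sketch, with Python's None standing in for NULL and the article's 15% rate:

```python
# Mirrors the generator expression round(PurchasePrice * 0.15, 2)
# with the conditional NULL guard discussed above.
def tax_amount(purchase_price, rate=0.15):
    # None (i.e. NULL) propagates instead of raising a TypeError
    return round(purchase_price * rate, 2) if purchase_price is not None else None

print(tax_amount(100.0))  # 15.0
print(tax_amount(None))   # None
```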
So, let us update the Python code in our example: Troubleshooting common issues When generating data using a Python generator or any automated tool, you might face issues that prevent the process from working as expected. Let us consider some tips to help you identify and solve these issues for a successful data generation process: What to do if the proposed solution does not work Review error messages and logs produced by the Python script. They may contain information about syntax errors, data type mismatches, or constraint violations that caused unexpected behavior. Ensure that your Python script can successfully connect to the database. Issues such as incorrect credentials, firewall restrictions, or network problems can impact data generation. Run the data generation process on a small subset of data to see if the issue is related to data volume or complexity. Checking the DDL of the table for column constraints Use SQL commands or a database management tool to review the DDL of the target table. For example, search for constraints such as NOT NULL , UNIQUE , CHECK , or FOREIGN KEY that might cause data generation to fail. Ensure that the data types used in your Python script match the column definitions in the table. Review default values and triggers to verify that they do not interfere with the data you insert. Ensuring that the database structure supports the desired data generation Before starting data generation, validate that the database schema is correctly set up to support the data you want to generate. Check that primary keys and unique indexes are correctly defined to avoid duplicate records; otherwise, the data generation process will fail. Update your data generation settings to include correct foreign key values. Consider limitations such as maximum table size, row limits, or available storage space on the server if the table and database cannot handle the volume of data you want to generate. Optimize the database for generating large datasets.
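The advice above about checking the table's DDL for constraints can be scripted. A sketch using SQLite's PRAGMA table_info as a stand-in; on SQL Server, the equivalent check would query INFORMATION_SCHEMA.COLUMNS:

```python
import sqlite3

# List columns carrying NOT NULL constraints before generating data,
# so the generator configuration can be checked against them.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE PurchaseData (
        PurchaseID INTEGER PRIMARY KEY,
        PurchasePrice REAL NOT NULL,
        TaxAmount REAL NOT NULL,
        Note TEXT
    )
""")
# table_info rows are (cid, name, type, notnull, dflt_value, pk);
# index 3 is the NOT NULL flag, index 1 is the column name
not_null = [row[1] for row in conn.execute("PRAGMA table_info(PurchaseData)")
            if row[3]]
print(sorted(not_null))
```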
Conclusion In summary, we have explored how easy it is to generate referenced columns with NULL values using dbForge Data Generator. We have also examined some tips to help you ensure error-free data generation. [Download dbForge SQL tools](https://www.devart.com/dbforge/sql/sql-tools/) and experience how their advanced capabilities can enhance your productivity and database development and management. Tags [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys making complex tech useful. "} {"url": "https://blog.devart.com/how-to-automate-database-schema-changes-tracking-with-powershell-scripts.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Automatically Track Database Schema Changes With PowerShell Scripts By [dbForge Team](https://blog.devart.com/author/dbforge) November 11, 2021 [0](https://blog.devart.com/how-to-automate-database-schema-changes-tracking-with-powershell-scripts.html#respond) 3948 In this article, you will find a detailed guide on how to automatically monitor a SQL Server database for schema changes and automatically create a report and log file. Example scripts to set up the process are provided. Automating database schema change tracking increases the speed of application releases and brings a certain level of agility to an organization's database development routine. This practice makes life easier for database developers and database administrators (DBAs) and improves their performance. Database change management is not easy. A database is not a simple collection of files or code fragments, and you cannot simply roll back to a previous version to correct a mistake, as in application development. Database changes are definitely the riskiest part of any application update, and as a database grows, DBAs keep a close eye on it to avoid unexpected issues. That is why it is so important to automate database schema change monitoring: it saves a database team loads of time and energy. In a multi-user environment, to keep the risks to a minimum, it is vitally important to regularly monitor changes made to the database schemas.
Can the process be automated? How to set up automatic database schema changes tracking Prerequisites To monitor a SQL Server database for schema changes, we will use the [dbForge Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) tool that comes as part of the [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle, a custom configuration file, and PowerShell scripts. By the way, to learn how to use dbForge Schema Compare for creating a database during the CI process, feel free to watch [this video](https://youtu.be/hllTzoXvoO8). The logic of the process Step 1. A custom PowerShell script initiates dbForge Schema Compare, [creates a database snapshot](https://docs.devart.com/schema-compare-for-sql-server/working-with-other-data-sources/working-with-snapshots.html), and puts it into the D:\Monitor\Baseline folder. Step 2. After a specified time interval, another PowerShell script launches and initiates the schema comparison between the current database and the snapshot. Step 3. If schema differences are detected, a new snapshot is created with the current date and time and placed into the D:\Monitor\DiffSnapshots folder, a report is created and placed into the D:\Monitor\DiffReports folder, and a log file with the text "The databases are different" is created in the D:\Monitor\Logs folder. If no schema differences are found, exit code 100 is returned; accordingly, no new snapshot or report is generated, and only a log file with the text "The databases are identical" is created in the D:\Monitor\Logs folder, so that the DBA can be sure the process has run. Please see the picture below to better understand the logic behind the process.
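The three-step logic above can be sketched in a few lines. This is an illustrative Python sketch, not the article's PowerShell: `compare()` is a stub standing in for the schemacompare.com call, and only the folder names (Baseline, DiffSnapshots, DiffReports, Logs) and the exit-code convention (100 = identical) come from the article.

```python
import os
from datetime import datetime

def monitor(root, compare):
    """compare() returns a schema-compare-style exit code: 100 = identical."""
    for sub in ("Baseline", "DiffSnapshots", "DiffReports", "Logs"):
        os.makedirs(os.path.join(root, sub), exist_ok=True)
    stamp = datetime.now().strftime("%d-%m-%Y_%H_%M_%S")
    if compare() == 100:
        message = "The databases are identical"
    else:
        message = "The databases are different"
        # a real run would also store a new snapshot and an HTML diff report
        open(os.path.join(root, "DiffSnapshots", stamp + ".snap"), "w").close()
    # either way, a log file is written so the DBA knows the check ran
    with open(os.path.join(root, "Logs", stamp + ".txt"), "w") as f:
        f.write(message)
    return message
```

The key design point is that a log file is produced on every run, so silence means the job did not run, not that nothing changed.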
Implementation Create the configuration file In the configuration file D:\Monitor\Databases.txt, you need to specify the server name, the database name, the authentication type, and the login and password for generating a snapshot, as well as the target database to compare it with. For example:

demo-mssql\SQLEXPRESS,AdventureWorks2022_P1,,login,password

Our configuration file looks as below, since we are using Windows Authentication to connect to SQL Server.

demo-mssql\SQLEXPRESS,AdventureWorks2022_P1,true,,

Create a custom PowerShell script to create a database snapshot

#region Variables

$rootFolder = "D:\Monitor"
$databasesTxtPath = "D:\Monitor\Databases.txt"
# Declare the $diffToolLocation variable for dbForge Schema Compare for SQL Server
$diffToolLocation = "C:\Program Files\Devart\dbForge SQL Tools Professional\dbForge Schema Compare for SQL Server\schemacompare.com"
# Declare the $diffToolLocation variable for dbForge Studio for SQL Server
#$diffToolLocation = "C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"

#endregion

foreach ($line in [System.IO.File]::ReadAllLines($databasesTxtPath)) {
    # Read the connection parameters for the current database from the configuration file
    $server = ($line -split ",")[0]
    $database = ($line -split ",")[1]
    $isWindowsAuthentication = ($line -split ",")[2]
    $userName = ($line -split ",")[3]
    $password = ($line -split ",")[4]

    $BaselineLocation = New-Item -ItemType Directory -Force -Path ($rootFolder + "\" + "BaseLine")

    $srvCleanName = ($server -replace "\\", "")
    $currentSnapshotFile = Join-Path $BaselineLocation "$srvCleanName.$database.snap"

    # Build the database connection string
    if ($isWindowsAuthentication -eq 'True') {
        $connectionString = "Server=$server;Database=$database;Integrated Security=True;"
    }
    else {
        $connectionString = "Server=$server;Database=$database;User ID=$userName;Password=$password;"
    }

    # Test the database connection
    Write-Host "Testing the database connection..."
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    try {
        $connection.Open()
        Write-Host "Connection successful"
    }
    catch { Write-Host "Connection failed: $($_.Exception.Message)" }
    finally { $connection.Close() }

    Write-Host "Creating a snapshot for the Server: $server; Database: $database"

    # Create a snapshot
    if ($isWindowsAuthentication -eq 'True') {
        Start-Process -FilePath $diffToolLocation "/snapshot /connection:`"Data Source=$server;Initial Catalog=master;Integrated Security=True;User ID=$userName`" /database:$database /file:`"$currentSnapshotFile`" /compress:No" -PassThru -Wait -WindowStyle Hidden
    }
    else {
        Start-Process -FilePath $diffToolLocation "/snapshot /connection:`"Data Source=$server;Initial Catalog=master;Integrated Security=False;User ID=$userName`" /database:$database /password:$password /file:`"$currentSnapshotFile`" /compress:No" -PassThru -Wait -WindowStyle Hidden
    }
}

You can run the script manually, schedule its execution, or use it in your CI. On successful script execution, a new snapshot will be created in the D:\Monitor\Baseline folder.
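For illustration, the Databases.txt line format the script consumes (server,database,isWindowsAuth,login,password) can be made explicit outside PowerShell. This Python sketch is not part of the article's tooling; the function name and return shape are this sketch's own.

```python
def parse_config_line(line):
    """Split one Databases.txt line into its five fields and build the
    matching connection string (assumed field order: server, database,
    isWindowsAuth, login, password)."""
    parts = (line.split(",") + [""] * 5)[:5]  # pad so trailing commas are optional
    server, database, win_auth, user, password = parts
    if win_auth.strip().lower() == "true":
        conn = f"Server={server};Database={database};Integrated Security=True;"
    else:
        conn = f"Server={server};Database={database};User ID={user};Password={password};"
    return server, database, conn

# e.g. the Windows Authentication line from the article:
server, database, conn = parse_config_line(r"demo-mssql\SQLEXPRESS,AdventureWorks2022_P1,true,,")
print(conn)
```

Note that the empty login and password fields after `true` still have to be present, since the scripts index the split result positionally.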
Create a custom PowerShell script to run the comparison between the snapshot and a target database

#region Variables

$rootFolder = "D:\Monitor"
$databasesTxtPath = "D:\Monitor\Databases.txt"
# Declare the $diffToolLocation variable for dbForge Schema Compare for SQL Server
#$diffToolLocation = "C:\Program Files\Devart\dbForge SQL Tools Professional\dbForge Schema Compare for SQL Server\schemacompare.com"
# Declare the $diffToolLocation variable for dbForge Studio for SQL Server
$diffToolLocation = "C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"

#endregion

foreach ($line in [System.IO.File]::ReadAllLines($databasesTxtPath)) {
    # Read the connection parameters for the current database from the configuration file
    $server = ($line -split ",")[0]
    $database = ($line -split ",")[1]
    $isWindowsAuthentication = ($line -split ",")[2]
    $userName = ($line -split ",")[3]
    $password = ($line -split ",")[4]
    $today = (Get-Date -Format "dd-MM-yyyy_HH_mm_ss")
    $BaselineLocation = New-Item -ItemType Directory -Force -Path ($rootFolder + "\" + "BaseLine")
    $DiffsnapshotsLocation = New-Item -ItemType Directory -Force -Path ($rootFolder + "\" + "DiffSnapshots")
    $ReportsLocation = New-Item -ItemType Directory -Force -Path ($rootFolder + "\" + "DiffReports")
    $logsLocation = New-Item -ItemType Directory -Force -Path ($rootFolder + "\" + "Logs")
    $srvCleanName = ($server -replace "\\", "")
    $currentSnapshotFile = Join-Path $BaselineLocation "$srvCleanName.$database.snap"
    $currentReportFile = Join-Path $ReportsLocation "$srvCleanName.$database.$today"
    $logName = Join-Path $logsLocation "$srvCleanName.$database.$today.txt"
    $diffSnapshotFile = Join-Path $DiffsnapshotsLocation "$srvCleanName.$database.$today.snap"

    Write-Host "Server: $server; Database: $database; isWindowsAuthentication: $isWindowsAuthentication"

    # Build the database connection strings
    if ($isWindowsAuthentication -eq 'True') {
        $connectionString = "Server=$server;Database=$database;Integrated Security=True;"
        $TargetConnectionString = "Data Source=$server;Initial Catalog=$database;Integrated Security=True;"
    }
    else {
        $connectionString = "Server=$server;Database=$database;User ID=$userName;Password=$password;"
        $TargetConnectionString = "Data Source=$server;Initial Catalog=$database;Integrated Security=False;User ID=$userName;Password=$password;"
    }

    # Test the database connection
    Write-Host "Testing the database connection..."
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    try {
        $connection.Open()
        Write-Host "Connection successful"
    }
    catch { Write-Host "Connection failed: $($_.Exception.Message)" }
    finally { $connection.Close() }

    # Log information about checking the database
    New-Item -ItemType File -Force -Path $logName

    # Compare
    $process = Start-Process -FilePath $diffToolLocation -ArgumentList "/schemacompare /source snapshot:`"$currentSnapshotFile`" /target connection:`"$TargetConnectionString`" /report:`"$currentReportFile`" /reportformat:html /includeobjects:All /log:`"$logName`"" -PassThru -Wait -WindowStyle Hidden

    # Exit code 100 means the databases are identical
    if ($process.ExitCode -eq 100) {
        Add-Content -Path $logName -Value "The databases are identical"
        # Remove the newly created report, since no differences were detected
        Remove-Item -Path $currentReportFile".html" -Force:$true -Confirm:$false
        continue
    }
    else {
        Add-Content -Path $logName -Value "The databases are different"
        # Generate a new snapshot, since differences were detected
        if ($isWindowsAuthentication -eq 'True') {
            Start-Process -FilePath $diffToolLocation "/snapshot /connection:`"Data Source=$server;Initial Catalog=master;Integrated Security=True;User ID=$userName`" /database:$database /file:`"$diffSnapshotFile`" /compress:No" -PassThru -Wait -WindowStyle Hidden
        }
        else {
            Start-Process -FilePath $diffToolLocation "/snapshot /connection:`"Data Source=$server;Initial Catalog=master;Integrated Security=False;User ID=$userName`" /database:$database /password:$password /file:`"$diffSnapshotFile`" /compress:No" -PassThru -Wait -WindowStyle Hidden
        }
    }
}

You can run the script manually, schedule its execution, or use it in your CI. On successful script execution, a new snapshot is created in the D:\Monitor\DiffSnapshots folder, and a diff report is created in the D:\Monitor\DiffReports folder. Workflow Every morning, the database administrator checks the D:\Monitor\Logs and D:\Monitor\DiffSnapshots folders to see whether there have been any changes. If there were changes, a newly created snapshot file appears in the D:\Monitor\DiffSnapshots folder. The administrator can then check the D:\Monitor\DiffReports folder to view and analyze the differences, or compare the baseline snapshot from the D:\Monitor\Baseline folder with the snapshot that was automatically generated and placed into the D:\Monitor\DiffSnapshots folder. In this simple yet elegant way, you can use dbForge Schema Compare for SQL Server to automate the process of detecting and tracking database changes. Scaling The worked example above can be scaled to multiple databases. Just add the databases you want to track schema changes for to the configuration file:

demo-mssql\SQLEXPRESS,AdventureWorks2022_P1,true,,
demo-mssql\SQLEXPRESS,AdventureWorks2022_P2,true,,
demo-mssql\SQLEXPRESS,AdventureWorks2022_P3,true,,

Conclusion The article provides a simple way of automating SQL Server database schema comparison tasks that can be scaled to multiple databases.
It is worth mentioning that the schema compare functionality is also available in [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), our all-in-one IDE that covers nearly every aspect of SQL Server database development, management, and administration. To automatically monitor a SQL Server database for schema changes, you can use either dbForge Schema Compare for SQL Server or dbForge Studio for SQL Server; just choose the tool that suits you best. Devart products come with a free 30-day trial. Download [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) or [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/download.html) and see for yourself that automating schema change tracking can be easy and painless. Note To give our customers more choice, we deliver dbForge Schema Compare for SQL Server as part of two different toolkits: [dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/) (a pack of two essential compare tools) and [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) (a pack of 15 advanced tools for DB development, management, and administration); just choose the tools pack that meets your project requirements.
Tags [Automate Database Changes Tracking](https://blog.devart.com/tag/automate-database-changes-tracking) [database changes](https://blog.devart.com/tag/database-changes) [monitor database changes](https://blog.devart.com/tag/monitor-database-changes) [PowerShell Script](https://blog.devart.com/tag/powershell-script) [Schema Compare](https://blog.devart.com/tag/schema-compare) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/how-to-automatically-commit-sql-server-database-schema-changes-to-the-git-repository.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Automatically Commit SQL Server Database Schema Changes to the GIT Repository By [dbForge Team](https://blog.devart.com/author/dbforge) December 22, 2022 [0](https://blog.devart.com/how-to-automatically-commit-sql-server-database-schema-changes-to-the-git-repository.html#respond) 2658 Want to make a git commit run automatically every time a database schema is updated? Read on to find out how you can do this with dbForge Schema Compare for SQL Server. In a perfect world, any schema changes to a database come through a strict review and management process, with only a select few database administrators having access to deploy those changes. Unfortunately, we do not live in a perfect world; thus, tracking the history of schema changes is essential for database health. As a DBA, you might want to know the whole history of schema updates, even if a user later chose to "undo" their changes to a database. In this article, we will demonstrate how to configure git autocommit on every change in the structure of your databases. How to set up database schema changes git autocommit: Step-by-step guide Let us first have a look at the overall logic of the process, and then dwell on each step in more detail. Scenario [Create a scripts folder](https://docs.devart.com/schema-compare-for-sql-server/working-with-other-data-sources/working-with-scripts-folders.html) from the database whose schema changes you want to git-autocommit. Link that scripts folder to a Git repository.
Run the PowerShell script that initiates the [schema comparison between a database and a scripts folder](https://docs.devart.com/schema-compare-for-sql-server/working-with-other-data-sources/working-with-scripts-folders.html). Here, the functionality of dbForge Schema Compare for SQL Server is used. If no changes are detected, no action is taken. However, if there are any schema changes, the scripts folder gets updated. After the scripts folder has been updated, the changes are automatically committed and pushed to the Git repository. Let us now look at a real-life example showing how to put this into practice. Worked example Prerequisites A database whose schema changes you want to autocommit [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) downloaded and installed A scripts folder linked to the remote Git repository PowerShell script The following script initiates schema comparison and, if schema changes are detected, updates the scripts folder and automatically commits and pushes the changes to the remote Git repository. All you need to do is run this script from the command line. You can either create a .ps1 file with the given script and initiate its execution from PowerShell or Command Prompt, or paste the script directly into the terminal.
$ErrorActionPreference = "SilentlyContinue"

$connectionString = "Data Source=server_name;Initial Catalog=database_name;Integrated Security=False;User ID=user_name;Password=password"
$scriptsfolder = "D:\JordanS\dbForgeSchemaCompareRepository\scriptsfolder_name\"
$diffToolLocation = "C:\Program Files\Devart\Compare Bundle for SQL Server\dbForge Schema Compare for SQL Server\schemacompare.com"

# Set the working directory where the scripts folder repo is located
Set-Location -Path $scriptsfolder

# Switch to the "main" branch and update the files in the working directory
git checkout main

# Launch schema comparison between the database and the scripts folder
$process = Start-Process -FilePath $diffToolLocation -ArgumentList "/schemacompare /source connection:`"$connectionString`" /target scriptsfolder:`"$scriptsfolder`"" -PassThru -Wait -WindowStyle Hidden

# Exit code 101 means that schema differences have been detected
if ($process.ExitCode -eq 101) {

    Write-Host "There are differences between the database and the scripts folder."

    # Synchronize the database and the scripts folder
    $process = Start-Process -FilePath $diffToolLocation -ArgumentList "/schemacompare /source connection:`"$connectionString`" /target scriptsfolder:`"$scriptsfolder`" /sync" -PassThru -Wait -WindowStyle Hidden

    # Stage the changed files in the repository
    git add --all

    # Commit the new files/changes to the local repository
    git commit -m "Script folder is changed"

    # Push to the remote branch
    git push -u origin main
}
elseif ($process.ExitCode -eq 100) {
    Write-Host "Database and scripts folder are identical."
}

Where: $connectionString specifies information about the data source and how to connect to it; $scriptsfolder provides the path to the scripts folder; $diffToolLocation specifies the path to the tool used for schema comparison.
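The script's branching is easy to state on its own. This is a hedged Python sketch of that decision logic only, not Devart's API: the exit codes (101 = differences detected, 100 = identical) come from the article, while the function name and the step labels are this sketch's own.

```python
def git_steps(exit_code):
    """Map a schema-compare exit code to the follow-up steps the script runs."""
    if exit_code == 101:
        # differences detected: sync the scripts folder, then commit and push
        return [
            "schemacompare /sync",  # update the scripts folder first
            "git add --all",
            'git commit -m "Script folder is changed"',
            "git push -u origin main",
        ]
    if exit_code == 100:
        return []  # identical: nothing to commit
    return None  # other codes (e.g. 0 = operation succeeded) are handled elsewhere
```

Keeping the branch logic this explicit makes it clear why an unchanged database produces no commit at all, rather than an empty one.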
Note There are two Devart tools you can use for comparing schemas between a database and a scripts folder: dbForge Schema Compare for SQL Server, which we have already mentioned, and an all-in-one IDE containing a bunch of useful tools for working with SQL Server, [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). Suppose we have the AdventureWorks2019 database scripts folder linked to our JordanSandersRepo GitLab repository. Let us save the script as UpdateDiff.ps1 and run it from the command line. The variable assignments for it will look as follows:

$ErrorActionPreference = "SilentlyContinue"

$connectionString = "Data Source=JordanS\SQLSERVER2019;Initial Catalog=AdventureWorks2019;Integrated Security=False;User ID=JordanS;Password=JordY"
$scriptsfolder = "D:\JordanS\AdventureWorks2019\"
$diffToolLocation = "C:\Program Files\Devart\Compare Bundle for SQL Server\dbForge Schema Compare for SQL Server\schemacompare.com"

Note You need to provide the full path to the scripts folder. As you can see, our database and the scripts folder are different, and the [exit code](https://docs.devart.com/schema-compare-for-sql-server/using-the-command-line/exit-codes-used-in-the-command-line.html) of the operation is 0, which means it was successful. Let us now look at our Git repository. As you can clearly see, a new commit containing the schema changes has been made to the repository. Let us run the script again. Our database and the scripts folder should now be identical. The operation returns [exit code](https://docs.devart.com/schema-compare-for-sql-server/using-the-command-line/exit-codes-used-in-the-command-line.html) 100, which means that there are no differences between the database and the scripts folder.
How to automate and schedule database comparison and synchronization with the version-controlled scripts folder You can schedule and automate the comparison and synchronization task described above with the help of dbForge Schema Compare for SQL Server (or dbForge Studio for SQL Server) and Windows Task Scheduler. Scheduling database comparison and synchronization includes the following steps: Create a .ps1 file containing the PowerShell script to initiate schema comparison and automatically commit and push the changes to the remote Git repository. Create a .bat file to initiate that script. Create a synchronization task using Windows Task Scheduler to call that .bat file. As we have already created a .ps1 file, let us move on to step 2. To create a .bat file: Open any third-party text editor, for example, Notepad or Notepad++. Enter the following command: PowerShell.exe -file UpdateDiff.ps1 Save the file with the .bat extension. Place the newly created .bat file in the same folder as your .ps1 file. Now, all you need to do is schedule the .bat file execution with any task scheduler tool, for example, Windows Task Scheduler. To schedule the .bat file execution: 1. Open the Control Panel > Administrative Tools and select Task Scheduler . 2. In the Task Scheduler window that opens, navigate to the Actions pane and click Create Basic Task to create a scheduled task. 3. In the Create Basic Task Wizard window that opens, specify the name and description of the task and click Next . 4. On the Trigger tab, choose when to launch the task and then click Next . 5. On the Action tab, click Start a program to schedule a program to start automatically and then click Next . 6. On the Start a Program subtab, click Browse to select the .bat file you created earlier and then click Next . 7. On the Finish tab, verify the settings and click Finish . The task will be displayed in the Active Tasks section. Got tired of having to write commands via CLI?
Choose the [best Git client for Windows](https://blog.devart.com/top-10-mysql-gui-tools-for-database-management-on-windows.html) option to strengthen your toolset! Conclusion Devart products give you the power to automate your database routines. In this article, we provided a detailed walkthrough on how to automate and schedule your schema comparison tasks. All you need is the dbForge Schema Compare tool, a PowerShell script, and a .bat file. A couple of minutes of preparation, and you will free up a lot of time for more important tasks. How to get this functionality To give our customers more choice, the dbForge Schema Compare for SQL Server functionality can be acquired as part of three different toolkits: [dbForge Compare Bundle for SQL Server](https://www.devart.com/dbforge/sql/compare-bundle/) (a pack of two essential compare tools) [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) (a pack of 15 advanced tools for DB development, management, and administration) [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) (an all-in-one IDE designed to cover all possible database-related tasks). Just choose the tools pack that best meets your project requirements, download it from our website, and start a free 30-day trial to evaluate the functionality.
Tags [dbForge Schema Compare](https://blog.devart.com/tag/dbforge-schema-compare-2) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/how-to-automatically-generate-sql-server-documentation.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to quickly generate documentation for your SQL Server database By [dbForge Team](https://blog.devart.com/author/dbforge) November 2, 2016 [3](https://blog.devart.com/how-to-automatically-generate-sql-server-documentation.html#comments) 11071 In this blog post, we will learn how to quickly create comprehensive documentation for a SQL Server database using the [SQL documentation tool](https://www.devart.com/dbforge/sql/documenter/) from Devart. There are a lot of benefits to generating database documentation automatically with dbForge Documenter for SQL Server. You don't have to spend hours retrieving and writing technical information from scratch; the database documentation tool builds accurate and error-free documentation in just a few minutes. Besides, Documenter provides a rich set of features for customizing the documentation output to meet your specific requirements. Documenter presents documentation in an easy-to-view format, so you can share it with your boss or clients, other developers, DBAs, testers, project managers, business executives, or other stakeholders. The following example demonstrates how to generate documentation for the AdventureWorks2012 sample database. 1. Run [dbForge Documenter for SQL Server](https://www.devart.com/dbforge/sql/documenter/download.html) . Click New Documenter… on the Start Page . 2. In the Select Connections dialog box, select one or several existing connections from the list or create a new one. Click Select . 3. The Database Documenter project opens. 4.
The Preview pane initially shows the Cover Page of the documentation, on which you can enable breadcrumb navigation links, add a logo, and specify a header, title, descriptive text, author, and date of creation. In the upper-right Style drop-down list, you can select a style defining the appearance of the documentation contents. Documenter contains a number of style templates and also allows you to use various Bootstrap themes to change the documentation layout and appearance. 5. In the search field of the Structure pane, start typing the name of the database you want to document. For example, type " Adv… ". As you type, Documenter filters the databases, displaying only the matching ones and highlighting the relevant letters of the search text. 6. In the Structure pane, click the arrow next to AdventureWorks2012 to expand the list of database objects. Documenter starts retrieving data from the database and analyzing its structure. Documenter extracts extensive database info, including a wide range of SQL object types, their details and properties, as well as inter-object dependencies and DDL code. 7. After the data have been extracted, the Structure pane displays a tree-view structure of the components available in the database. You can further expand the components by clicking the arrows. 8. Click the checkboxes next to the components you want to include in the documentation. These can be components at different levels, such as the entire database, a specific table, or a column of a table. 9. The right-hand pane shows a preview of the information that will be included in the documentation. The information is represented under the following sections: Description , Object Types , Properties , Options , Database Files , and Sections To Include . These are the database-level sections.
Along with that, Documenter allows you to configure elements of documentation at several levels, including: – servers level; – server level; – databases level; – database level; – objects group level; – database object level. For example, in the Structure pane, click the arrow next to Tables to expand the list of tables in the AdventureWorks2012 database and then select the Person.Address table from the list. Documenter shows a preview for the table with the individual sections ( Description , Properties , Columns , Indexes , Foreign Keys , SQL Script , Depends On , Used By ) that will be included in the documentation. You can exclude any of the listed sections, as well as specific properties (in the Properties section), for the Person.Address table so they will not appear in the generated documentation. In the same manner, you can customize the documentation by excluding sections and properties of objects at the different levels mentioned above, for example, server, database, views, stored procedures, etc. 10. The descriptive text that is automatically inserted in the Description field for each database object is pulled from the MS_Description extended properties. You can edit the descriptions directly in Documenter. 11. When you're all set, click Generate to proceed to the generation of documentation. 12. The Generate Documentation dialog box opens, and you are prompted to choose a file format for the documentation. The HTML format is suitable, for example, for databases to be published on the web, and PDF is good for distribution to various systems and devices. Both formats are searchable, which is very convenient, especially for large databases. 13. Specify the folder to save the generated documentation to. Optionally, you can select Append timestamp to the file name and Open documentation after generation . 14. Click Generate to start generating the AdventureWorks2012 documentation. Documenter shows the progress of the generation. 15.
After successful generation, Documenter opens the generated documentation if the Open documentation after generation option has been selected. If not, open the destination folder you specified to explore the documentation. More Features One of the great features of Documenter is that it integrates seamlessly with SQL Server Management Studio (SSMS). Databases can be documented directly from the Object Explorer of your SSMS solution. What is more, Documenter is included in [dbForge Developer Bundle for SQL Server](https://www.devart.com/dbforge/sql/developer-bundle/), an ultimate toolkit from Devart that lets you version-control databases, compare schemas and data, optimize database performance, write SQL queries on the fly, generate meaningful test data, and much more, straight in your favorite IDE. Conclusion As you can see, dbForge Documenter for SQL Server fully automates the process of generating documentation and reduces the amount of manual effort required, while yielding a comprehensive, polished technical description of a SQL Server database.
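As a side note on step 10 above: the MS_Description extended properties that Documenter reads can also be set in T-SQL before you generate the documentation. A minimal sketch using the standard sp_addextendedproperty procedure (the description text is an illustrative example, not taken from the article):

```sql
-- Attach a description to the Person.Address table; Documenter picks it up
-- from the MS_Description extended property (the text here is an example)
EXEC sys.sp_addextendedproperty
    @name  = N'MS_Description',
    @value = N'Street addresses for customers, vendors, and employees',
    @level0type = N'SCHEMA', @level0name = N'Person',
    @level1type = N'TABLE',  @level1name = N'Address';
```

For a column-level description, add `@level2type = N'COLUMN'` and `@level2name` with the column name; `sp_updateextendedproperty` changes an existing value.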
3 COMMENTS Andrey — November 3, 2016 at 8:49 pm: The example of document required in this article.
Marshall Stickley — November 15, 2016 at 5:21 pm: Is this an accurate statement: “Documenter builds an accurate and error-prone documentation in just a few minutes.” If so, not sure I would want the software if it is generating “error-prone” documentation. dbForge Team — November 18, 2016 at 10:56 am: Sorry for this misprint. Of course, Documenter generates error-free documentation. Comments are closed."} {"url": "https://blog.devart.com/how-to-automatically-synchronize-data-in-two-sql-server-databases-on-a-schedule.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to automatically synchronize data in two SQL Server databases on a schedule By [dbForge Team](https://blog.devart.com/author/dbforge) June 3, 2019 Data in SQL databases often needs to be synchronized in real time – this can be achieved by checking one database for updates and then applying them to another database. In this case, the process of change detection and synchronization should run automatically on a schedule, with no need for outside intervention. How we can achieve this goal [Data Compare](https://www.devart.com/dbforge/sql/datacompare/) is an external tool that allows you to compare data in SQL databases, backups, and script folders. With dbForge Data Compare for SQL Server, you can [schedule almost real-time database synchronization](https://docs.devart.com/data-compare-for-sql-server/using-the-command-line/scheduling-database-synchronization.html).
You can set up the process by following these steps: [Run Data Compare](https://www.devart.com/dbforge/sql/datacompare/download.html). In the New Data Comparison window, choose the source and target databases on the Source and Target tab. On the Options tab, adjust the comparison settings, if needed. On the Mapping tab, select which objects should be compared; you can also specify the key columns and the list of columns for comparison, if needed. To start the comparison process, click Compare in the bottom right corner. When the comparison is done, you can examine the results in detail. Select all the necessary objects by using the corresponding checkboxes and click Save. The saved project (.dcomp) file will contain all the objects and options needed for scheduling data synchronization. When the project file is saved, click Synchronize, which opens the Data Synchronization wizard. Choose Execute the script directly against the target database so that the databases are synchronized once you have set up all the necessary options. Now, click Synchronize in the bottom right corner. When the synchronization process is over, you can view the synchronization results in the bottom pane of the window. Automating the process As we have already successfully tested the synchronization process in Data Compare and saved the project (.dcomp) file, let's automate the process with a PowerShell script. Setting things up First, we need a function that checks whether the Outputs folder exists – it will be used to store date-stamped output summaries. We want to make sure that an easy-to-find application log of every synchronization is saved in case we need to troubleshoot later:

```powershell
# Checks if the Outputs folder exists. If it doesn't, the script creates it and returns its full path
function CheckAndCreateFolder($rootFolder, [switch]$Outputs)
{
    $location = $rootFolder

    # Set the location based on the used switch
    if ($Outputs -eq $true)
    {
        $location += "\Outputs"
    }
    # Create the folder if it doesn't exist and return its path
    if (-not (Test-Path $location))
    {
        mkdir $location -Force:$true -Confirm:$false | Out-Null
    }

    return $location
}
```

Next, define the root folder and the location for date-stamped output summaries:

```powershell
# Set the root folder
$rootFolder = "D:\DataSync\"

# Set the location of output files
$outsLoc = CheckAndCreateFolder $rootFolder -Outputs
```

Variables and switches In this section, we define the application's location along with the date stamp variable. We also define the variable containing the application's parameters: the path to the saved project (.dcomp) file; the /sync switch for direct synchronization of the destination database; the /rece switch, which returns the ‘102 – No differences detected’ message when data sources are equal; and a date-stamped output summary. The following script achieves this:

```powershell
# Define the tool's location, the date stamp variable, and the tool's parameters
$toolLocation = "C:\Program Files\Devart\Compare Bundle for SQL Server Professional\dbForge Data Compare for SQL Server\datacompare.com"
$dateStamp = (Get-Date -Format "MMddyyyy_HHmmss")

# Output log file path
$logPath = "$outsLoc\DataOutput_$dateStamp.txt"

$Params = "/datacompare /compfile:""D:\DataSync\Project\test_DB_1vstest_DB_2.dcomp"" /log:""$logPath"""
$sync = " /sync"
```

Execution The next part of the PowerShell script calls Data Compare from its location with the parameters we defined in the previous step.
Then, the return code variable is defined:

```powershell
# Initiate the comparison of data sources
(Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params))
$returnCode = $LASTEXITCODE

$message = ""
```

The script's final part handles the three possible outcomes: an exit code other than 100 or 101 means an error occurred, in which case the output summary is opened; exit code 101 means differences were detected, so the synchronization is run (a subsequent 0 means success); exit code 100 means no differences were detected and the job is aborted.

```powershell
if ($returnCode -notin (100, 101))
{
    # An error is encountered
    $logPath = "$outsLoc\DataOutput_error.txt"

    $message >> $logPath
    Clear-Content $logPath
    $message = "`r`n $returnCode - An error is encountered"

    # The output file is opened when an error is encountered
    Invoke-Item "$logPath"
}
else
{
    if ($returnCode -eq 101)
    {
        # Differences are detected; run the tool again with the /sync switch
        Clear-Content $logPath
        (Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params + $sync))
        $returnCode = $LASTEXITCODE
    }
    if ($returnCode -eq 0)
    {
        $message = "`r`n $returnCode - Data changes were successfully synchronized"
    }
    else
    {
        # There are no data changes
        if ($returnCode -eq 100)
        {
            $message = "`r`n $returnCode - There are no data changes. Job aborted"
        }
    }
}
$message >> $logPath
```

Now that the job has been automated, it can be scheduled in any way you prefer – for example, with the help of Windows Scheduler. Reviewing results Once everything is up and running, an output summary can be reviewed at any time. In this example, the location of output files is defined by the $outsLoc variable, so the output files will be saved to $rootFolder\$outsLoc – in this particular example, DataSync\Outputs. If an error occurs while the script is being executed, an error message will be displayed to provide more information about the potential cause, and a DataOutput_error.txt file with details of the error will be created.
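For the Windows Scheduler route mentioned above, one way to register the job is with the built-in schtasks command from an elevated prompt; the task name, start time, and script path below are illustrative examples, not values from the article:

```bat
schtasks /Create /TN "DataSyncJob" /SC DAILY /ST 02:00 /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File D:\DataSync\DataSync.ps1"
```

Here /SC DAILY with /ST 02:00 runs the saved .ps1 file every night at 2:00; -ExecutionPolicy Bypass avoids failures on machines where script execution is restricted.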
Here’s the script in its entirety:

```powershell
# Checks if the Outputs folder exists. If it doesn't, the script creates it and returns its full path
function CheckAndCreateFolder($rootFolder, [switch]$Outputs)
{
    $location = $rootFolder

    # Set the location based on the used switch
    if ($Outputs -eq $true)
    {
        $location += "\Outputs"
    }
    # Create the folder if it doesn't exist and return its path
    if (-not (Test-Path $location))
    {
        mkdir $location -Force:$true -Confirm:$false | Out-Null
    }

    return $location
}

# Set the root folder
$rootFolder = "D:\DataSync\"

# Set the location of output files
$outsLoc = CheckAndCreateFolder $rootFolder -Outputs

# Define the tool's location, the date stamp variable, and the tool's parameters
$toolLocation = "C:\Program Files\Devart\Compare Bundle for SQL Server Professional\dbForge Data Compare for SQL Server\datacompare.com"
$dateStamp = (Get-Date -Format "MMddyyyy_HHmmss")

# Output log file path
$logPath = "$outsLoc\DataOutput_$dateStamp.txt"

$Params = "/datacompare /compfile:""D:\DataSync\Project\Database1vsDatabase2.dcomp"" /log:""$logPath"""
$sync = " /sync"

# Initiate the comparison of data sources
(Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params))
$returnCode = $LASTEXITCODE

$message = ""

if ($returnCode -notin (100, 101))
{
    # An error is encountered
    $logPath = "$outsLoc\DataOutput_error.txt"

    $message >> $logPath
    Clear-Content $logPath
    $message = "`r`n $returnCode - An error is encountered"

    # The output file is opened when an error is encountered
    Invoke-Item "$logPath"
}
else
{
    if ($returnCode -eq 101)
    {
        # Differences are detected; run the tool again with the /sync switch
        Clear-Content $logPath
        (Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params + $sync))
        $returnCode = $LASTEXITCODE
    }
    if ($returnCode -eq 0)
    {
        $message = "`r`n $returnCode - Data changes were successfully synchronized"
    }
    else
    {
        # There are no data changes
        if ($returnCode -eq 100)
        {
            $message = "`r`n $returnCode - There are no data changes. Job aborted"
        }
    }
}
$message >> $logPath
```

If any questions or issues arise while you are setting this up, feel free to [contact us](https://www.devart.com/support/) anytime. "} {"url": "https://blog.devart.com/how-to-automatically-synchronize-schema-changes-in-two-sql-server-databases-on-a-schedule.html", "product_name": "Unknown", "content_type": "Blog", "content": "[SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Automatically Synchronize Schema Changes in Two SQL Server Databases on a Schedule By [dbForge Team](https://blog.devart.com/author/dbforge) May 3, 2022 This article introduces a solution for automatically synchronizing two SQL Server databases on a schedule. Prerequisites To automate and schedule schema changes, we will use the [dbForge Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/) tool that comes as part of the [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) bundle, a project file created with the help of dbForge Schema Compare, and a customized PowerShell script. The idea behind the whole process is to have a PowerShell script launch the .scomp project file created via dbForge Schema Compare, which contains all the necessary schema comparison and synchronization settings. The PowerShell script responsible for initiating the Schema Compare tool and running the synchronization can then be scheduled via any third-party task scheduler.
Thus, to set up automatic schema synchronization between two databases, you need to go through the following steps: save the project file, create the PowerShell script for scheduling the synchronization, schedule the PowerShell script execution, and check the synchronization results. Save the project file To begin with, you need to create and save the Schema Compare project (.scomp) file. This file will be invoked by the PowerShell script that we will create in the next step. 1. Run [dbForge Schema Compare](https://www.devart.com/dbforge/sql/schemacompare/download.html). 2. Click New Schema Comparison. 3. In the New Schema Comparison wizard, select the source and target connections and the databases you want to synchronize schema changes between. 4. On the Options, Schema Mapping, and Table Mapping pages, you can customize the comparison settings. For more information, refer to the [Setting comparison options](https://docs.devart.com/schema-compare-for-sql-server/comparing-schemas/setting-comparison-options.html), [Mapping Schemas](https://docs.devart.com/schema-compare-for-sql-server/comparing-schemas/mapping-schemas.html), and [Mapping Tables and Columns](https://docs.devart.com/schema-compare-for-sql-server/comparing-schemas/mapping-tables-and-columns.html) topics of the Schema Compare documentation. 5. To start the comparison process, in the New Schema Comparison wizard, click Compare. The window displaying the schema comparison results will open. 6. Add objects to the synchronization by selecting the corresponding checkboxes and then click the Save icon. Note The saved project (.scomp) file will contain information on the objects selected for synchronization and the default schema synchronization options. Add custom schema synchronization options to the project file In case you need the Schema Compare project file to include custom synchronization options, click Synchronize to open the Schema Synchronization wizard.
On the Options tab of the wizard, make the necessary settings, close the wizard, and only then save the project file. Create the PowerShell script for scheduling the synchronization process Next, you need to create the PowerShell script that will launch the synchronization process.

```powershell
# Checks if the Outputs folder exists. If it doesn't, the script creates it and returns its full path
function CheckAndCreateFolder($rootFolder, [switch]$Outputs)
{
    $location = $rootFolder

    # Set the location based on the used switch
    if ($Outputs -eq $true)
    {
        $location += "\Outputs"
    }
    # Create the folder if it doesn't exist and return its path
    if (-not (Test-Path $location))
    {
        mkdir $location -Force:$true -Confirm:$false | Out-Null
    }

    return $location
}

# Set the root folder
$rootFolder = "C:\SchemaSync\"

# Set the location of output files
$outsLoc = CheckAndCreateFolder $rootFolder -Outputs

# Define the tool's location, the date stamp variable, and the tool's parameters
$diffLoc = "C:\Program Files\Devart\Compare Bundle for SQL Server Professional\dbForge Schema Compare for SQL Server\schemacompare.com"
$dateStamp = (Get-Date -Format "MMddyyyy_HHmmss")

# Output log file path
$logPath = "$outsLoc\DataOutput_$dateStamp.txt"

$Params = "/schemacompare /compfile:""C:\SchemaSync\Project\adventureworks2019.scomp"" /log:""$logPath"""
$sync = " /sync"

# Initiate the schema comparison and synchronization process
(Invoke-Expression ("& `"" + $diffLoc + "`" " + $Params))
$returnCode = $LASTEXITCODE

$message = ""

if ($returnCode -notin (100, 101))
{
    # An error is encountered
    $logPath = "$outsLoc\DataOutput_error.txt"

    $message >> $logPath
    Clear-Content $logPath
    $message = "`r`n $returnCode - An error is encountered"

    # The output file is opened when an error is encountered
    Invoke-Item "$logPath"
}
else
{
    if ($returnCode -eq 101)
    {
        # Schema changes are detected; run the tool again with the /sync switch
        Clear-Content $logPath
        (Invoke-Expression ("& `"" + $diffLoc + "`" " + $Params + $sync))
        $returnCode = $LASTEXITCODE
    }
    if ($returnCode -eq 0)
    {
        $message = "`r`n $returnCode - Schema changes were successfully synchronized"
    }
    else
    {
        # There are no schema changes
        if ($returnCode -eq 100)
        {
            $message = "`r`n $returnCode - There are no schema changes. Job aborted"
        }
    }
}
$message >> $logPath
```

In this script, you need to: 1. Provide the path to the directory where dbForge Schema Compare is stored. 2. Specify the date stamp format. 3. Provide the path to the log file. 4. Provide the path to the saved .scomp file. You can run the script manually, schedule its execution, or use it in your CI. Schedule the PowerShell script execution Now that the script is ready, you can schedule its execution, for example, via Windows Task Scheduler. 1. Save the script to a .ps1 file and open Task Scheduler on your PC. 2. In Task Scheduler, navigate to Action > Create Basic Task. 3. Fill in the Name and Description fields. Then click Next. 4. Select the desired option and click Next. 5. Select the time when you want the script to be launched. Set the number of days for the script execution and click Next. 6. Select an action for the task and click Next. 7. Click Browse and select the path to the PowerShell script. Then click Next. 8. Finally, check the settings and click Finish. The task will be displayed in the Active Tasks section. Check the synchronization results After the PowerShell script has run, you can check the schema output summaries in the location you specified in the $rootFolder and $outsLoc variables. In our case, the files are saved to \temp\SchemaSync\Outputs. If there is an error, you will see the corresponding information about it in a SchemaOutput.txt file.
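The Create Basic Task steps above can also be scripted instead of clicking through the GUI. A minimal PowerShell sketch using the built-in ScheduledTasks cmdlets; the task name, start time, and script path are illustrative assumptions, not values from the article:

```powershell
# Register a daily 2:00 AM task that runs the saved synchronization script
# (the task name and file path below are examples)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\SchemaSync\SchemaSync.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2:00AM
Register-ScheduledTask -TaskName "SchemaSyncJob" -Action $action -Trigger $trigger
```

Unregister-ScheduledTask removes the job again; Get-ScheduledTask lets you verify that it appears alongside the tasks shown in the Active Tasks section.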
To learn what each exit code means, refer to the [Exit Codes Used in Command Line for /schemacompare](https://docs.devart.com/schema-compare-for-mysql/working-with-command-line/exit-codes-used-in-command-line.html) article. Interested in ways to detect and report data changes and ensure automated data updates on a schedule between two SQL databases? Explore [Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/), a powerful SQL data comparison tool! Conclusion We have described how to automate the synchronization of schema changes on a schedule using dbForge Schema Compare, a PowerShell script, and Windows Task Scheduler. As you can see, you can set up automatic schema synchronization quickly and easily. You can check it yourself: [download dbForge Compare Bundle for SQL Server (including Schema Compare) for a free 30-day trial](https://www.devart.com/dbforge/sql/schemacompare/download.html) and start scheduling and automating your project's synchronization processes. "} {"url": "https://blog.devart.com/how-to-build-a-database-from-source-control.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Build a Database from Source Control By [Andrey Langovoy](https://blog.devart.com/author/andrey-langovoy) December 24, 2015 There is no doubt that database developers can and should benefit from using source control systems. Regardless of the type of source control system, developers must also think over the development model and consider how they wish to build and deploy databases from source control. In this article, we will discuss several approaches to building a database from source control: creating a batch file, and using dbForge Source Control for SQL Server.
Deploying a database with a batch file To begin with, let's copy the files from the remote repository to a local working copy on your computer. For demonstration purposes, I will use [Visual SVN](https://www.visualsvn.com/server/). You can use any source control system you prefer. Assume that you have a script folder in the remote repository. The script folder contains DDL scripts for schema objects. The following image demonstrates the case: To start working with the database, you need to check out all files to a local working copy folder. For this, create a folder on your computer (sales_demo in our case) and execute the SVN checkout command. For your convenience, I recommend using [TortoiseSVN](https://tortoisesvn.net/) — a Subversion (SVN) client. To check out files to the local working copy, right-click the folder you've created and then click SVN Checkout. Once done, the sales_demo folder contains all the SQL files from the remote repository. Now, we can start deploying the database. One way to do this is to execute the SQL files one by one. This is the worst solution I can imagine, and we are not going to do this. Instead, we will create a batch file to automate the process. Creating a batch file to deploy the database In a text editor, create a batch file with the following content:

```bat
sqlcmd -S "server" -U "login" -P "password" -i "input_file"
PAUSE
```

where input_file is the path to a SQL file that will create the database. In our case, the input file is D:\sales_demo_build\sales_demo_build.sql.
Then, create a single SQL file that will add a new database:

```sql
SET NOCOUNT ON
GO
PRINT 'Creating sales_demo database'
USE [master]
GO
DECLARE @db_name NVARCHAR(255);
SET @db_name = N'sales_demo';
IF EXISTS (SELECT 1 FROM sys.databases d WHERE d.name = @db_name)
BEGIN
    EXEC (N'ALTER DATABASE ' + @db_name + N' SET SINGLE_USER WITH ROLLBACK IMMEDIATE');
    EXEC (N'DROP DATABASE ' + @db_name);
END;
EXEC (N'CREATE DATABASE ' + @db_name);
GO
USE sales_demo
GO
:On Error exit
:r "D:\sales_demo\Tables\dbo.Customers.sql"
:r "D:\sales_demo\Tables\dbo.OrderLines.sql"
:r "D:\sales_demo\Tables\dbo.Orders.sql"
:r "D:\sales_demo\Tables\dbo.Products.sql"
PRINT 'Creation is Completed'
GO
```

Note that the file also contains a list of SQL files to be executed. In the script, :r is a SQLCMD command that parses additional Transact-SQL statements and SQLCMD commands from the specified file into the statement cache. In the command prompt, run the batch file. Now, refresh the SSMS Object Explorer to see the deployed database. This approach requires some time to prepare the batch file and the main SQL file that builds the database. To simplify the process, we recommend using [dbForge Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/), a powerful SSMS add-in that can link databases to the supported source control systems and deliver a smooth and clear workflow in a familiar interface. Deploying a database with the help of dbForge Source Control for SQL Server First, download and install [dbForge Source Control for SQL Server](https://www.devart.com/dbforge/sql/source-control/download.html). Then, open SSMS and in the Object Explorer, create a new database, for example, sales_demo2. Right-click the database and select Source Control > Link Database to Source Control. In the Link Database to Source Control window, under Source control repository, click + to select SVN source control and configure its settings.
In the Source Control Repository Properties dialog box, select a source control system, specify the URL of the repository and the source control settings, and then click OK. Then, select a database development model and click Link. That's it! Note that the database icon has changed its appearance in the Object Explorer. Now, you can work with SQL scripts the way application developers work with code files: you can make changes and commit them to source control, or get remote changes into your local database. Conclusion In this article, we explored ways to deploy a database from source control. There are two methods to complete this task: the first is to create a batch file that builds the database, and the second is to use dbForge Source Control for SQL Server. As discussed, dbForge Source Control allows you to save time and avoid extra steps while deploying the database. 2 COMMENTS Alex Trodder — March 11, 2016 at 12:54 am: I have a little bit of experience with website management. I know that keeping your SQL databases secure is essential to prevent hacks or SQL injections. I've never had to build one from a source control system. I like how your article includes the use of Visual SVN. I'm still pretty inexperienced with Linux code. I'll have to read your article again to help learn more about site security.
Patzylin May 2, 2016 at 8:49 pm I trying out a demo of source control but when i try to connect with TFS show me an error message “Please update local client for TFS 2012”, i don’t understand… I ‘ve installed SQL Server 2012 (no express) team Explorer, VS 2013 and TFS 2012 on my server Comments are closed."} {"url": "https://blog.devart.com/how-to-build-a-job-to-import-multiple-csv-files-into-a-sql-server-database.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Build a Job to Import Multiple CSV files into a SQL Server Database By [dbForge Team](https://blog.devart.com/author/dbforge) January 20, 2022 [0](https://blog.devart.com/how-to-build-a-job-to-import-multiple-csv-files-into-a-sql-server-database.html#respond) 11308 Have you ever faced the necessity to load data from multiple CSV files into multiple database tables? Not so uncommon, right? In this article, we share a step-by-step tutorial on how to perform an automatic bulk insert from the CSV files. Script examples are provided too. In the tutorial, we will be using the [Data Pump for SQL Server](https://www.devart.com/dbforge/sql/data-pump/) tool, which is a SQL Server Management Studio add-in for filling SQL databases with external source data and migrating data between systems. It supports the command-line interface, which gives the user richer control over the tool’s functionality and allows automating and scheduling regular data import and export tasks. Contents What is the CSV format and why is it so popular? How to import multiple CSV files into multiple SQL Server tables in one go Step 1. Create data import templates Step 2. Create a text file with the list of import templates Step 3. Create a .bat file to run the import job Step 4. Populate the database from CSV files via the command line Step 5.
Schedule a bulk insert from the command line Conclusion What is the CSV format and why is it so popular? Comma-separated values (or CSV) format is extremely widespread today. It was designed to store and manage data without difficulty. CSV files are plain-text files that are easy to create, read, and manipulate. In fact, CSV is a delimited text file that uses a comma to separate values. Since CSV files typically store tabular data in plain text, they’re simpler to import into databases. CSV parsing is easy to implement, the format can be processed by almost any application, and it keeps large amounts of data well organized. All these advantages make the CSV format so popular today. How to import multiple CSV files into multiple SQL Server tables in one go As a SQL Server developer, analyst, or DBA, you may face situations when you need to insert data from more than one CSV file into more than one database table. Can you do it all at once and thus save a lot of time and effort? Let’s check. Prerequisites Data Pump for SQL Server installed (In case you don’t have the tool on your machine, you can [download it from our website](https://www.devart.com/dbforge/sql/data-pump/download.html) ). CSV files to be imported. Data import templates created with the help of Data Pump. The text file containing paths to the data import templates. A .bat file with the script to run the import job. Step 1. Create data import templates 1.1 In Object Explorer, right-click the database you want to import data into and select Data Pump -> Import Data . 1.2 In the wizard that opens, select the file format to be imported—CSV in our case—and continue customizing the import process. Once done, click Save Template . Step 2. Create a text file with the list of import templates 2.1 Launch any text editor tool, for example, Notepad. 2.2 Enter the names of the import templates. Here you can list as many templates as you need; just separate them with a comma.
In our worked example, the file contents look as follows:

C:\DataImport\ImportAddress.dit,
C:\DataImport\ImportAddressType.dit

2.3 Save the file. We save the file as Files.txt. Step 3. Create a .bat file to run the import job 3.1 Launch any text editor tool, for example, Notepad. 3.2 Enter the script for launching the data import process like in the example below. Don’t forget to change the variables.

Set Import="C:\Program Files\Devart\dbForge SQL Tools Professional\dbForge Data Pump for SQL Server\datapump.com"
Set DataSource=%DataSource%
Set InitialCatalog=%InitialCatalog%
Set UserID=%UserID%
Set Password=%Password%

FOR /F "eol=; tokens=1 delims=, " %%e in (C:\DataImport\Files.txt) do (
%Import% /dataimport /templatefile:%%e /connection:"Data Source=%DataSource%;Initial Catalog=%InitialCatalog%;User ID=%UserID%;Password=%Password%" /create
)
pause

Note: the Import variable holds the default installation path for dbForge Data Pump for SQL Server. If you have changed it, you will need to specify the correct path to the tool’s .com file instead. 3.3 Save the file. Step 4. Populate the database from CSV files via the command line Now, all you need to do is execute the .bat file via the command line. Step 5. Schedule a bulk insert from the command line After the batch file for database data import has been created, you can proceed to create the import task using the Windows Task Scheduler so that the job is carried out automatically. 5.1 Open the Control Panel > Administrative Tools and select Task Scheduler . 5.2 In the Task Scheduler dialog that opens, navigate to the Actions pane and click Create Basic Task to create a scheduled task. 5.3 In the Create Basic Task Wizard dialog that opens, provide the name and description of the task and click Next . 5.4 On the Trigger tab, choose when to launch the task and then click Next . Schedule based on the calendar: Daily, Weekly, Monthly, or One time.
For this, specify the schedule you want to use. Schedule based on common recurring events: When the computer starts or When I log on. Schedule based on specific events: When a specific event is logged. For this, specify the event log, source, and event ID using the drop-down lists. 5.5 On the Action tab, click Start a program to schedule a program to start automatically and then click Next . 5.6 On the Finish tab, verify the settings and click Finish . The task will appear in the Active Tasks section. Conclusion dbForge Data Pump for SQL Server supports a command-line interface (CLI), allowing the user to perform data import and export tasks from the command line, as well as schedule and automate the population of SQL Server databases with data. This article provides a worked example and a CLI script for running data import from multiple CSV files into multiple database tables. [Download Data Pump](https://www.devart.com/dbforge/sql/data-pump/download.html) and try the given scenario, and you will see how simple the task can be.
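The batch loop above can also be prototyped outside of cmd.exe. Below is a minimal Python sketch that parses a Files.txt-style template list and builds the same datapump.com command lines without executing them; the server name, database, and credential values are placeholder assumptions, not values from this tutorial.

```python
# Build the datapump.com command lines for a list of import templates,
# mirroring the FOR /F loop in the batch file above.
# The connection values passed in below are placeholders -- adjust to your setup.

DATAPUMP = r"C:\Program Files\Devart\dbForge SQL Tools Professional\dbForge Data Pump for SQL Server\datapump.com"

def build_import_commands(template_file_text, data_source, initial_catalog, user_id, password):
    """Parse a comma/newline-separated list of .dit template paths and
    return one datapump command line per template."""
    templates = [t.strip() for t in template_file_text.replace("\n", ",").split(",") if t.strip()]
    connection = (f"Data Source={data_source};Initial Catalog={initial_catalog};"
                  f"User ID={user_id};Password={password}")
    return [f'"{DATAPUMP}" /dataimport /templatefile:{t} /connection:"{connection}" /create'
            for t in templates]

# Example with the two templates from the worked example (placeholder credentials):
files_txt = "C:\\DataImport\\ImportAddress.dit,\nC:\\DataImport\\ImportAddressType.dit"
commands = build_import_commands(files_txt, "localhost", "AdventureWorks2019", "sa", "secret")
for cmd in commands:
    print(cmd)
```

Each resulting string could then be handed to `subprocess.run` or written back out as a .bat file, which keeps the template list and the connection details in one place.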
Tags [automatically import data from csv](https://blog.devart.com/tag/automatically-import-data-from-csv) [csv import](https://blog.devart.com/tag/csv-import) [data pump](https://blog.devart.com/tag/data-pump) [import multiple CSV files](https://blog.devart.com/tag/import-multiple-csv-files) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-bulk-edit-shopify-product-data-in-excel.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [Products](https://blog.devart.com/category/products) [Excel Add-ins](https://blog.devart.com/category/products/excel-addins) How to Bulk Edit Shopify Product Data in Excel By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) March 25, 2024 [0](https://blog.devart.com/how-to-bulk-edit-shopify-product-data-in-excel.html#respond) 1479 Efficient management of Shopify product data is paramount for online retailers aiming to maintain data consistency and enhance operational efficiency. Bulk editing emerges as a pivotal solution, offering substantial time savings and uniformity across product listings. [Excel Add-ins for editing Shopify](https://www.devart.com/excel-addins/shopify/) product names, descriptions, meta fields, and tags provide significant advantages over the native Bulk Editor App. This approach simplifies the data editing process and integrates seamlessly with familiar spreadsheet functionalities, thus augmenting data manipulation capabilities. Using Excel Add-ins facilitates enhanced flexibility, advanced data processing features, and the ability to handle complex data sets effectively, empowering users to manage their Shopify product data more proficiently. [Listen to the Devart podcast](https://blog.devart.com/wp-content/uploads/2024/07/ElevenLabs_2024-07-04T07_28_12_How-to-Bulk-Edit-Shopify-Product-Data-in-Excel.mp3) to learn how to bulk edit Shopify product data in Excel . Table of Contents What is Bulk Editing in Shopify?
Requirements What Are Excel Add-Ins? Install and Configure Excel Add-Ins for Shopify Import Products Data From Your Shopify Store Bulk Updating Shopify Product Names in Excel How to Change Shopify SEO Title and Meta Description in Bulk How to Bulk Edit Product Tags on Shopify How To Mass Edit Product Descriptions More Advantages of Excel Add-Ins for Your Business Conclusion What is Bulk Editing in Shopify? Bulk editing in [Shopify](https://www.shopify.com/) refers to modifying multiple product details simultaneously within the Shopify interface. This functionality enables store owners to efficiently update product attributes such as prices, inventory levels, and descriptions across a range of products without needing individual adjustments. Bulk editing proves essential for store management by streamlining operations, reducing the time and effort required for inventory management, and ensuring consistency across product listings. Through bulk editing, store owners can swiftly implement changes in response to market demands, promotional activities, or inventory adjustments, thus enhancing operational efficiency and responsiveness to the competitive retail environment. This feature supports the dynamic nature of e-commerce by facilitating quick and effective store updates, which is crucial for maintaining accurate and current product information on the Shopify platform. Requirements An active Shopify store account with administrative access to manage and edit product listings. Installation of [Devart Excel Add-ins for Shopify](https://www.devart.com/excel-addins/shopify/) , ensuring compatibility with the version of Excel in use. API permissions enabled within the Shopify account to allow Devart Excel Add-ins to interact with Shopify data. What Are Excel Add-Ins?
[Excel Add-Ins](https://www.devart.com/excel-addins/) are essential tools for enhancing the functionality of [Microsoft Excel](https://www.microsoft.com/en-us/microsoft-365/excel) by integrating with various external data sources and services. These add-ins enable users to extend Excel’s native capabilities, allowing for direct interaction with databases, web services, and other platforms directly from within Excel. The primary utility of Excel Add-ins lies in their ability to streamline workflows, automate data processing tasks, and bring external data into Excel for advanced analysis, reporting, and decision-making. By leveraging Excel Add-ins, users can significantly reduce manual data entry, improve data accuracy, and unlock Excel’s powerful data analysis and visualization features. Among the available Excel Add-ins, the Devart Excel Add-in for Shopify distinguishes itself by targeting the needs of Shopify store owners and e-commerce managers. This add-in simplifies the management of Shopify product data by enabling direct connections to the Shopify store from within Excel. It allows efficient data import, editing, and synchronization, treating Shopify data as native Excel worksheets. This capability enhances data manipulation and analysis and streamlines the entire process of managing e-commerce data. Key functionalities of this Add-in tool include secure authentication via Access Token, convenient storage of connection details within Excel for future use, and Visual Query Builder and SQL for precise data retrieval. Additionally, the add-in’s instant data refresh feature ensures that the users always have access to the latest Shopify data, facilitating timely and informed decision-making. The Excel Add-in for Shopify thus represents a comprehensive solution for e-commerce data management, combining the ease of use of Excel with the specific requirements of Shopify store operations. 
Install and Configure Excel Add-Ins for Shopify Follow these detailed instructions to install and configure the Excel Add-Ins for Shopify, ensuring seamless integration with your Shopify store. This guide ensures that users can efficiently manage their Shopify data through Excel, leveraging the capabilities of the Devart Excel Add-in for Shopify. Installation of Excel Add-Ins for Shopify Acquire the Devart Excel Add-in for Shopify : Begin by [downloading the Excel Add-in](https://www.devart.com/excel-addins/shopify/download.html) from Devart’s official website. Ensure compatibility with your version of Microsoft Excel. Install the Add-in : Run the installation package and follow the on-screen instructions to install the Excel Add-in on your computer. Launch Excel : Open Microsoft Excel. The Devart Excel Add-in should now be available under the ‘Add-Ins’ tab in the Excel Ribbon. Configuring Excel Add-Ins for Shopify Access Token Authentication Enable Custom App Support in Shopify: Navigate to your Shopify admin dashboard. Select ‘Settings’ and proceed to ‘Apps and sales channels’. Enable custom app development by clicking ‘Develop apps’. Create an app by entering the required details, including the app name and developer. Configure Admin API Scopes by selecting all scopes and confirming changes. Install the app and navigate to the ‘API credentials’ tab to copy your Admin API Access Token. Connect Excel to Shopify : Click the Devart Excel Add-in tab in Excel, and select ‘Shopify’ from the connection options. Enter your Shopify store’s full domain name (xxxxx.myshopify.com) and the copied Access Token. Configure Excel Add-in : In Excel, choose ‘Shopify’ under the Devart Add-in tab for a new connection. Input your Shopify store’s domain and the Access Token obtained from Shopify. After configuring the Excel Add-in, test the connection to ensure it is successfully established with your Shopify store.
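If you want to sanity-check the Access Token outside Excel, the same credential can be exercised from a short script. The sketch below only prepares the HTTP request for the Shopify Admin REST API and does not send it; the store domain, token, and API version are placeholders, and the endpoint path follows the public Admin API convention rather than anything specific to the Excel Add-in.

```python
# Prepare (but do not send) a Shopify Admin REST API request that uses the
# same Access Token authentication as the Excel Add-in configuration above.
# SHOP, ACCESS_TOKEN, and API_VERSION are placeholder values.
import urllib.request

SHOP = "xxxxx.myshopify.com"      # your store's full domain name
ACCESS_TOKEN = "shpat_xxxxxxxx"   # Admin API Access Token from the custom app
API_VERSION = "2024-01"           # assumed Admin API version

def products_request(shop, token, api_version):
    """Build a GET request for the store's products, authenticated
    via the X-Shopify-Access-Token header."""
    url = f"https://{shop}/admin/api/{api_version}/products.json"
    return urllib.request.Request(url, headers={"X-Shopify-Access-Token": token})

req = products_request(SHOP, ACCESS_TOKEN, API_VERSION)
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` against a real store should return a JSON list of products if the token and scopes are configured correctly.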
This enables you to start managing your Shopify product data directly within Excel, utilizing the powerful features of the Excel Add-in for efficient data manipulation and analysis. By following these steps, you can install and configure the Excel Add-Ins for optimal use with Shopify, streamlining the process of managing your e-commerce data with the advanced capabilities of Excel. Import Products Data From Your Shopify Store Utilizing the Devart Excel Add-in for Shopify provides a streamlined method to import, edit, and push updates directly from Excel to effectively manage and update your Shopify store’s product data. This section guides you through retrieving real product data from your Shopify store into Excel, organizing it for editing, and updating your store with the changes. Before initiating the data import process, ensure you are in ‘Edit Mode’ within Excel. This mode is crucial for enabling data manipulation and ensuring the Excel Add-in functions correctly during the import and subsequent data pushback to Shopify. How to Import Your Shopify Store’s Product Data Access the Devart Excel Add-in : Navigate to the ‘Devart’ tab in Excel and connect to Shopify. Import Data : Select the option to import data within the add-in interface. Choose ‘Products’ or any other relevant entity you wish to manage. Utilize the Visual Query Builder or SQL option to select the data fields you need precisely. Execute Import : After configuring your data selection criteria, execute the import. Your Shopify store’s product data will populate the Excel worksheet and be ready for editing. Tips for Organizing Your Data for Easier Editing Use Filters : Apply Excel’s filter functionality to columns to easily navigate through and sort your data. Conditional Formatting : Highlight data that requires attention or meets certain criteria to identify outliers or errors quickly. 
Utilize Tables : Convert your data range into an Excel table for enhanced data management features and easier reference. Following these steps, you can efficiently manage your Shopify store’s product data using Excel. The Devart Excel Add-in for Shopify simplifies the data import and export process, making it an invaluable tool for Shopify store owners seeking to optimize their product management workflow. Bulk Updating Shopify Product Names in Excel Bulk updating product names in your Shopify store through Excel streamlines the process, ensuring consistency and efficiency. Devart’s Excel Add-in for Shopify simplifies this task, allowing seamless integration and update capabilities directly from Excel. Follow this detailed process to update product names in bulk and push the results back to your Shopify store. Before you begin updating the product names in your worksheet, it is crucial to choose ‘Edit Mode’. This step ensures that your modifications are correctly captured and can be saved to Shopify. Once in ‘Edit Mode’, navigate to the “Title” column containing the product names. Begin updating the names as required for accuracy and relevance. Consider utilizing Excel’s “Find and Replace” feature to maintain efficiency throughout this process. This tool enables consistent changes across multiple products, streamlining the update process and ensuring uniformity in your product catalog. Pushing Results into Shopify Once you have updated the product names in Excel: Commit Updates to Shopify : Select “Commit” to push or save your updated data back to Shopify. Confirm the action to initiate the update process. Verify Updates on Shopify : Review the product names to ensure all changes have been accurately reflected. This verification step is crucial to catch any issues that might have occurred during the push. By following these steps and utilizing the Devart Excel Add-in for Shopify, you can efficiently and accurately update product names in bulk.
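The “Find and Replace” pass described above can be expressed in a few lines of code as well. This is an illustrative sketch with invented sample rows, where each dictionary stands in for a worksheet row and the Title column holds the product name:

```python
# A plain-Python equivalent of the Excel "Find and Replace" pass over the
# Title column. The sample rows are invented for illustration.

rows = [
    {"Id": 1, "Title": "Mens T-Shirt Blue"},
    {"Id": 2, "Title": "Mens T-Shirt Red"},
    {"Id": 3, "Title": "Womens Hoodie"},
]

def bulk_rename(rows, find, replace):
    """Apply a consistent find-and-replace to every product title."""
    for row in rows:
        row["Title"] = row["Title"].replace(find, replace)
    return rows

bulk_rename(rows, "Mens", "Men's")
print([r["Title"] for r in rows])
# → ["Men's T-Shirt Blue", "Men's T-Shirt Red", "Womens Hoodie"]
```

As in Excel, the replacement is applied uniformly, so it pays to check that the search string does not accidentally match titles you did not intend to change.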
This process not only saves time but also enhances the consistency and professionalism of your Shopify store. How to Change Shopify SEO Title and Meta Description in Bulk Meta tags, specifically the SEO title and meta description, are crucial in optimizing a Shopify store for search engines. The SEO title is the headline that appears in search engine results, acting as the webpage’s title. The meta description briefly summarizes the page content, offering search engines and potential visitors insight into what the page is about. Both elements are vital for improving visibility, click-through rates, and the overall SEO performance of a Shopify store. To bulk-change the SEO title and meta description for Shopify products, the process mirrors updating product names, focusing on the ProductMetafields data. Follow these steps to efficiently update your Shopify store’s SEO elements: Import ProductMetafields Data Initiate the process by importing data from ProductMetafields into Excel. Specifically, choose the fields ‘Value’ for the SEO title and ‘Description’ for the meta description. This action ensures that you have access to each product’s current SEO settings. Enter Edit Mode Before making any changes, ensure you are in ‘Edit Mode’ to enable data manipulation within Excel. This mode allows the changes to be saved and pushed back to Shopify. Update SEO Titles and Descriptions : Navigate to the ‘Value’ and ‘Description’ columns corresponding to the SEO title and meta description. Update these fields as needed, considering the importance of keyword optimization. Keyword optimization is a critical component when updating SEO titles and descriptions. Incorporate relevant keywords naturally into both fields to improve your store’s search engine ranking for those terms. However, ensure the title and description remain clear, compelling, and descriptive, reflecting the product accurately while enticing potential customers.
Following this method, you can efficiently update your Shopify products’ SEO titles and meta descriptions in bulk. This approach saves time and significantly enhances your store’s SEO strategy, potentially increasing traffic and sales. How to Bulk Edit Product Tags on Shopify Product tags in Shopify serve as a crucial organizational tool, allowing store owners to categorize, filter, and manage their inventory efficiently. Tags help streamline the search and discovery process on the storefront, enabling customers to find products more easily. By accurately tagging products, stores can enhance user experience, improve site navigation, and potentially increase sales. Tags also play a role in SEO, as they help create more pathways for search engine crawlers, improving the store’s visibility online. To bulk edit product tags in Shopify, follow a procedure similar to the one you would use for editing product names. However, in this case, focus on the column dedicated to “Tags”. Here’s how to accomplish this: Prepare Your Data : Import your Shopify product data into Excel, ensuring that the “Tags” column is included in your dataset. This step will display the current tags assigned to each product. Enter Edit Mode : It’s essential to switch to ‘Edit Mode’ in Excel. This mode lets you make changes that can be recognized and saved by Devart’s Excel Add-in for Shopify. Update Tags : Navigate to the “Tags” column in your worksheet. Begin updating the tags as necessary, adding new tags, or modifying existing ones to better suit your products and categorization strategy. Remember, tags should be concise, relevant, and consistent across your product range to maximize their utility. Push Updates to Shopify : Once you’ve finalized the changes to your product tags, use the Excel Add-in for Shopify to push these updates back to your Shopify store. Be sure to review your changes for accuracy before confirming the update.
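Because Shopify keeps a product’s tags as a single comma-separated string, a quick normalization pass before committing helps keep tags concise and consistent across the catalog. The following sketch uses invented sample data:

```python
# Tidy the "Tags" column before pushing it back: trim whitespace, lowercase,
# drop duplicates while preserving order, and optionally append new tags.
# The sample tag string below is invented for illustration.

def normalize_tags(tag_string, extra=()):
    """Normalize a Shopify-style comma-separated tag string."""
    candidates = [t.strip().lower() for t in tag_string.split(",")] + list(extra)
    seen, result = set(), []
    for tag in candidates:
        if tag and tag not in seen:
            seen.add(tag)
            result.append(tag)
    return ", ".join(result)

print(normalize_tags("Summer, sale,  SALE , shirts", extra=["new-arrival"]))
# → "summer, sale, shirts, new-arrival"
```

Applied to every cell of the “Tags” column, a pass like this catches the stray spacing and casing differences that are easy to miss when editing by hand.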
The careful management and optimization of product tags are essential for maintaining an organized Shopify store that offers a seamless shopping experience. Bulk editing tags through Excel streamlines this process, making it more manageable to maintain consistency and relevancy in your product categorization efforts. How To Mass Edit Shopify Product Descriptions in Excel Crafting compelling product descriptions is vital for e-commerce success. These descriptions not only inform customers about the product but also entice them to make a purchase. Effective product descriptions should highlight the benefits and features of the product, use persuasive language, and be SEO-friendly to attract both customers and search engines. To mass edit Shopify product descriptions in Excel efficiently, the process closely mirrors updating product names, focusing instead on the “BodyHTML” column, which contains the HTML content for product descriptions. Here’s how to execute this: Begin by importing your Shopify product data into Excel, ensuring the “BodyHTML” column, which houses product descriptions, is included. This action loads your current product descriptions into Excel for editing. Before making any edits, ensure Excel is in ‘Edit Mode’. This mode is crucial for enabling changes and ensuring they can be saved and updated in Shopify. Navigate to the “BodyHTML” column. Here, you will update the descriptions as needed. Use this opportunity to apply strategies for crafting compelling content. Focus on making descriptions informative, engaging, and rich with keywords without compromising natural language for SEO purposes. After updating the product descriptions in the “BodyHTML” column, utilize the Excel Add-in for Shopify to push these changes back to your Shopify store. Review your updates for accuracy and completeness before finalizing the push. Tips for Effective Editing Consistency : Ensure a consistent tone and style across all product descriptions to maintain brand voice. 
Clarity and Conciseness : While being descriptive, keep your language clear and to the point to maintain the reader’s attention. Feature and Benefit Focus : Highlight key features of the product and how they benefit the customer. Use of Bullet Points : For easier readability, consider formatting key features or benefits in bullet points within the HTML content. By following these steps and employing effective content strategies, you can efficiently update multiple product descriptions in Excel, enhancing the appeal of your Shopify store’s product listings and potentially increasing customer engagement and sales. More Advantages of Excel Add-Ins for Your Business The advantages of Excel Add-Ins extend to overall business operations, offering: Streamlined Processes Automating data entry and updates through Excel reduces manual errors and saves time. Customized Reporting Businesses can create tailored reports to meet their specific needs, providing actionable insights for decision-makers. Scalability As businesses grow, Excel Add-Ins can easily accommodate increasing data volumes, ensuring scalability. In conclusion, Excel Add-Ins by Devart offer comprehensive benefits that extend well beyond Excel data management. By leveraging these tools for inventory management and sales analysis, businesses can optimize their operations, make informed decisions, and enhance their competitive edge in the market. Conclusion In summary, bulk editing in Shopify represents a vital strategy for efficient data management, directly impacting a store’s operational efficiency and customer satisfaction. The integration of Excel Add-Ins, particularly the Excel Add-in for Shopify, significantly enhances this process by offering advanced capabilities for inventory management, sales analysis, and overall business operations optimization. 
Businesses are encouraged to incorporate these powerful tools into their workflow to realize enhanced productivity, streamlined processes, and improved decision-making capabilities. Embracing these technologies can lead to substantial gains in managing your Shopify store more effectively. Tags [excel](https://blog.devart.com/tag/excel) [shopify](https://blog.devart.com/tag/shopify) [Shopify Excel integration](https://blog.devart.com/tag/shopify-excel-integration) [Shopify integration](https://blog.devart.com/tag/shopify-integration) [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations. "} {"url": "https://blog.devart.com/how-to-bulk-import-connections-using-dbforge-studio-for-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) Import/Export Connections Using dbForge Studio for SQL Server By [dbForge Team](https://blog.devart.com/author/dbforge) October 11, 2022 [0](https://blog.devart.com/how-to-bulk-import-connections-using-dbforge-studio-for-sql-server.html#respond) 3257 If you need to work with multiple connections, you can easily import them into dbForge products with just a little preparation. Let us show you how it’s done with [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) , one of our flagship solutions that helps you perform dozens of diverse database development and management tasks from a single IDE. The import process consists of nine steps: 1. Create a file with connection properties from a template 2. Use the Import and Export Settings Wizard to import the connections 3. Select the “Import selected environment settings” option 4. Save your current settings, just in case 5. Choose settings to import 6. Select the file that contains the connection properties 7. Check everything and load the file 8. Finish the import 9. Verify the connections in the Database Explorer Without further ado, let’s get started! 1. Create a file with connection properties from a template Your preliminary task is to create a file with the .settings extension that contains conventional XML with all the connections that you want to import. You can do it using any text editor such as Notepad++.
The result must look similar to what is shown in the screenshot below, where each block represents a connection. There can be as many connections as you like. The file uses the following variables:

- Connections: states the connection settings of the server.
- Database: the GUID (Globally Unique Identifier) of the database, a unique binary number in the network.
- ConnectionString: the connection string with the parameters required to connect to the server.
- Name: the connection name that you use when you work with databases in SQL.

2. Use the Import and Export Settings Wizard to import the connections

Open dbForge Studio for SQL Server and go to the Tools menu > Import and Export Settings. The Import and Export Settings Wizard opens.

3. Select the “Import selected environment settings” option

Select Import selected environment settings and click Next.

4. Back up your current settings

If you have any previously configured settings, on the Save Current Settings page, select either to overwrite your current settings or to save them to a file. Then click Next.

5. Choose settings to import

On the Choose settings to import page, click Browse.

6. Select the file that contains the connection properties

Select the XML file with connections that you prepared during Step 1. Then click Open.

7. Check everything and load the file

Provided that the file has been assembled properly, the wizard will show the following. Click Execute.

8. Finish the import

When the import is complete, click Finish.

9. Verify the connections in the Database Explorer

Your new connections will now be displayed in the Database Explorer of dbForge Studio, and you are all set to work with them. Similarly, you can export your connection settings via the same Tools > Import and Export Settings menu.
Only this time you need to select Export selected environment settings, specify the settings that you want to export, provide a path and a name for the output file, and, finally, click Execute. Fast and easy. The exported file is a good way to share SQL Server connections with your teammates, for instance, if your team needs the same connection settings for test servers. In this case, you can easily place this file in a shared folder or commit it to a Git repo. It’s also helpful when you need to [connect Java to SQL Server](https://blog.devart.com/connect-to-sql-server-in-java.html) using predefined connection settings. Let us reiterate that this method works just as well for other [dbForge products](https://www.devart.com/dbforge/). As for [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), if you aren’t acquainted with it yet, it is a high-end alternative to SSMS, a comprehensive development environment that delivers a multitude of capabilities:

- Advanced SQL code completion, refactoring, and formatting
- Comparison and synchronization of database schemas and objects
- Table data comparison and deployment of changes
- Visual database design
- Visual query building
- Integrated version control
- T-SQL debugging
- Generation of realistic test data
- Generation of comprehensive database documentation
- Data export and import with up to 14 supported formats
- Database administration and security management
- …and much more!

You can check all of them firsthand during a free 30-day trial. Just [download the Studio](https://www.devart.com/dbforge/sql/studio/download.html) and give it a go!
Tags [connect to database](https://blog.devart.com/tag/connect-to-database) [dbforge](https://blog.devart.com/tag/dbforge) [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge)

"} {"url": "https://blog.devart.com/how-to-check-mysql-version.html", "product_name": "Unknown", "content_type": "Blog",
"content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Check MySQL Version By [dbForge Team](https://blog.devart.com/author/dbforge) October 7, 2024

MySQL is an extremely popular open-source RDBMS, widely used by millions of companies and professionals. In this article, you will learn how to check your current MySQL version and how to update it if necessary.

Contents

- Why do you need to know your MySQL version?
- Different ways to get your MySQL version
- How to check MySQL version in Windows Terminal
- How to find MySQL version from the command-line client
- SSH for checking MySQL version
- MySQL SHOW VARIABLES LIKE query
- MySQL SELECT VERSION command
- MySQL STATUS command
- How to determine MySQL version using dbForge Studio for MySQL
- How to check MySQL version in phpMyAdmin
- How to check MySQL version in Workbench
- How to check MySQL version in XAMPP
- How to upgrade MySQL to the latest version
- Conclusion

Why do you need to know your MySQL version?

In some situations, it is critical to know the current MySQL version, as particular features might not be compatible with your system. Installing the right MySQL version also reduces the risk of system failures, and getting a new version brings new features and better capabilities. Now, let’s see how to check whether your MySQL server already has the latest version and how to upgrade it if it doesn’t.

Different ways to get your MySQL version

Now that you know why it is important to keep your database version up to date, you might ask how to get the newest MySQL version. The good news is, it’s not that difficult. We’ll offer several ways and provide queries to check your current version.
So this is how you can get your MySQL version:

- From the command line
- Using your MySQL client
- With the help of dbForge Studio for MySQL
- From the phpMyAdmin interface
- Using Workbench
- Via XAMPP

How to check MySQL version in Windows Terminal

One of the easiest ways to check the version of your local MySQL server from the command line on [Windows](https://blog.devart.com/how-to-install-mysql-on-windows-using-mysql-installer.html) is the following command, which works not only on Windows but also on [macOS](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-macos/), [Linux](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-linux/), [Ubuntu](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-ubuntu/), and [Debian](https://www.devart.com/dbforge/mysql/install-mysql-on-debian/):

mysql -V

By the way, if you’re new to MySQL and database setup, it’s important to understand [how to install MySQL on Debian](https://www.devart.com/dbforge/mysql/install-mysql-on-debian/) and other systems to get started with building your database environment.

How to find MySQL version from the command-line client

Just open your [MySQL client](https://blog.devart.com/mysql-command-line-client.html), and the information about your current MySQL version will be available straight away. There are also several other ways to find out the MySQL server version from the command line:

- Using SSH
- With the help of the SHOW VARIABLES LIKE query
- Using the SELECT VERSION command
- With the MySQL STATUS command

Let’s take a closer look at each way of checking the MySQL version.

SSH for checking MySQL version

You can easily use Secure Shell to check your MySQL version.
Log in to the server via [SSH](https://blog.devart.com/connecting-to-mysql-with-putty-and-ssh-tunnels.html) and enter the following command to get information about your current version:

select @@version;

MySQL SHOW VARIABLES LIKE query

Another way to show the MySQL version information is with the help of a SHOW VARIABLES LIKE statement. In MySQL Command Line Client, enter the following command:

SHOW VARIABLES LIKE 'version';

The MySQL version will be shown instantly.

MySQL SELECT VERSION command

The MySQL client also lets you get the version info by running the SELECT VERSION() command against the MySQL database. Here is the syntax of the MySQL SELECT VERSION query:

SELECT VERSION();

Don’t forget to use semicolons as a statement delimiter when working with the MySQL client.

MySQL STATUS command

You can also view your current MySQL version with the STATUS command:

STATUS;

The output includes the version along with other status details, such as uptime, threads, and much more.

How to determine MySQL version using dbForge Studio for MySQL

If you feel like getting MySQL version information by running commands from the command line is not your cup of tea, you might try checking the current MySQL version using dbForge Studio for MySQL. It is one of the best tools for database management, administration, and development. You can quickly and easily check the current MySQL version when working in the dbForge [MySQL GUI](https://www.devart.com/dbforge/mysql/studio/) tool. In fact, there are two ways to do this. First, you get information about the MySQL server version when customizing connection settings. In the Database Connection Properties window, enter the connection settings and click Test Connection. If you need to find out your MySQL version after you’ve connected, right-click the connection name in Database Explorer and select Properties.
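If you need these checks in a script rather than an interactive session, a small shell sketch like the one below can wrap them. It assumes only that the mysql client may or may not be on PATH; the commented invocation with server credentials is just one possible variant (the -e flag runs a statement non-interactively):

```shell
# Print the MySQL client version if the client is available on PATH.
if command -v mysql >/dev/null 2>&1; then
  mysql -V
  # With server credentials you could also run, e.g.:
  #   mysql -u root -p -e "SELECT VERSION();"
else
  echo "mysql client not found on PATH"
fi
```

Either branch prints one line, so the snippet is safe to drop into larger provisioning scripts.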
To learn more about how to [connect to MySQL server](https://blog.devart.com/remote-connection-to-mysql-server.html) remotely, please refer to our blog post.

How to check MySQL version in phpMyAdmin

phpMyAdmin provides a convenient, user-friendly interface for database management. If you want to know your MySQL version in phpMyAdmin, have a look at the information listed under the Database Server section. You will find the MySQL version there and will be able to update it if necessary. The previously mentioned dbForge Studio for MySQL works perfectly as [a versatile alternative to phpMyAdmin](https://www.devart.com/dbforge/mysql/studio/alternative-to-phpmyadmin.html).

How to check MySQL version in Workbench

To check the MySQL version in Workbench, follow a few simple steps. Open Workbench, choose your database server in the main menu, and then click Server Status. All the information regarding your server version is listed in this window. This is how you can check whether you have the latest MySQL version and update it if it’s outdated. Besides Workbench, there is another convenient and feature-rich tool to develop and manage MySQL databases: [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/alternative-to-mysql-workbench.html).

How to check MySQL version in XAMPP

To check the MySQL version in XAMPP, open Windows Command Prompt, navigate to the folder where XAMPP is installed, and run the following command:

mysql -V

Another way to check your current MySQL version in XAMPP is to open the readme_en.txt file in your XAMPP installation folder. There, you will see the MySQL version number.

How to upgrade MySQL to the latest version

After you have found out your MySQL version, a question inevitably arises: what is the latest version of MySQL? The current latest stable version of MySQL is 9.0.
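When you automate such a check, you can compare the detected version against a required minimum with sort -V, which orders version strings numerically. The version values below are illustrative, and the commented mysql invocation is just one possible way to capture the current version:

```shell
required="8.0.0"
current="9.0.1"   # e.g. captured via: mysql -N -e "SELECT VERSION();"

# sort -V sorts version strings numerically; if the smallest of the two
# is $required, then $current is greater than or equal to $required.
if [ "$(printf '%s\n' "$required" "$current" | sort -V | head -n1)" = "$required" ]; then
  echo "MySQL $current meets the minimum $required"
else
  echo "MySQL $current is older than $required"
fi
```

This avoids fragile string comparisons such as "9.0.1" < "10.0.0", which plain lexicographic ordering gets wrong.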
To ensure that your database is running on the latest MySQL version, you can follow the MySQL community, where you can download the new version from the latest version list. The MySQL version history is really rich, and every new release gives users more room to work faster and more efficiently. After checking your current MySQL version, you might discover that it isn’t the latest one. In this case, you can upgrade your MySQL version by following our simple instructions. Later versions of MySQL might save you time because they get auto-updated; in that case, you don’t need to do anything. If you are determined to take the process under your control and do everything manually, follow the steps below. Before starting, don’t forget to [back up your MySQL database](https://blog.devart.com/mysql-backup-tutorial.html) to be sure your data is safe.

1. Upgrade MySQL using the command line, just like you did while checking the current version: after installing the new server version, run mysql_upgrade to update the system tables (in MySQL 8.0.16 and later, the server performs this step automatically at startup):

mysql_upgrade -u root -p --force

2. Upgrading your MySQL version using cPanel provides you with deeper root access, as it comes with Web Host Manager. To upgrade your current MySQL version, access WHM and navigate to Software > MySQL Upgrade. Select the MySQL version you’d like to upgrade to and click Next.

3. To upgrade the MySQL version on Linux/Ubuntu, use your SSH credentials, and in the MySQL APT Repository, run the following command:

sudo apt-get update

The package list will be updated. Then upgrade MySQL using either

sudo apt-get upgrade mysql-server

or

sudo apt-get install mysql-server

4. If you need to upgrade the [MySQL version on macOS](https://www.devart.com/dbforge/mysql/how-to-install-mysql-on-macos/), you might notice that the upgrade process is quite similar to the one on Linux, as it uses Secure Shell. You will also need a package manager, e.g., Homebrew. To upgrade your MySQL version on macOS, log in to the Terminal program using your SSH credentials.
After that, launch Homebrew and run the following commands:

brew update
brew install mysql

This will install MySQL 9.0. If you experience problems while upgrading MySQL, and to avoid a version mismatch, try uninstalling the old version first by executing the following commands:

brew remove mysql
brew cleanup

When the upgrade is complete, test your system by running this command:

mysql.server start

5. If you intend to upgrade MySQL on Windows, note that SSH is not available there by default. However, you can use PuTTY, an SSH and telnet client, to input your SSH credentials. The other steps are similar to the ones you performed for upgrading MySQL on Linux/Ubuntu. Alternatively, you can use MySQL Installer; in this case, you don’t need an SSH connection.

Conclusion

In this article, we have provided a detailed guide on how to check your MySQL version in 6 simple ways. These instructions can be of great use to all users who work with MySQL and want their database management system to be as efficient and relevant as possible. We have also discussed several ways to upgrade MySQL on different operating systems. We invite you to test-drive one of the [best MySQL admin tools](https://www.devart.com/dbforge/mysql/studio/) on the market: dbForge Studio for MySQL. It is a universal all-in-one tool that incorporates all the essentials needed for effective database development, management, and [administration](https://www.devart.com/dbforge/mysql/studio/database-administration.html). Download a fully functional 30-day trial from our website and evaluate all the advanced features the Studio delivers.
Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [dbForge Team](https://blog.devart.com/author/dbforge)

"} {"url": "https://blog.devart.com/how-to-check-oracle-version.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) How to Check the Oracle Version By [Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) February 20, 2024 Knowing the version of your Oracle Database is
important. It helps you evaluate support options, apply necessary updates for security and stability, and take advantage of new features to meet changing business needs. What’s more, database vulnerabilities can vary between versions, so it’s better to stay up to date to avoid any issues. Being aware of the version can also ensure compatibility with applications and tools, making it easier to work with the database. In this article, we’ll review various methods to check the version of an Oracle database, using a range of techniques, including commands, tools, utilities, and command-line operations, to suit your preferences. In Oracle, a version is a release of the database (for example, 19.3) and is available in several editions, such as Express or Standard.

Contents

- Why do you need to know your Oracle version?
- Checking the installation of the Oracle client from the command line
- Easiest methods to check a version: Using SQL*Plus, Querying the V$INSTANCE view, Using dbForge Studio for Oracle
- Intermediate methods to check a version: Using Enterprise Manager, Checking the Oracle alert log, Using DBMS_UTILITY.DB_VERSION
- Advanced methods to check a version: Using PL/SQL, Using the oraversion utility

Why do you need to know your Oracle version?

First, let us discuss a few reasons why it is important to know the Oracle version you use:

- Security: New versions often come with updates that fix security issues and protect against cyberattacks. Keeping up with these updates helps ensure your database remains secure from potential threats.
- Performance: Newer versions can make your database run faster and more efficiently. They might have features that help use resources better and reduce delays.
- Feature enhancements: Each new version of Oracle Database brings new tools and functions that can make your work easier and more productive.
- Compatibility: Staying up to date with your Oracle version ensures it works well with the latest software and tools so you can easily connect it to other programs and services.
- Support and maintenance: Oracle provides support and fixes for its database software. But as versions get older, they might stop getting updates. By using newer versions, you can get help from Oracle when needed.

So, knowing your Oracle Database version is important to keep your data safe, make your database work better, access new tools, and guarantee it plays well with other software.

Checking the installation of the Oracle client from the command line

Before we start, verify that the Oracle client is installed on the machine. If you use Unix/Linux, open a terminal window; for Windows, open the Command Prompt. Then, execute the sqlplus command. If it is recognized, the Oracle client is installed on your machine. If the command is not recognized, verify whether the ORACLE_HOME environment variable is set:

- Windows: Navigate to System > Advanced System Settings. In the System Properties dialog, click Environment Variables and check for ORACLE_HOME.
- Linux/Unix: Open the terminal and type echo $ORACLE_HOME.

If the ORACLE_HOME variable is not found, download and install the Oracle client from the official website. To proceed, let us explore the ways to check an Oracle database version.

Easiest methods to check a version

Now, let’s view some methods to get information about the Oracle database version. You can run the commands discussed below using the SQL*Plus utility, Command Prompt, dbForge Studio for Oracle, or any other SQL client tool.

Using SQL*Plus

SQL*Plus is a command-line utility introduced by Oracle Corporation to work with Oracle databases. It is commonly used for tasks such as querying database tables, creating and modifying database objects, and managing database users and privileges.
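The client check described above can be sketched as a short shell snippet for Linux/Unix. It assumes nothing beyond a POSIX shell and simply reports what it finds, so it is safe to run on any machine:

```shell
# Report whether the Oracle client is reachable, mirroring the manual check.
if command -v sqlplus >/dev/null 2>&1; then
  echo "Oracle client found: $(command -v sqlplus)"
else
  echo "sqlplus not found; ORACLE_HOME=${ORACLE_HOME:-<not set>}"
fi
```

If the second branch fires and ORACLE_HOME is shown as not set, install the client as described above before continuing.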
To view the Oracle version, query the following view:

SELECT * FROM v$version;

The query retrieves information about the Oracle database version you are currently connected to. The output includes the Oracle database version number, release, edition, and other relevant information. In the context of the query, the banner columns display the following information:

- BANNER shows the edition and basic information about the Oracle Database version.
- BANNER_FULL shows the full release information.
- BANNER_LEGACY may show an older or legacy version banner format, which might be used for compatibility or historical reasons. It could include information similar to the BANNER and BANNER_FULL columns but formatted differently.

Note that the server version is also displayed when you connect to the database using the SQL*Plus utility.

Querying the V$INSTANCE view

Another way to check the Oracle version involves querying the V$INSTANCE view, which returns information about the instance running on the database server. The query result displays the instance name, database version, start time of the instance, etc. Although the V$INSTANCE view does not typically include columns named version or version_full (especially in versions older than 18c), we can explicitly select them in the query to obtain version-related details. Note that some permissions and privileges may be required to query the V$INSTANCE view:

- The SELECT privilege on the V$INSTANCE view, or a role that grants the SELECT privilege on the view.
- Some columns in dynamic performance views may contain sensitive information, so access to them may be restricted, requiring additional privileges to query them.
The syntax of the view is as follows:

SELECT version, version_full FROM v$instance;

where version displays the version information in the format “major release.minor release.patch level”, and version_full usually includes details such as the Oracle Database edition (for example, Enterprise Edition), the specific release version (for example, 19c), the release number, and additional information about the production status. Let’s now execute this query in [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), an all-encompassing integrated development environment (IDE) for Oracle database development and management tasks. With a suite of robust features and tools, including SQL editing, visual design tools, data management functionality, debugging utilities, version control integration, security management, reporting tools, and a PL/SQL Formatter, it offers a complete solution for Oracle professionals. You can [download](https://www.devart.com/dbforge/oracle/studio/download.html) the Studio from the Devart website and [install](https://docs.devart.com/studio-for-oracle/getting-started/installing-guide.html) it on your machine. Then, open the Studio. On the toolbar, click New SQL to open a [SQL editor](https://www.devart.com/dbforge/oracle/studio/oracle-sql-editor.html). It helps you simplify, optimize, and autocomplete your code with the required elements without the need to memorize code details. In addition, you can improve the readability of your PL/SQL code using the smart [PL/SQL Formatter](https://www.devart.com/dbforge/oracle/studio/plsql-formatter.html), which formats and beautifies your code effortlessly. Once done, execute the query and see the result.

Using dbForge Studio for Oracle

You can quickly check the Oracle server version you are connected to when working with dbForge Studio for Oracle. Here are three ways to do this.

Method 1

Open the Studio. On the ribbon, select Database > New Connection.
In the Database Connection Properties dialog that opens, enter the connection details and click Test Connection. A pop-up window about the successful connection opens, displaying the version. You can also view the version for an already established connection. To do this, in Database Explorer, right-click the required connection and select Modify Connection. When the Database Connection Properties dialog appears, click Test Connection to see the version.

Method 2

On the toolbar, click New SQL. In the SQL document that opens, the server version is displayed at the bottom of the document. This information will also be shown for any SQL document you work in.

Method 3

If you want to know the version of the server you’ve already connected to, in Database Explorer, right-click the required connection and select Properties.

Intermediate methods to check a version

In this section, we’ll explore the following methods to get Oracle version information:

- Using Enterprise Manager
- Checking the Oracle alert log
- Using the DBMS_UTILITY.DB_VERSION procedure

Using Enterprise Manager

You can easily check the server version using Oracle Enterprise Manager, a web-based management tool for managing and monitoring Oracle software environments. Open a web browser and enter the URL of your Oracle Enterprise Manager Console. Then, log in with the appropriate credentials and navigate to the Targets menu. There, select the server for which you want to view version information. On the properties or general information page, you can find relevant details, including the Oracle Database version.

Checking the Oracle alert log

The alert log is a text file that records information about database events, as well as error messages and exceptions that occur. After each startup of the instance, new information is added to the alert_.log file. It is stored in the diagnostic directory of your Oracle database instance.
You can query the V$DIAG_INFO view to find the location of the database log file and read the server version. To do this, execute the following query:

SELECT * FROM V$DIAG_INFO;

The output displays the result, including the server version.

Using DBMS_UTILITY.DB_VERSION

DBMS_UTILITY.DB_VERSION offers an alternative to querying V$VERSION with basic SQL commands. To obtain the server version, execute the following block:

SET SERVEROUTPUT ON
DECLARE
  l_version       VARCHAR2(100);
  l_compatibility VARCHAR2(100);
BEGIN
  DBMS_UTILITY.db_version (version       => l_version,
                           compatibility => l_compatibility);
  DBMS_OUTPUT.put_line('Version: ' || l_version || ' Compatibility: ' || l_compatibility);
END;
/

The output returns the version of the server the session is connected to.

Advanced methods to check a version

Let’s now look at some advanced ways to determine a database version:

- Using PL/SQL
- Using the oraversion utility

Using PL/SQL

A PL/SQL block is a structured section of code that performs specific tasks. It usually consists of variable declarations, control structures (like loops and conditional statements), and executable statements. To retrieve the version, execute the following code within a PL/SQL block:

BEGIN
  DBMS_OUTPUT.put_line(
    DBMS_DB_VERSION.version
    || '.'
    || DBMS_DB_VERSION.release
  );
END;
/

The query returns the version (19) and release number (0) retrieved with the help of the DBMS_DB_VERSION package, which contains two constants specifying the version and release.

Using the oraversion utility

Starting with Oracle 18c, Oracle introduced the oraversion utility, a command-line tool that provides a quick way to get the database version. In the Command Prompt or terminal, execute the following command to display the version:

oraversion -baseVersion

Once done, the version of the server is displayed.

Conclusion

To sum up, there are multiple ways to quickly get a database version that can suit every taste.
This becomes particularly useful when your version turns out not to be the latest, prompting you to update the software regularly to mitigate compatibility and performance issues. To top it off, [download](https://www.devart.com/dbforge/oracle/studio/download.html) a trial version of dbForge Studio for Oracle, available for 30 days at no cost. Explore its outstanding features and [PL/SQL developer tools](https://www.devart.com/dbforge/oracle/studio/plsql-developer-tools.html) firsthand, experiencing the seamless execution of database operations within this versatile IDE.

Useful links: [How to Check MySQL Version](https://blog.devart.com/how-to-check-mysql-version.html)

Tags [check database version](https://blog.devart.com/tag/check-database-version) [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle Tutorial](https://blog.devart.com/tag/oracle-tutorial)

[Yuliia Vasylenko](https://blog.devart.com/author/julia-evans) Yuliia is a Technical Writer who creates articles and guides to help you get the most out of the dbForge tools. She enjoys explaining complex tech in useful ways.
"} {"url": "https://blog.devart.com/how-to-combine-data-from-several-sources-using-sql-and-virtualquery.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) How to Combine Data from Several Sources Using SQL and VirtualQuery By [DAC Team](https://blog.devart.com/author/dac) July 28, 2016 The VirtualQuery component allows executing SQL queries to sources that are not a database, but a TDataSet or any of its descendants.
It also allows you to connect to several data sources at a time and work with them through SQL queries as if they were a single data source. This enables work in a heterogeneous environment, where the results of queries against different sources are retrieved into a single dataset.

Querying different DBMSs

For example, let's combine the DEPT tables from two DBMSs, Oracle and PostgreSQL, within a single VirtualQuery. Content of the Oracle DEPT table:

| DEPTNO | DNAME | LOC |
| --- | --- | --- |
| 10 | ACCOUNTING | NEW YORK |
| 20 | RESEARCH | DALLAS |
| 30 | SALES | CHICAGO |
| 40 | OPERATIONS | BOSTON |

Content of the PostgreSQL DEPT table:

| DEPTNO | DNAME | LOC |
| --- | --- | --- |
| 10 | SOFTWARE GROUP | TORONTO |
| 20 | SUPPORT | BERLIN |
| 30 | CLOUD SERVICE | OSLO |

1. Place a TOraSession component onto the form and set the Oracle connection parameters using the Session Editor.
2. Add a TOraQuery component onto the form, set Name=OraQuery, and specify the following query in the OraQuery Editor: `Select * From Dept`
3. Place a TPgConnection component onto the form and set the PostgreSQL connection parameters using the Connection Editor.
4. Add a TPgQuery component onto the form, set Name=PgQuery, and specify the same query in the PgQuery Editor: `Select * From Dept`
5. Place a TVirtualQuery component onto the form and open its Source DataSet Editor from the component's shortcut menu.
6. In the editor that appears, add a description of the source for Oracle. Then, in a similar way, add a description of the source for PostgreSQL.

In this case, within the scope of VirtualQuery, we can call the OraQuery query by the code name OraDept, and PgQuery by PgDept. Moreover, for data source identification, in addition to the table code name, we can also use a schema code name (the SchemaName property). This allows a developer to refer to data sources flexibly, e.g., using the same table name with different schema names.
Let's open the prepared queries:

```pascal
OraQuery.Active := True;
PgQuery.Active := True;
```

In the VirtualQuery Editor, execute the following query:

```sql
Select DName, Loc
  From OraDept

Union

Select DName, Loc
  From PgDept

Order By DName
```

We will get the following result set:

Thus, using VirtualQuery, we have combined data from two different sources and gained the ability to work with this data via SQL. This can be quite useful, for example, when generating common reports that contain data from various databases.

Using custom data sources

VirtualQuery functionality is not limited to queries against various DBMSs. Its data source can be any information retrieved through a TDataSet descendant: XML documents, text files, device logs, and so on. It is no secret that the volumes of data to be processed grow constantly. Loggers, tracking systems, smart-home systems, various IoT devices, and other modern digital gadgets operate on huge amounts of data nowadays, and this data is often stored not in databases but as files of various formats in a distributed environment. It is often handy to process such information with regular SQL queries.

Let's consider an example of retrieving information about smartphone models as the following dataset:

- VendorName – smartphone vendor name
- ModelName – model name
- Specification – short technical characteristics

We'll obtain the required data from three different sources. Let the vendor list be an XML document, Vendor.xml. Load this document into a TVirtualTable component:

```pascal
VT := TVirtualTable.Create(nil);
VT.LoadFromFile('Vendor.xml');
```

The result set will look as follows:

| ID | Name |
| --- | --- |
| 10 | Samsung |
| 20 | Apple |
| 30 | Sony |
| 40 | Microsoft |

where ID is the primary key and Name is the vendor name. Next, let's prepare the model list using the TClientDataSet component.
Fill it in with the following data:

| ID | VendorID | ModelName |
| --- | --- | --- |
| 9800 | 10 | Galaxy S7 Edge |
| 9830 | 10 | Galaxy Note 5 |
| 1001 | 20 | iPhone 6 |
| 1356 | 20 | iPhone 6 Plus |
| 3582 | 40 | Lumia 950 XL Dual Sim |

where ID is the primary key, VendorID is a reference to the smartphone vendor, and ModelName is the model name. The code for filling in the ClientDataSet:

```pascal
CDS := TClientDataSet.Create(nil);
...
CDS.AppendRecord([9800, 10, 'Galaxy S7 Edge']);
CDS.AppendRecord([9830, 10, 'Galaxy Note 5']);
CDS.AppendRecord([1001, 20, 'iPhone 6']);
CDS.AppendRecord([1356, 20, 'iPhone 6 Plus']);
CDS.AppendRecord([3582, 40, 'Lumia 950 XL Dual Sim']);
```

Finally, we are going to take the tech specs of the smartphone models from a plain text file named NoteData.txt, which contains lines of the form:

Model ID Value, Model Specification Value

where Model ID Value is a reference to the smartphone model, and Model Specification Value is a description of the model's characteristics. The comma character serves as the field separator. Data from the text file will be presented as a dataset using the TVirtualDataSet component. For this, let's implement the OnGetRecordCount and OnGetFieldValue methods using the TStringList class. In OnGetRecordCount, we specify the number of rows contained in the TVirtualDataSet; in our case, it is defined by the number of elements in the TStringList. In OnGetFieldValue, we define how to fetch the data for each field of the TVirtualDataSet. Here, the RecNo parameter defines the current record number, and we only have to describe obtaining the necessary data via the output parameter (out Value: Variant).
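The parsing step behind OnGetFieldValue boils down to splitting each line of the text file on its first comma. A language-neutral sketch of that step in Python (the specification strings below are invented for illustration; the article's actual NoteData.txt contents are not shown):

```python
# Each line: "<model ID>, <specification>", with a comma as the field
# separator, mirroring what the TVirtualDataSet callbacks read per RecNo.
note_lines = [
    "9800, 5.5-inch display, octa-core CPU",
    "1001, 4.7-inch display, A8 chip",
]

def parse_note(line):
    # Split only on the first comma so a specification that itself
    # contains commas stays intact as a single field
    model_id, spec = line.split(",", 1)
    return int(model_id.strip()), spec.strip()

records = [parse_note(line) for line in note_lines]
```

Splitting on the first comma only is a deliberate choice here: the ID field can never contain a comma, while the free-text specification often does.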
Thus, we have prepared three different data sources:

- VT – vendor data
- CDS – model data
- VDS – smartphone specifications

Execute the following code to retrieve the resulting dataset:

```pascal
VirtualQuery := TVirtualQuery.Create(nil);
...
VT.Open;
CDS.Open;
VDS.Open;

VirtualQuery.SourceDataSets.Add(VT, '', 'Vendor');
VirtualQuery.SourceDataSets.Add(CDS, '', 'Model');
VirtualQuery.SourceDataSets.Add(VDS, '', 'Info');

VirtualQuery.SQL.Text := ' Select Vendor.Name As VendorName,' +
                         ' Model.ModelName, ' +
                         ' Info.Specification ' +
                         ' From Model ' +
                         ' LEFT JOIN Vendor ON Model.VendorID = Vendor.ID ' +
                         ' LEFT JOIN Info ON Model.ID = Info.ID ' +
                         ' Order By 1, 2 ';
VirtualQuery.Open;
```

As a result, we will get the following dataset:

Data editing with VirtualQuery

VirtualQuery helps a developer work with third-party datasets in an application through the unified functionality of Devart DAC products. For example, VirtualQuery can serve not only for reading data, as shown above, but also for modifying data in the sources linked to it. Let's consider an example of such behavior. When the Edit button is clicked in our project, the following code is executed:

```pascal
VirtualQuery.UpdatingTable := 'Model';
VirtualQuery.Edit;
VirtualQuery.FieldByName('ModelName').AsString :=
  VirtualQuery.FieldByName('VendorName').AsString + ' ' +
  VirtualQuery.FieldByName('ModelName').AsString;
VirtualQuery.Post;
```

Here, UpdatingTable specifies the name of the data source being edited, 'Model'. The modification of the ModelName field value is applied with calls to the Edit and Post methods.

Summary

The projects above demonstrate how to work with the [TVirtualQuery](https://www.devart.com/virtualdac/docs/devart.virtualdac.tvirtualquery.htm) component, including executing a SQL query against several data sources, as well as reading and writing data in prepared XML and text documents.
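As a cross-check, the three-way join used above is ordinary SQL and can be reproduced with any SQL engine. A minimal sketch with Python's built-in sqlite3, using the Vendor and Model data listed earlier (the Info specifications are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Vendor (ID INTEGER, Name TEXT)")
conn.execute("CREATE TABLE Model (ID INTEGER, VendorID INTEGER, ModelName TEXT)")
conn.execute("CREATE TABLE Info (ID INTEGER, Specification TEXT)")
conn.executemany("INSERT INTO Vendor VALUES (?, ?)",
                 [(10, "Samsung"), (20, "Apple"), (30, "Sony"), (40, "Microsoft")])
conn.executemany("INSERT INTO Model VALUES (?, ?, ?)", [
    (9800, 10, "Galaxy S7 Edge"), (9830, 10, "Galaxy Note 5"),
    (1001, 20, "iPhone 6"), (1356, 20, "iPhone 6 Plus"),
    (3582, 40, "Lumia 950 XL Dual Sim")])
# Hypothetical specifications; only two models get one
conn.executemany("INSERT INTO Info VALUES (?, ?)",
                 [(9800, "spec A"), (1001, "spec B")])

# Same shape as VirtualQuery.SQL.Text: the LEFT JOINs keep every model,
# even those without a matching Info row (Specification comes back NULL)
rows = conn.execute("""
    SELECT Vendor.Name, Model.ModelName, Info.Specification
    FROM Model
    LEFT JOIN Vendor ON Model.VendorID = Vendor.ID
    LEFT JOIN Info   ON Model.ID = Info.ID
    ORDER BY 1, 2
""").fetchall()
```

Note that Sony does not appear at all: the join starts from Model, and no model references vendor 30.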
The files «NoteData.txt» and «Vendor.xml» used here are available for download …. The source code of the projects is included in the Demos distributed with [VirtualDAC](https://www.devart.com/virtualdac/). Tags [delphi](https://blog.devart.com/tag/delphi) [rad studio](https://blog.devart.com/tag/rad-studio) [virtualdac](https://blog.devart.com/tag/virtualdac) [DAC Team](https://blog.devart.com/author/dac) "} {"url":
"https://blog.devart.com/how-to-compare-and-merge-source-code-in-visual-studio-2019.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) How to Compare and Merge Source Code in Visual Studio 2019 By [ALM Team](https://blog.devart.com/author/alm) August 22, 2019 [0](https://blog.devart.com/how-to-compare-and-merge-source-code-in-visual-studio-2019.html#respond) 6543 Summary: In this article, you will learn how to easily compare and merge source code using [Code Compare](https://www.devart.com/codecompare/) in Visual Studio 2019. We will provide examples demonstrating the basic capabilities of Code Compare and the tasks it helps you solve effectively. Comparing two revisions of heavily refactored code can be quite a challenge for anyone involved in programming: finding differences between two objects and detecting source code changes has to be done line by line, and it takes a while. Code Compare is an easy-to-use solution for comparing and merging two source code files. It allows developers to reduce the number of bugs when coding and enhances source code editing with advanced code comparison features, e.g.: Structural comparison mode, [Lexical comparison](https://www.devart.com/codecompare/lexicalcomparison.html) for major programming languages, Similar lines detection. You can use Code Compare as a standalone code diff tool and as a [Visual Studio extension](https://www.devart.com/codecompare/visual-studio-integration.html). It effortlessly integrates with multiple [version control systems](https://www.devart.com/codecompare/integration.html) and supports different programming languages. Integration with Visual Studio 2019 We continue to support integration with the modern code editors of Visual Studio 2019, making the code comparison process faster and more productive.
This is because all the new editor capabilities, such as commands, IntelliSense, and syntax highlighting, are fully operational, including third-party extensions. Asynchronous loading of our add-in became possible after removing deprecated API calls from the Code Compare extension for Visual Studio 2019, which makes Code Compare load faster on Visual Studio startup. Moreover, the list of languages available in Code Compare for Visual Studio now features such popular languages as JavaScript and TypeScript. When a comparison is opened, the programming language is detected automatically from the file extension. We have also addressed user reports about the incorrect opening of JSON files and the odd blinking of pop-up windows; JSON files are now handled as JavaScript. Structural Source Code Comparison Code Compare provides a wide range of possibilities for locating changes in source code with regard to its structure, e.g., matching methods and similar code lines, detecting moved blocks of code, and [much more](https://docs.devart.com/code-compare/file-comparison/structural-code-comparison.html). We have significantly improved structural comparison for the latest versions of the C# and VB languages. Having considered a number of comments on structural comparison, we decided to switch to the Roslyn compiler, which is the best choice when it comes to code analysis for C# and VB. We also improved the binding of comments and preprocessor directives to structural elements in the code. Now Code Compare supports structural identification of all the new C# and VB.NET statements that are important for structural comparison. Below you will find examples of the basic ones.
Moved and modified read-only property as an expression-bodied member. Moved using static statement. Moved and modified dictionary being initialized. When comparing source code, we recommend enabling the Ignore Line Break and Ignore Whitespaces options to omit nonexistent changes, and the Symbol to Symbol option to check for differences within lines of code efficiently. Quick Integration with TFS and Git Code Compare can be integrated into TFS automatically: just select Integrate with TFS Version Control during product installation, and Code Compare is set as the default compare and merge tool for TFS. If you use Git as a source code repository, you need to make just one change in the .gitconfig file; you can learn more on the [Code Compare Integration with GIT](https://www.devart.com/codecompare/integration_git.html) page. Conclusion Code Compare is a convenient Visual Studio add-in that lets you forget about any issues when comparing and merging source code. No matter what programming language you use, Code Compare takes its specific features into account. Moreover, the tool allows you to track and control your source code changes using popular version control systems. So [download](https://www.devart.com/codecompare/download.html), evaluate, and [order](https://www.devart.com/codecompare/ordering.html) Code Compare Pro today to discover the best way of managing your source code changes!
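As an aside, the effect of options like Ignore Whitespaces can be approximated with any plain text diff by normalizing both sides first. A rough Python sketch of the idea (this is an illustration only, not Code Compare's actual algorithm; the C-like sample lines are invented):

```python
import difflib

def normalized(lines):
    # Collapse every run of whitespace to a single space so purely
    # cosmetic edits (re-indentation, tabs vs spaces) compare equal
    return [" ".join(line.split()) for line in lines]

old = ["int Sum(int a,int b)", "{", "    return a+b;", "}"]
new = ["int Sum(int a, int b)", "{", "\treturn a+b;", "}"]

def changed_lines(a, b):
    # Keep only the +/- body lines of a unified diff, skipping the headers
    return [l for l in difflib.unified_diff(a, b, lineterm="")
            if l.startswith(("+", "-")) and not l.startswith(("+++", "---"))]

raw = changed_lines(old, new)                 # flags the re-indented line too
ignoring_ws = changed_lines(normalized(old),  # flags only the real edit
                            normalized(new))
```

The naive diff reports two changed lines; after normalization, only the genuinely edited function signature remains.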
Tags [Code Compare](https://blog.devart.com/tag/code-compare) [GIT](https://blog.devart.com/tag/git) [Parser](https://blog.devart.com/tag/parser) [Roslyn](https://blog.devart.com/tag/roslyn) [StructuralComparison](https://blog.devart.com/tag/structuralcomparison) [tfs](https://blog.devart.com/tag/tfs) [Visual Studio 2019](https://blog.devart.com/tag/visual-studio-2019) [С#](https://blog.devart.com/tag/%d1%81) [ALM Team](https://blog.devart.com/author/alm) "} {"url":
"https://blog.devart.com/how-to-compare-multiple-databases-from-the-command-line.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Automatically Synchronize Multiple Databases on Different SQL Server Instances By [dbForge Team](https://blog.devart.com/author/dbforge) June 3, 2021 [0](https://blog.devart.com/how-to-compare-multiple-databases-from-the-command-line.html#respond) 7919 In this article, we share a step-by-step guide on how to compare the schemas and data of multiple SQL Server databases from the command line. Both [dbForge Schema Compare for SQL Server](https://www.devart.com/dbforge/sql/schemacompare/) and [dbForge Data Compare for SQL Server](https://www.devart.com/dbforge/sql/datacompare/) support the command-line interface, which gives the user rich control over the tools and allows automating and scheduling regular database comparison and synchronization tasks. As a DBA or SQL Server developer, you may face situations where you need to compare the schemas and/or data of more than two databases and quickly sync two or more SQL Server databases. Can you do it in one go and save a lot of time and effort? Let's check. Step 1. Create a text file with the list of source and target databases and servers 1.1 Launch any third-party text editor, for example, Notepad. 1.2 Enter the names of the source servers and databases, separated by commas. You can list as many servers and databases as you need.
Below is the template for such a list:

```
Source_server_name1, Source_database_name1, Source_user1, Password1
Source_server_name2, Source_database_name2, Source_user2, Password2
Source_server_name3, Source_database_name3, Source_user3, Password3
...
Source_server_nameN, Source_database_nameN, Source_userN, PasswordN
```

In this worked example, we are going to use the following databases on the following servers as the Source:

```
DBFSQLSRV\SQL2016, AdventureWorks2019_Dev, Source_user1, Password1
DBFSQLSRV\SQL2016, BicycleStoreDev, Source_user2, Password2
DBFSQLSRV\SQL2016, BicycleStoreDemo, Source_user3, Password3
```

1.3 Save the file. We will save it as source_servers_databases.txt.

1.4 To create a multi-target list, repeat the previous steps for the target servers and databases, entering their names separated by commas according to the template:

```
Target_server_name1, Target_database_name1, Target_user1, Password1
Target_server_name2, Target_database_name2, Target_user2, Password2
Target_server_name3, Target_database_name3, Target_user3, Password3
...
Target_server_nameN, Target_database_nameN, Target_userN, PasswordN
```

In this worked example, we are going to use the following databases on the following servers as the Target:

```
DBFSQLSRV\SQL2019, AdventureWorks2019_Test, Target_user1, Password1
DBFSQLSRV\SQL2019, BicycleStoreDev, Target_user2, Password2
DBFSQLSRV\SQL2019, BicycleStoreDemo, Target_user3, Password3
```

1.5 Save the file. We will save it as target_servers_databases.txt.

Step 2. Create a .bat file

2.1 Launch any third-party text editor, for example, Notepad.

2.2 Enter a script for comparing databases like in the examples below. Don't forget to adjust the script to suit your needs.
The script for comparing the schemas of multiple databases from the command line:

```bat
set StudioPath="C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"
set ToolComparePath="C:\Program Files\Devart\Compare Bundle for SQL Server\dbForge Schema Compare for SQL Server\schemacompare.com"

FOR /F "eol=; tokens=1,2,3,4* delims=, " %%e in (source_servers_databases.txt) do (
  FOR /F "eol=; tokens=1,2,3,4* delims=, " %%i in (target_servers_databases.txt) do (
    %ToolComparePath% /schemacompare /source connection:"Data Source=%%e;Encrypt=False;Enlist=False;Initial Catalog=%%f;Integrated Security=False;User ID=%%g;Password=%%h;Pooling=False;Transaction Scope Local=True" /target connection:"Data Source=%%i;Encrypt=False;Enlist=False;Initial Catalog=%%j;Integrated Security=False;User ID=%%k;Password=%%l;Pooling=False;Transaction Scope Local=True" /log:"schema_compare_sql_log.log"
  )
)
pause
```

The script for comparing the data of multiple databases from the command line:

```bat
set StudioPath="C:\Program Files\Devart\dbForge Studio for SQL Server\dbforgesql.com"
set ToolComparePath="C:\Program Files\Devart\Compare Bundle for SQL Server\dbForge Data Compare for SQL Server\datacompare.com"

FOR /F "eol=; tokens=1,2,3,4* delims=, " %%e in (source_servers_databases.txt) do (
  FOR /F "eol=; tokens=1,2,3,4* delims=, " %%i in (target_servers_databases.txt) do (
    %ToolComparePath% /datacompare /source connection:"Data Source=%%e;Encrypt=False;Enlist=False;Initial Catalog=%%f;Integrated Security=False;User ID=%%g;Password=%%h;Pooling=False;Transaction Scope Local=True" /target connection:"Data Source=%%i;Encrypt=False;Enlist=False;Initial Catalog=%%j;Integrated Security=False;User ID=%%k;Password=%%l;Pooling=False;Transaction Scope Local=True" /log:"data_compare_sql_log.log"
  )
)
pause
```

Where: source_servers_databases.txt is the name of the file listing the source servers and databases.
target_servers_databases.txt is the name of the file listing the target servers and databases. %ToolComparePath% indicates that you are using dbForge Schema Compare for SQL Server or dbForge Data Compare for SQL Server; if you want to use dbForge Studio for SQL Server instead, replace the variable with %StudioPath%. /log:"data_compare_sql_log.log" is the path to the file where the output will be stored. Note that each script ends with pause. This command prevents the command-line window from closing automatically, which is handy for monitoring execution progress; remove pause from the script before using it in production. Note: ToolComparePath holds the default installation path for dbForge Data Compare and dbForge Schema Compare. If you have changed it, specify the correct path to the required tool's .com file as well. 2.3 Save the script. Step 3. Compare the source and target databases via the command line Now all you need to do is execute the .bat file via the command line. First, let us run the .bat file for comparing the schemas of our set of databases. Then let us run the .bat file for comparing the data in our databases. You will instantly get the summary comparison results: whether the source and target databases are identical, how many different or conflicting records there are, etc. The output file is generated after the successful completion of the process.
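The nested FOR /F loops above pair every source line with every target line, so N sources and M targets yield N×M comparisons. A sketch of the same parsing and pairing logic in Python (the command text is illustrative shorthand, not the tool's full argument syntax):

```python
from itertools import product

# Same format as source_servers_databases.txt / target_servers_databases.txt
source_list = """DBFSQLSRV\\SQL2016, AdventureWorks2019_Dev, Source_user1, Password1
DBFSQLSRV\\SQL2016, BicycleStoreDev, Source_user2, Password2"""
target_list = """DBFSQLSRV\\SQL2019, AdventureWorks2019_Test, Target_user1, Password1
DBFSQLSRV\\SQL2019, BicycleStoreDev, Target_user2, Password2
DBFSQLSRV\\SQL2019, BicycleStoreDemo, Target_user3, Password3"""

def parse(text):
    # Each line: server, database, user, password (as in the .txt files above)
    return [tuple(field.strip() for field in line.split(","))
            for line in text.splitlines() if line.strip()]

# The nested loops compare every source against every target: N x M runs
commands = [
    f"schemacompare.com /schemacompare /source {s[0]}.{s[1]} /target {t[0]}.{t[1]}"
    for s, t in product(parse(source_list), parse(target_list))
]
```

With 2 sources and 3 targets this produces 6 comparison runs; if you only want one-to-one pairs, the lists would have to be zipped line by line instead of cross-multiplied.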
Additionally, if you want to learn how to automate and schedule SQL database synchronization from the CLI, feel free to watch [this video](https://youtu.be/44kk1a-AGxY). To learn more about comparing data and schemas from the command line, refer to the corresponding documentation topics: [Compare data in multiple databases from the command line](https://docs.devart.com/schema-compare-for-sql-server/working-with-particular-cases/compare-schemas-in-multiple-databases-from-the-command-line.html) [Compare schemas in multiple databases from the command line](https://docs.devart.com/schema-compare-for-sql-server/working-with-particular-cases/compare-schemas-in-multiple-databases-from-the-command-line.html) To learn more about how to sync MySQL databases, refer to the [MySQL database synchronization](https://www.devart.com/dbforge/mysql/studio/database-synchronization.html) page. Conclusion The dbForge Schema Compare and dbForge Data Compare tools include a command-line interface (CLI) for performing schema comparison and deployment of SQL Server databases from the command line, thus allowing multi-database synchronization. This article provided worked examples of CLI scripts for comparing SQL Server schemas and data across multiple databases. Try the given scenario and you will see how easy it is to sync databases residing on different servers.
Tags [command line](https://blog.devart.com/tag/command-line) [compare multiple databases](https://blog.devart.com/tag/compare-multiple-databases) [data compare](https://blog.devart.com/tag/data-compare) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-compare-multiple-databases-through-command-line.html",
"product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [PostgreSQL Tools](https://blog.devart.com/category/products/postgresql-tools) How to Compare Multiple Databases From the Command Line By [dbForge Team](https://blog.devart.com/author/dbforge) August 30, 2017 [0](https://blog.devart.com/how-to-compare-multiple-databases-through-command-line.html#respond) 10446 Very often, there is a need to compare or synchronize data between two databases on the same server. To detect differences, you can simply use [dbForge Data Compare for PostgreSQL](https://www.devart.com/dbforge/postgresql/datacompare/), which lets you perform this process in a few clicks. But what if we need to compare the data of multiple source and target databases located on different servers? This post describes a way to compare multiple databases on different servers from the command line. Step 1. Creating a text file with the source and target servers and databases Open any third-party editor, for example, Notepad. Type the names of each source server and database, separated by a comma, and click Save. You can list as many servers and databases as you need, for example:

```
Source_server1, Source_DB_name1
Source_server2, Source_DB_name2
Source_server3, Source_DB_name3
```

Repeat this process for the target servers and databases, writing the names separated by a comma as well. For example:

```
Target_server1, Target_DB_name1
Target_server2, Target_DB_name2
Target_server3, Target_DB_name3
```

Step 2. Creating a .bat file Open any third-party text editor, for example, Notepad++.
Create a .bat file with the following content, for example:

```bat
Set Compare="C:\Program Files\Devart\dbForge Data Compare for PostgreSQL\datacompare.com"

FOR /F "eol=; tokens=1,2* delims=, " %%e in (Source_Databases.txt) do (
  FOR /F "eol=; tokens=1,2* delims=, " %%g in (Target_Databases.txt) do (
    %compare% /datacompare /source connection:"Connection Lifetime=120;Host=%%e;Port=5440;Database=%%f;User ID=postgres;Password=********;Pooling=False" /target connection:"Connection Lifetime=120;Host=%%g;Port=5440;Database=%%h;User ID=postgres;Password=********;Pooling=False" /log:"D:\Multiple_Data_Compare.log"
  )
)
pause
```

D:\Multiple_Data_Compare.log is the file where the output will be stored. If you wish, you can change the path to the log file, as well as to the .bat file. If necessary, you can also change the following details in the source and target connections: the port, and the user ID and password. Note: Set Compare holds the default installation path of dbForge Data Compare for PostgreSQL. If you have changed it, specify the correct path to the Data Compare file as well. Save the script. Step 3. Comparing the source and target databases from the command line Execute the .bat file from the command line. As you can see, the .bat file provides you with the summary comparison results: whether the source and target databases are identical, and how many different or conflicting records there are in either the source or the target databases. In addition, the output file is generated after the successful completion of the process. Conclusion In this post, we have reviewed how to compare multiple databases located on different servers by simply running a .bat file from the command line. As you can see, this simple process does not take much time and completes quickly with just a few clicks.
Tags [command line](https://blog.devart.com/tag/command-line) [data compare](https://blog.devart.com/tag/data-compare) [PostgreSQL](https://blog.devart.com/tag/postgresql) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-compare-two-formatting-styles-in-dbforge-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Productivity Tools](https://blog.devart.com/category/products/productivity-tools) [SQL Server
Tools](https://blog.devart.com/category/products/sql-server-tools) How to Compare Two Formatting Styles in dbForge SQL Complete By [dbForge Team](https://blog.devart.com/author/dbforge) May 16, 2022 [0](https://blog.devart.com/how-to-compare-two-formatting-styles-in-dbforge-sql-complete.html#respond) 2777 One of the primary features of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/)—a top-tier add-in for SSMS and Visual Studio—is code formatting, which helps beautify SQL code and make it consistent and easy to read, understand, review, and share across teams. The built-in formatter makes this possible with a number of options, including wizard-aided bulk formatting, noformat tags, predefined profiles, database identifier case synchronization, word recognition in CamelCase identifiers, and automation. The result shows in more efficient code reviews, faster troubleshooting, and improved overall team productivity. What is more, you can set up your own custom SQL coding standards and adopt them in your company. In this article, we explore one of the formatter's capabilities: we will take a predefined formatting style (called a “profile” in SQL Complete), introduce several changes, and turn it into a new custom profile. After that, we will compare these two styles and see how easy it is to spot the differences we have applied. Creating a custom formatting profile in SSMS using SQL Complete Our example involves a native stored procedure taken from AdventureWorks2019; first off, we will generate an ALTER script for it in SSMS. To do that, in Object Explorer, we right-click a procedure (say, dbo.uspGetBillOfMaterials) and select Script Stored Procedure as > ALTER To > New Query Editor Window from the shortcut menu. The result is as follows: Note the visual inconvenience of the SELECT and WITH statements, where the parameters run one after another in a single row.
Now, if we select Format Document from the SQL Complete menu, the Default formatting profile is applied, and the parameters in the SELECT statement become stacked: Looks better, doesn't it? Now we want our WITH clause to look the same way. For this purpose, we will modify the current Default profile with additional settings to create a Custom profile. To do that, we go to the SQL Complete menu > Options > Formatting > Profiles > Create New. There, we enter a name for the profile, select the profile to copy the initial settings from, and specify the path where the profile will be stored as an XML file. By default, the suggested path is C:\\Users\\username\\AppData\\Roaming\\Devart\\dbForge SQL Complete\\FormatProfiles\\ProfileName.xml. In our case, “username” will be “jordansanders”, and “ProfileName” will be “Custom”. Once our Custom profile appears on the list, we select it, click Open Selected, go to PROCEDURE/FUNCTION > Parameters, select the Stack parameters list checkbox, and additionally select simple list: Then we click Save to apply our changes, go back to the procedure, and apply the newly created Custom profile to it by selecting Format Document from the SQL Complete menu. This makes our WITH clause look just as great: That's it! Now that you have seen how it works, let's compare our profiles. Comparing Default and Custom profiles The simplest way to do this is to compare the respective profile files in [Code Compare](https://www.devart.com/codecompare/), a Devart application that makes code comparison easy and convenient. First, we must retrieve the profile files. As we remember, the Custom profile was previously saved to C:\\Users\\jordansanders\\AppData\\Roaming\\Devart\\dbForge SQL Complete\\FormatProfiles\\ as Custom.xml.
Since the predefined profiles are not stored in that folder, we can go back to the SQL Complete menu > Options > Formatting > Profiles > Create New, create a new profile that copies the Default settings without changes, and there we go—we have the second XML file for comparison. Now let's open Code Compare and load our files via the File menu (alternatively, you can simply drag and drop them onto the respective panes): The comparison results are delivered instantly, and we can see where we selected the Stack parameters list checkbox: The second difference shows where we additionally selected the simple list mode, which is the first option on the list (while the default mode, single line, is the third option): That's it! As you can see, it is very easy to create and modify custom formatting profiles in SQL Complete and then compare them in Code Compare. All in all, SQL Complete offers remarkable formatting capabilities. And you don't have to take our word for it; just [download your FREE 14-day trial of SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) and see it in action today!
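For illustration, here is roughly the effect of the Stack parameters list option on a select list. This is a simplified sketch against the AdventureWorks Production.BillOfMaterials table, not the actual output of the uspGetBillOfMaterials procedure:

```sql
-- Before formatting: the select list sits on a single row
SELECT BOM.ProductAssemblyID, BOM.ComponentID, BOM.PerAssemblyQty, BOM.BOMLevel FROM Production.BillOfMaterials AS BOM;

-- After applying a profile with "Stack parameters list" (simple list):
-- each item of the list is stacked on its own line
SELECT
    BOM.ProductAssemblyID,
    BOM.ComponentID,
    BOM.PerAssemblyQty,
    BOM.BOMLevel
FROM Production.BillOfMaterials AS BOM;
```

Both statements are equivalent; only the layout changes, which is exactly what a formatting profile controls.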
"} {"url":
"https://blog.devart.com/how-to-configure-a-linked-server-using-the-odbc-driver.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) How to Configure a Linked Server Using the ODBC Driver By [Dereck Mushingairi](https://blog.devart.com/author/dereckm) January 19, 2025 Linked servers rarely start as a strategy. They begin as shortcuts—a one-off query, a borrowed connection, and a promise to “clean it up later.” Six months in, it's still duct-taped into place, unmonitored, unaudited, and assumed to be someone else's responsibility. What most teams don't realize is that—when properly configured—linked servers can be a reliable, long-term solution. With the help of an ODBC driver, SQL Server can connect to systems like PostgreSQL, MySQL, or Oracle and query them as if they were local. There is no ETL duplication or batch syncing—just direct, real-time access across systems. This guide walks through the correct configuration process—from choosing the right [linked server ODBC](https://blog.devart.com/new-in-odbc-support-for-visual-basic-linked-server-alteryx-and-more.html) driver to setting up the DSN, defining security, and validating the connection. The goal is a setup you can rely on in production. Let's dive in! Table of contents What is a linked server in SQL Server? Why use ODBC for linked servers? Prerequisites for configuring a linked server with ODBC Configuring a linked server using SQL Server Management Studio (SSMS) Best practices for using ODBC linked servers Limitations to keep in mind Community insights Conclusion What is a linked server in SQL Server?
A linked server is a SQL Server feature that allows you to query external data sources—such as MySQL, Oracle, PostgreSQL, or other SQL Server instances—directly from within SQL Server, using T-SQL. These remote systems are treated as part of the local database, enabling real-time access without moving data. SQL Server uses OLE DB or ODBC providers to establish these connections, allowing cross-database SELECT statements and, in some cases, limited INSERT, UPDATE, or DELETE operations—depending on the capabilities of the target system and driver. While linked servers don't fully replace ETL pipelines, they can reduce complexity when real-time access is needed without full data integration. Unlike app-layer integrations, linked servers operate inside the SQL Server engine, offering centralized control over execution plans, security contexts, and logging. In hybrid environments where data spans cloud and on-prem infrastructure, a [SQL Server linked server ODBC](https://blog.devart.com/openquery-in-sql-server.html) setup also offers a stable, low-friction way to bridge systems without restructuring them. Why use ODBC for linked servers? Most enterprise environments run on a mix of systems—SQL Server, PostgreSQL, Oracle, cloud warehouses, and legacy databases. Connecting them reliably takes more than patchwork fixes. It requires a scalable approach that works across platforms without introducing unnecessary complexity. That's where ODBC comes in. It's not new, but it remains one of the most effective ways to extend SQL Server's reach across heterogeneous systems. Here's why ODBC is often the preferred choice when configuring a linked server in SQL Server: Cross-platform compatibility: ODBC supports a wide range of databases. If there's a stable driver and a supported OLE DB wrapper (like MSDASQL), SQL Server can query it—no custom connectors, no fragile workarounds.
A unified integration layer: Instead of juggling multiple APIs or translation layers, ODBC gives you a single, standardized interface. This makes cross-system queries easier to build, debug, and maintain over time. Built for production: Vendor-supported ODBC drivers are maintained, documented, and performance-tuned. That kind of reliability isn't optional for systems in finance, healthcare, or other regulated sectors. Architectural agility: ODBC helps decouple SQL Server from proprietary data stacks. That allows teams to evolve infrastructure without rewriting how systems talk to each other. Governance by design: Linked servers configured through ODBC sit inside SQL Server's native security and auditing model. That means external access behaves like internal access—with complete visibility and control. Prerequisites for configuring a linked server with ODBC To build a reliable [SQL linked server ODBC](https://blog.devart.com/how-to-configure-a-sql-server-linked-server-to-connect-to-mysql.html) connection, you need a clean setup. These steps lay the groundwork for secure, stable integration: Install the correct ODBC driver: Match the 64-bit driver to your target system and SQL Server version to avoid compatibility issues. Set up a system DSN: Create a system-level Data Source Name with host, port, authentication, and driver-specific settings. Define your authentication method: Choose between SQL Server or Windows authentication. Make sure it aligns with your security standards. Enable the OLE DB provider: Activate “Microsoft OLE DB Provider for ODBC Drivers” and configure it to allow distributed queries if needed. Note: When connecting to the external database using an ODBC driver, SQL Server communicates through the Microsoft OLE DB Provider for ODBC Drivers (MSDASQL). This bridge is essential to any MSSQL linked server ODBC setup, allowing SQL Server to execute queries across ODBC-based systems as if they were local.
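If you prefer scripting to the SSMS dialogs, the same linked server can be created with the system procedures sp_addlinkedserver and sp_addlinkedsrvlogin. A minimal T-SQL sketch, assuming a System DSN named PostgreSQL_DSN already exists; all names and credentials here are illustrative, not values from this article:

```sql
-- Create a linked server over the MSDASQL (ODBC) provider
EXEC master.dbo.sp_addlinkedserver
    @server     = N'PG_LINK',        -- local name for the linked server
    @srvproduct = N'PostgreSQL',     -- free-text product name
    @provider   = N'MSDASQL',        -- Microsoft OLE DB Provider for ODBC Drivers
    @datasrc    = N'PostgreSQL_DSN'; -- the System DSN configured earlier

-- Map local logins to the remote credentials
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'PG_LINK',
    @useself     = 'FALSE',
    @locallogin  = NULL,             -- NULL applies the mapping to all local logins
    @rmtuser     = N'pg_user',
    @rmtpassword = N'your_password';
```

Scripting the setup this way makes it repeatable across environments, which matters once linked servers become part of production infrastructure rather than a one-off shortcut.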
Pro Tip: If you plan to use OPENROWSET for ad hoc queries instead of configuring a full linked server, ensure the Ad Hoc Distributed Queries option is enabled. You can do this by running:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
This enables SQL Server to run ad hoc queries against external sources using OLE DB. Treat these steps as non-negotiable. They don't just support the connection—they determine whether it runs cleanly under pressure. Configuring a linked server using SQL Server Management Studio (SSMS) Once the prerequisites are in place—your ODBC driver installed and the DSN configured—setting up the linked server in SSMS is straightforward, but precision matters. A misconfigured option can lead to failed queries, security gaps, or performance bottlenecks, so follow each step carefully. In SQL Server Management Studio: Navigate to Server Objects > Linked Servers > New Linked Server. Enter a name and select Microsoft OLE DB Provider for ODBC Drivers. In the Product Name field, use the name of the external database system (e.g., PostgreSQL). For the Data Source, enter the name of the System DSN you configured. Under Security, map the login credentials SQL Server should use when connecting to the remote source. Save the configuration. Test the connection Run the following command to confirm the linked server is reachable:
EXEC sp_testlinkedserver 'YourLinkedServerName';
Then, use OPENQUERY or four-part naming syntax to verify that you can retrieve data:
SELECT * FROM OPENQUERY(YourLinkedServerName, 'SELECT * FROM your_table');
This setup bridges SQL Server with external systems—securely, transparently, and with full query support. Best practices for using ODBC linked servers ODBC linked servers can extend SQL Server with power and flexibility—but only if configured carefully.
Below is a quick reference for maintaining performance and security. Performance optimization Poorly configured linked servers can turn quick queries into system bottlenecks. These practices help keep execution efficient and predictable: Use OPENQUERY for remote operations: Let the remote server process the query, not SQL Server. This reduces network traffic and offloads computation. Avoid four-part naming in complex joins: While convenient, it can cause SQL Server to mismanage execution plans, leading to poor performance. Limit data transfer: Always filter and project only the fields you need. Full-table scans over ODBC can cripple query response times. Security and access control ODBC linked servers expand your attack surface. Treat them like external trust boundaries, not internal conveniences: Isolate credentials: Use dedicated logins for linked servers. Avoid sharing high-privilege accounts across systems. Enforce least privilege: Grant the minimum access necessary to the external source, especially across security domains. Audit and monitor usage: Track linked server activity through SQL Server audit logs. Look for unauthorized access or unexpected query patterns. Linked servers via ODBC can be powerful—but only when configured with discipline. Treat them like an extension of your database surface area, not a shortcut. Limitations to keep in mind ODBC linked servers are powerful, but they're not without trade-offs. Here's what you should know: Linked servers do not support all data types across systems (e.g., JSON or custom types in PostgreSQL may not map cleanly). Long-running queries or full-table scans over ODBC can lead to timeouts or bottlenecks. Distributed transactions are not supported unless MSDTC is properly configured.
For real-world experiences and community troubleshooting, [this thread](https://www.reddit.com/r/SQLServer/comments/15itx94/linked_servers_with_odbc_wont_work/) covers quirks that may not appear in official documentation. Understanding these limitations helps teams plan better and use ODBC linked servers where they genuinely add value. Community insights Explore additional perspectives and community solutions: [SQLTeam: 64-bit ODBC driver issues on SQL Server 2016](https://forums.sqlteam.com/t/linked-server-using-odbc-64-bit-on-sql-server-2016/20549/3) [Quora: How to create a linked server using the ODBC driver](https://www.quora.com/How-can-I-create-a-linked-SQL-server-using-the-ODBC-driver) [Stack Overflow: ODBC linked server fails in SSMS](https://stackoverflow.com/questions/77082284/cannot-connect-to-odbc-linked-server-in-ssms-but-can-in-odbc-data-sources) Conclusion ODBC linked servers are more than a technical integration—they're an architectural decision. When used well, they extend SQL Server's role from a standalone engine to a cross-platform data orchestrator. That matters in environments where data is distributed, systems are diverse, and performance expectations remain high. But ODBC is not plug-and-play. Properly configuring a linked server over ODBC demands discipline—careful configuration, security alignment, and performance-aware query patterns. Treat it casually, and you get a fragile system. Build it with intent, and you gain real-time access across your ecosystem without sacrificing control.
2 COMMENTS Sebastian G August 9, 2018 at 6:02 pm The linked server is created, but when I try to run a query, I get the following error: Msg 7399, Level 16, State 1, Line 1 The OLE DB provider “MSDASQL” for linked server “SALESFORCE” reported an error. Access denied.
Msg 7301, Level 16, State 2, Line 1 Cannot obtain the required interface (“IID_IDBCreateCommand”) from OLE DB provider “MSDASQL” for linked server “SALESFORCE” DAC Team August 10, 2018 at 11:13 am Hello, Sebastian! When creating the linked server, use SQL Server authentication and make sure that the “Allow inprocess” option is disabled. To check this in SSMS, go to Linked Servers -> Providers and double-click MSDASQL. In the opened window, uncheck the Allow inprocess option. In addition, read more about the issues that may occur when using Microsoft SQL Server Management Studio at devart.com/odbc/salesforce/docs/index.html?troubleshooting_ssms.htm Also, try running NT SERVICE\\MSSQLSERVER under your Windows user name. Comments are closed."} {"url": "https://blog.devart.com/how-to-configure-a-sql-server-linked-server-to-connect-to-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[ODBC](https://blog.devart.com/category/odbc) How to Configure a SQL Server Linked Server to Connect to MySQL By [Victoria Shyrokova](https://blog.devart.com/author/victorias) January 29, 2025 Looking for a simple way to connect a SQL Server linked server to various data sources—like an external database, a specific table, or even an object in MySQL Server? By configuring a linked server, you can query and work with external data directly from SQL Server Management Studio (SSMS), using standard SQL commands. This setup eliminates the need to move or duplicate data between systems, making cross-platform access much easier. One of the most efficient approaches is a [SQL Server Linked Server ODBC](https://blog.devart.com/how-to-configure-a-linked-server-using-the-odbc-driver.html) connection, which ensures seamless communication without OS-related limitations.
In this article, you'll learn how to set up and configure a linked server connection from a Windows-based SQL Server instance using SSMS. We'll walk through connecting to a Linux-based MySQL server and reading table data directly in SSMS with the help of an ODBC driver—no coding required. Table of contents Why consider connecting SQL Server to MySQL? Configuring a Linked Server from SQL Server to MySQL Testing the Linked Server connection Common troubleshooting tips Conclusion Why consider connecting SQL Server to MySQL? There can be different reasons for building an architecture that connects SQL Server and MySQL databases. Let's briefly explore some of the most common ones. Unified data access. Access and query diverse data sources without leaving SQL Server. Run queries and join tables from MySQL databases. Simplified data management. This architecture eliminates the need for complex ETL processes and facilitates real-time data integration. Enhanced reporting and analytics. Consolidate data for comprehensive reporting and analytics without duplicating it across multiple hybrid systems. Simplified, centralized management. Manage and interact with multiple databases or systems from a single SQL Server Management Studio (SSMS) interface. You can avoid native database export and import steps entirely once a linked MySQL server is set up in SSMS. Execute remote stored procedures. You can execute stored procedures on a remote server directly from your SQL Server instance. In short, you won't have to switch between databases when transferring data between them. Configuring a Linked Server from SQL Server to MySQL Let's install the ODBC driver we will use to create a connection from SQL Server to MySQL. Follow the steps described below. Step 1.
Install the ODBC driver for MySQL To enable communication between SQL Server and MySQL, you first have to install a MySQL ODBC driver. In this tutorial, we will use the [ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) from Devart, which provides a 30-day trial. [Download the driver](https://www.devart.com/odbc/mysql/download.html). Select the appropriate version (32-bit or 64-bit) to match your SQL Server architecture; builds are available for macOS, Linux/UNIX, and Windows. Now, let's install the driver. Run the installer and follow the on-screen instructions of the installation wizard, changing any default paths as desired. During installation, select the correct driver version for your SQL Server (e.g., ODBC 8.0 ANSI or Unicode) and the appropriate processor architecture (Win32 or Win64). Note that some of the installation steps are skipped here, as they are self-explanatory and the default settings will work. Complete the installation and click the Finish button. Optionally, you may also download the documentation in the desired format. Step 2. Configure an ODBC Data Source (DSN) for MySQL Before setting up the linked server, we need a MySQL database with tables and rows so that we can run queries and other Data Manipulation Language (DML) operations on it. If you do not have a database, you can [download SAKILA](https://www.devart.com/dbforge/mysql/studio/mysql-sample-database.html) with sample rows and tables. After installing the MySQL ODBC driver, let's set up a Data Source Name (DSN). 1. Open the ODBC Data Source Administrator (ODBC Data Sources (64-bit) for 64-bit drivers or ODBC Data Sources (32-bit) for 32-bit drivers). 2. Add a new DSN. 3. After you have installed the Devart ODBC driver, it will be listed under the Drivers tab, as seen in the image above. 4. Now click the System DSN tab and click Add (as shown in the image below).
Then, select Devart ODBC Driver for MySQL, which we have just installed. 5. In the General tab, enter a name for the DSN that will let you identify it along with its purpose (e.g., DevArt-MySQL-Link64) and provide the connection details: Server: the IP address or hostname of the MySQL server that will be accessed from the SSMS connection. Port: usually the default 3306, unless you have a different requirement. User ID and Password: a user with access rights to the SAKILA database and the password you've set in your environment. Database: the target MySQL database name. 6. Test the connection. To do it, click the Test Connection button to check whether the Devart ODBC for MySQL DSN connects to the SAKILA MySQL database successfully. A window with a Connection Successful message should appear if all the settings are correct. Common issues when using ODBC drivers Sometimes, even though you have set everything up according to the instructions, you still might face the “[MySQL][ODBC 5.x(w) Driver] Access denied for user…” issue. In this case, check and modify the contents of /etc/my.cnf on the Linux server. In the [mysqld] section, add the line “bind-address = 0.0.0.0” to allow connections from any address, which is not enabled by default. However, if the server is reachable from the public Internet, avoid 0.0.0.0; add your IP subnet instead to preserve the security of the connection. Also, double-check the User ID and Password. Now, let's move on to the next step of creating the linked server in SSMS. Step 3. Create a Linked Server in SQL Server Management Studio Start SSMS from your Windows Start menu to configure the linked server and enable SQL Server to interact with MySQL. Expand Server Objects. Then, right-click Linked Servers > New Linked Server.
In the New Linked Server dialog window, on the General tab, enter the following information: Linked Server: a name that clearly identifies the link (e.g., DEVART_MYSQL_DB_LINK). Server Type: choose the Other data source radio button. Provider: choose Microsoft OLE DB Provider for ODBC Drivers. Data Source: the same data source name we entered above (e.g., DevArt-MySQL-Link64); optionally provide the Product name. Next, in the same dialog window, select the Security page, check Be made using this security context, and enter the Remote login and its Password. Now, on the Server Options page of the same New Linked Server dialog, set RPC and RPC Out to True, and save the configuration. Your configuration steps are complete, and you can move on to testing and querying the MySQL linked server through SSMS. Testing the Linked Server connection Testing the connection is a crucial step to confirm that you can successfully query and integrate data from the intended source. Keep reading to learn how to perform this test. Verifying the Linked Server connection Assuming all the settings have been configured correctly, you should now see the linked server object we just created in SSMS under Linked Servers. The database object named sakila can be expanded to display all its objects, including the tables from the sample database. Next, let's test the connection by running a query to ensure everything is working as expected. Querying the MySQL database using SSMS The next step is to simply run a query on the linked sakila MySQL database. This will ensure that your linked access from SSMS is ready for typical database operations. Let's first run a usual SHOW VARIABLES MySQL command on the linked server.
SELECT * FROM OPENQUERY(DEVART_MYSQL_DB_LINK, 'SHOW VARIABLES') Next, just for testing purposes, run a normal SQL query to select a few rows from the actor table in the linked sakila database. SELECT * FROM OPENQUERY(DEVART_MYSQL_DB_LINK, 'SELECT * FROM actor ORDER BY first_name') As you can see, the actor table from the MySQL sakila database is listed in an SSMS window. At this point, we have successfully configured and verified a linked server in the native Microsoft SSMS tool and run SQL operations on MySQL over the link. Common troubleshooting tips Even if you have followed this tutorial to the letter, sometimes things can go wrong because of settings that weren't adjusted correctly and might differ between environments. Let's check the list of the most common issues you might encounter and learn how to solve them. 1. Authentication issues If you're facing a “Login failed for user…” error, check whether the user has read/write access to the sakila database on the MySQL server. To do it, connect to the MySQL server locally on the Linux machine as an administrator using the root credentials.
$ mysql -u root -p
Next, grant access to the sakila database to your username.
mysql> GRANT ALL PRIVILEGES ON sakila.* TO 'username';
Lastly, apply the above grants immediately by resetting the privileges cache, then verify that the database is visible to the user.
mysql> FLUSH PRIVILEGES;
$ mysql -u devart -p
mysql> SHOW DATABASES;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| performance_schema |
| sakila             |
+--------------------+
2. ODBC driver configuration issues If you see “ODBC driver not found” or “Data source name not found” errors, confirm that the DSN name matches what you configured in the ODBC Data Source Administrator window and verify that you are using the correct bitness of the MySQL ODBC driver (32-bit or 64-bit). 3.
Network connectivity, firewall, or port block issues Once in a while, an “Unable to connect to MySQL server” error might occur. If you encounter it, check that the MySQL server is running, that the IP address and hostname are correct, and that the MySQL server allows remote connections (check the bind-address setting in the MySQL configuration). Check whether the MySQL service is running on the Linux server:
# service mysqld status
# service mysqld start
Verify that the IP address and port of the MySQL database server are correct and that the MySQL port is listening for connections:
$ ping 192.168.0.109
$ telnet 192.168.0.109 3306
The telnet check should return a successful connection (e.g., “Connected to 192.168.0.109”). If needed, temporarily turn off the firewall and allow TCP connections to the MySQL server and port. Conclusion Setting up a SQL Server linked server to connect to MySQL offers numerous benefits, including seamless data access and management, the ability to query MySQL tables directly from SQL Server, and the convenience of combining data from both databases for unified reporting. This approach eliminates the need for manual data transfers, allows real-time access to live MySQL data, and centralizes queries and data management within SQL Server Management Studio, enabling efficient cross-platform data access and analysis. Additionally, the [ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) from Devart lets you set up the connection without writing a line of code, using an intuitive interface. Try the [ODBC Driver for MySQL](https://www.devart.com/odbc/mysql/) from Devart for 30 days and experience how it can simplify and secure your project's database connectivity.
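As a final note, because Step 3 set RPC Out to True, you can also push statements to MySQL for execution with the T-SQL EXECUTE ... AT pass-through syntax. A short sketch against the linked server from this walkthrough; the queries themselves are illustrative:

```sql
-- Execute a statement directly on the MySQL side of the link
EXEC ('SELECT COUNT(*) FROM sakila.actor') AT DEVART_MYSQL_DB_LINK;

-- Pass-through with a parameter placeholder: ? values are substituted in order
EXEC ('SELECT first_name, last_name FROM sakila.actor WHERE actor_id = ?', 10)
    AT DEVART_MYSQL_DB_LINK;
```

Unlike OPENQUERY, which requires a literal query string, EXECUTE ... AT accepts parameters, making it handy for dynamic pass-through statements.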
Definition, Types, Examples & Use Cases](https://blog.devart.com/what-is-data-integration.html) May 5, 2025"} {"url": "https://blog.devart.com/how-to-configure-oracle-instant-client.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) How To Configure Oracle Instant Client By [dbForge Team](https://blog.devart.com/author/dbforge) January 4, 2011 [4](https://blog.devart.com/how-to-configure-oracle-instant-client.html#comments) 35219 This article is a step-by-step instruction for configuring Oracle Instant Client so that our [tools for Oracle](https://www.devart.com/dbforge/oracle/) could work with it. 1. Download Oracle Instant Client You can download Basic Instant Client package using one of the following links (depending on your platform): Windows (32-bit) – [https://www.oracle.com/technetwork/topics/winsoft-085727.html](https://www.oracle.com/technetwork/topics/winsoft-085727.html) Windows (x64) – [https://www.oracle.com/technetwork/topics/winx64soft-089540.html](https://www.oracle.com/technetwork/topics/winx64soft-089540.html) Any other platform – [https://www.oracle.com/technetwork/database/features/instant-client/index-097480.html](https://www.oracle.com/technetwork/database/database-technologies/instant-client/overview/index.html) 2. Extract client files Extract content of the .zip archive to the desired destination folder. For example, C:\\ORACLE\\INSTANT 3. Add client folder to PATH Add the full path to Instant Client folder (C:\\ORACLE\\INSTANT in our case) to PATH environment variable. A reboot may be required. 4. Add TNS_ADMIN variable Add TNS_ADMIN environment variable that tells the client where to look for tnsnames.ora file. In our example, it will point to the client’s folder. TNS_ADMIN = C:\\ORACLE\\INSTANT 5. Add NLS_LANG variable Add NLS_LANG environment variable if localization is required. 
For example, NLS_LANG=American_America.CL8MSWIN1251. Reboot your computer.

6. Create a registry file. Create the inst_ora.reg file with the following content (note that backslashes inside quoted values must be doubled in .reg files):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE]
"ORACLE_HOME"="C:\\ORACLE\\INSTANT"
@=""
"ORACLE_HOME_NAME"="OraHome"
"ORACLE_GROUP_NAME"="Oracle - OraHome"
"NLS_LANG"="NA"

[HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\ALL_HOMES]
"HOME_COUNTER"="1"
"DEFAULT_HOME"="OraHome"
"LAST_HOME"="0"
@=""

[HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\ALL_HOMES\ID0]
"NAME"="OraHome"
"PATH"="C:\\ORACLE\\INSTANT"
"NLS_LANG"="NA"

[HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\HOME0]
"ORACLE_HOME"="C:\\ORACLE\\INSTANT"
"ORACLE_SID"="ORCL1120"
"ID"="0"
"ORACLE_GROUP_NAME"="Oracle - OraHome"
"ORACLE_HOME_NAME"="OraHome"
"NLS_LANG"="American_America.CL8MSWIN1251"
"ORACLE_HOME_KEY"="Software\\ORACLE\\HOME0"
```

Note: the ORACLE_HOME value is set for this example; you'll need to set your own value. The same goes for the ORACLE_SID and NLS_LANG variables.

7. Register the client. Launch the inst_ora.reg file to add the data to the registry.

Ready. Oracle Instant Client is ready for work.
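If you set up several machines, the registry step can be scripted. Here is a minimal Python sketch that generates a .reg file for a given Instant Client folder; the `make_reg` helper and the reduced key set are illustrative, not part of the original instructions, so extend the template with the remaining keys as needed:

```python
# Sketch: generate an inst_ora.reg file for a given Instant Client folder.
# Only the first registry key from the article is templated here; add the
# ALL_HOMES and HOME0 sections the same way for a complete file.
REG_TEMPLATE = """Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\\SOFTWARE\\ORACLE]
"ORACLE_HOME"="{home}"
@=""
"ORACLE_HOME_NAME"="OraHome"
"ORACLE_GROUP_NAME"="Oracle - OraHome"
"NLS_LANG"="{nls}"
"""

def make_reg(home: str, nls: str = "NA") -> str:
    # .reg files require backslashes inside quoted values to be doubled.
    return REG_TEMPLATE.format(home=home.replace("\\", "\\\\"), nls=nls)

# Usage:
# open("inst_ora.reg", "w").write(make_reg("C:\\ORACLE\\INSTANT"))
```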
Tags [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools) [dbForge Team](https://blog.devart.com/author/dbforge)

4 COMMENTS

Jeroen, June 10, 2014 at 11:45 am: Some "\" are not displaying in your registry settings

dbForge Team, January 12, 2015 at 12:59 pm: We have fixed this problem. Thank you.

Tom, August 14, 2014 at 6:21 pm: How on earth do Devart think large corporations, that have large purchasing power, are able to allow their developers to make such extensive changes to their locked down terminals in order to trial dbForge?
The time it would take to get approval and have these changes applied would be ridiculous. Looks like I'm back to diff'ing using SQL output text!

Tom, October 9, 2014 at 11:57 am: Missing a lot of backslashes… Couldn't this work just using ORACLE_HOME like everything else seems to?

Comments are closed."} {"url": "https://blog.devart.com/how-to-connect-mongodb-using-python-connector-from-devart-to-perform-dml-operations.html", "product_name": "Unknown", "content_type": "Blog", "content": "[Python Connectors](https://blog.devart.com/category/products/python-connectors) How to Connect MongoDB Using Python Connector From Devart to Perform DML Operations By [Victoria Shyrokova](https://blog.devart.com/author/victorias) January 29, 2025 [0](https://blog.devart.com/how-to-connect-mongodb-using-python-connector-from-devart-to-perform-dml-operations.html#respond) 654 Many Python developers choose MongoDB for its speed, flexibility, and scalability. This NoSQL database adapts to different types of data, letting you build features regardless of data structure complexity. It's a natural choice for applications that have to store lots of different information, handle constant changes, or manage large volumes of data. In this tutorial, you'll learn how to connect to MongoDB using the Python Connector from Devart to create, read, update, and delete data. Keep reading for detailed guidance!
Table of contents: Technology background and preparation; Connectivity architecture; How to install Devart Python Connector for MongoDB; How to create a user in the MongoDB database; How to connect a Python application; How to create a connection and query data from MongoDB; Conclusion

Technology background and preparation

For this tutorial, we will use the [Python Connector for MongoDB from Devart](https://www.devart.com/python/mongodb/), which provides an easy way to connect to a MongoDB database from your Python application through TCP/IP, using the libmongoc client library. For demonstration purposes, we'll use the Anaconda open-source platform to develop and deploy secure Python solutions. Note that you can choose any other IDE you're familiar with (e.g., PyCharm, Jupyter, etc.) to develop Python applications; the flow will be the same, so you can still use this tutorial as a reference to learn how to connect your preferred development tool to a MongoDB database.

If you haven't installed all the needed tools and drivers yet, make sure to do it before you proceed further. For this tutorial, you'll need to install the [MongoDB Community edition](https://www.mongodb.com/docs/manual/installation/) and get the [Devart Python Connector for MongoDB](https://www.devart.com/python/mongodb/). Depending on the Python version on your machine, you'll also have to download the Python Connector version that supports it. If you're using Windows, check the Python version by running this command in the Command Prompt:

```
python --version
```

The output shows which Python version is installed.
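The Python version matters because the connector wheels are tagged per interpreter version (file names such as devart_mongodb_connector-1.1.0-cp312-cp312-win_amd64.whl encode it). A small sketch to derive the CPython tag of the running interpreter (`cpython_wheel_tag` is an illustrative helper, not part of the connector):

```python
import sys

def cpython_wheel_tag() -> str:
    """Build the 'cpXY-cpXY' part of a wheel file name for this interpreter."""
    tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return f"{tag}-{tag}"

# e.g. on Python 3.12 this returns "cp312-cp312"; pick the wheel whose name
# contains this tag, and win32 vs win_amd64 to match your interpreter's bitness.
```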
Now you'll be able to download the specific Python Connector for MongoDB that will work for you.

Connectivity architecture

Before setting up the connection to MongoDB using the Python Connector, let's explore the architecture in detail. In the image below, you can see an illustration of how this connection is supposed to work. In our case, we'll have a Python application built on the Anaconda platform, which uses the Devart Python Connector for MongoDB driver as a middle layer to connect to a MongoDB database hosted in the cloud or on-prem. After we set up the connection, we'll perform CREATE, UPDATE, and DELETE operations on data stored in the MongoDB database. These are common operations that are often used to modify data. Now that the task at hand is clear, let's move on to the practical section of this guide.

How to install Devart Python Connector for MongoDB

The easiest way to install the Devart Python Connector for MongoDB is to get it from the [Python Package Index](https://pypi.org/project/devart-mongodb-connector/). To do this, if you're using Anaconda, follow the steps described below. Open Anaconda Prompt in the Navigator. Check that the pip library is available:

```
py -m pip --version
```

If pip is not installed, run this command to install it:

```
python -m ensurepip --upgrade
```

Install the Devart Python Connector for MongoDB using pip:

```
python -m pip install devart-mongodb-connector
```

You can also install the Devart Python Connector for MongoDB from the archive file that you have previously downloaded. First, unzip the archive file. In the unzipped folder, there are two .whl files, for Windows 32-bit and 64-bit; you have to install the package that matches your architecture. Ensure you have the pip library installed as described above. Then run the command to install the Python Connector for MongoDB.
Don't forget to replace the path with the actual path to the connector:

```
pip install <path-to-connector>\devart_mongodb_connector-1.1.0-cp312-cp312-win_amd64.whl
```

If you encounter an error message similar to the one shown below, run the command pip debug --verbose. Then go to Compatible Tags and check whether the tag cp312-cp312-win_amd64 is supported. In this example, the list of compatible tags only goes up to cp311-cp311-win_amd64, so let's rename the file devart_mongodb_connector-1.1.0-cp312-cp312-win_amd64.whl to devart_mongodb_connector-1.1.0-cp311-cp311-win_amd64.whl in the unzipped folder. Then run the install command with the new file name:

```
pip install C:\Working\11-Tech-Published-Articles\29-Devart-Python-Connector-MongoDB\Sources\DevartPythonMongoDB\whl\devart_mongodb_connector-1.1.0-cp311-cp311-win_amd64.whl
```

After that, a message about the successful installation will appear. Now you are ready to build a Python application that connects to MongoDB.

How to create a user in the MongoDB database

Most likely, you have already installed the MongoDB database on your system. For demonstration purposes, we are going to show how to create a demo MongoDB database (we'll name it "Netflix") and a database user with the owner rights to operate this database. Note that if you already have a database, you may want to skip this step; alternatively, you can follow this guide to test the waters. Use mongosh to connect to MongoDB:

```
mongosh --port 27017
```

As a result, you should see output like in the screenshot below. Use the "use Netflix" command to create a database (or use a different database name that suits your project). Switch to the admin database and create a new user with the following commands.
```
use admin
db.createUser(
  {
    user: "netflix",
    pwd: "netflix#123",  // cleartext password
    roles: [
      { role: "dbOwner", db: "Netflix" }
    ]
  }
)
```

Now you have created a new database user, and you've got one MongoDB database with empty collections.

How to connect a Python application

Now it's time to connect the database with the Python application. Open Anaconda Navigator, select base (root), and click the Launch button to start Jupyter Notebook. Create a Notebook to proceed. Use the command below to import the Devart MongoDB module:

```python
import devart.mongodb
```

The output should look like the screenshot shown below. Use the connect method to establish a connection to the Netflix MongoDB database (or another database you already have), with the following credentials: Server: localhost; Database: Netflix; User: netflix; Password: netflix#123; ClientLibrary and BSONLibrary: the location paths to the .dll files installed with the Devart Python Connector for MongoDB. By default, these are stored in C:\Users\dtdinh\anaconda3\Lib\site-packages\devart\mongodb. Feel free to copy libmongoc-1.0.dll and libbson-1.0.dll to another location for convenience.

How to create a connection and query data from MongoDB

Your Netflix MongoDB database is still empty, but now you can create a table within it to test the connectivity. Here's a simple way of doing it with the execute method of the cursor class, which lets you create a table using an approach similar to DDL statements in SQL. Let's add attributes to our users collection: name, email, and password.

```python
# Create a users table in the Netflix database
# Create a cursor from my_connection
cursor = my_connection.cursor()
cursor.execute("create table users(name, email, password)")
```

Check the screenshot below to see how this should work. Next, insert one record into the users table in the Netflix database by using the execute method and an INSERT command.
```python
# Insert one record into the users table
cursor.execute(
    "insert into users(name, email, password) values(:parameter1, :parameter2, :parameter3)",
    ("John Doe", "john.doe@email.com", "password")
)
```

After the insertion, let's query the data by using a SELECT statement.

```python
cursor.execute("SELECT * FROM users")
for row in cursor.fetchall():
    print(row)
```

Conclusion

Now that you have a clear idea of how to connect to MongoDB using the [Python Connector from Devart](https://www.devart.com/python/mongodb/) to perform DML operations, you can use this approach when building data-rich Python applications, saving substantial amounts of time on integration tasks. Apart from simplicity, Python Connectors from Devart support all MongoDB and Python data types, providing extra options for data mapping. With this connectivity solution, you can submit multiple update statements to the MongoDB database server for processing as a batch to improve execution time, and improve overall security by using the encryption support feature. Feel free to explore [MongoDB Python Connectors from Devart](https://www.devart.com/python/mongodb/) to learn more about the benefits you can get! Tags [Python Connector for MongoDB](https://blog.devart.com/tag/python-connector-for-mongodb) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow.
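Putting the tutorial's steps together, here is a sketch of a full DML cycle in one function. It is hedged: the keyword arguments to connect mirror the credential names listed in the tutorial, and the UPDATE and DELETE statements are assumed to follow the same SQL-like, parameterized syntax as the CREATE and INSERT examples above, so treat this as an untested outline rather than confirmed API usage:

```python
def run_netflix_dml():
    # Assumes the Devart Python Connector for MongoDB is installed and the
    # Netflix database/user from this tutorial exist on localhost.
    import devart.mongodb

    # Keyword names follow the credentials listed in the tutorial
    # (ClientLibrary/BSONLibrary paths omitted here; add them if your
    # setup requires explicit .dll locations).
    connection = devart.mongodb.connect(
        Server="localhost",
        Database="Netflix",
        User="netflix",
        Password="netflix#123",
    )
    cursor = connection.cursor()
    cursor.execute("create table users(name, email, password)")
    cursor.execute(
        "insert into users(name, email, password) values(:p1, :p2, :p3)",
        ("John Doe", "john.doe@email.com", "password"),
    )
    # Assumed to use the same parameter style as the statements above:
    cursor.execute(
        "update users set email = :p1 where name = :p2",
        ("j.doe@email.com", "John Doe"),
    )
    cursor.execute("delete from users where name = :p1", ("John Doe",))
    cursor.execute("SELECT * FROM users")
    return cursor.fetchall()
```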
"} {"url": "https://blog.devart.com/how-to-connect-oracle-autonomous-with-oracle-client.html",
"product_name": "Unknown", "content_type": "Blog", "content": "[Oracle Tools](https://blog.devart.com/category/products/oracle-tools) How to Connect Oracle Autonomous with Oracle Client By [dbForge Team](https://blog.devart.com/author/dbforge) November 3, 2023 [0](https://blog.devart.com/how-to-connect-oracle-autonomous-with-oracle-client.html#respond) 5062 Oracle Autonomous Database is a powerful technology that provides data processing and management in the cloud for self-driving, self-securing, and self-repairing databases which can operate with no human intervention. So, it's quite unsurprising that many want to know how to establish a connection between Oracle Client and Oracle Autonomous. Let's take a look at how this can be achieved. Various applications can connect to Autonomous Transaction Processing through any of the connection types supported by Oracle Net Services. In this particular case, we'll need to use an Oracle Call Interface connection, also known as OCI. The following steps outline the basic process of making any OCI-type connection: installing the client software; downloading the client credentials; configuring certain files and environment variables; setting up the connection in the application. For the last step, we'll use [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) by Devart – an IDE that allows synchronizing data between different Oracle servers and automating schema change management during the development process. Now, we'll go through each of the steps in detail. Installing the client software First, you will need to install the Oracle Client software on your computer. If you want to use the full Oracle Database Client edition, download version 11.2.0.4 or higher. Alternatively, you can [use Oracle Instant Client](https://blog.devart.com/how-to-configure-oracle-instant-client.html). It contains the minimum required set of components for making an Oracle Call Interface connection.
For most applications, Instant Client version 12.1.0.2 or higher will be sufficient.

Downloading the client credentials

Once a suitable version of Oracle Client is installed, the next step is to download the client credentials and store them on your machine. Oracle client credentials, or wallet files, provide access to data in your Autonomous Transaction Processing database, so they should only be stored in a secure location, and only authorized users should have access to them. Feel free to follow [this guide](https://docs.oracle.com/en/cloud/paas/atp-cloud/atpug/connect-download-wallet.html#GUID-B06202D2-0597-41AA-9481-3B174F75D4B1) for detailed information on how to download client credentials via the Oracle Cloud Infrastructure console. When the archive with the necessary files is successfully downloaded, extract it to a secure folder on your computer, for example, D:/Wallet_DB201904201312.

Configuring files and environment variables

Navigate to D:/Wallet_DB201904201312 and copy the content of the tnsnames.ora and sqlnet.ora files. Then, go to the folder where you have downloaded the Oracle Instant Client and paste the copied content into the files with the same names. Finally, open sqlnet.ora and replace ?/network/admin with the path to the unzipped wallet in this part, for example:

```
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="D:/Wallet_DB201904201312")))
SSL_SERVER_DN_MATCH=yes
```

Where D:/Wallet_DB201904201312 is the actual directory of the wallet. As a result, the files will look as follows, for example:

tnsnames.ora

```
# tnsnames.ora Network Configuration File
# Generated by Oracle configuration tools.

OLYMPICGAMESTEST =
  (description =
    (retry_count=20)
    (retry_delay=3)
    (address=(protocol=tcps)(port=1522)(host=adb.us-phoenix-1.oraclecloud.com))
    (connect_data=(service_name=olympicgame_high.adb.oraclecloud.com))
    (security=(ssl_server_dn_match=yes))
  )
```

sqlnet.ora

```
# sqlnet.ora Network Configuration File: D:\Oracle\product\21.0.0\client_1\NETWORK\ADMIN\sqlnet.ora
# Generated by Oracle configuration tools.
# This file is actually generated by netca. But if customers choose to install
# "Software Only", this file won't exist and without the native authentication,
# they will not be able to connect to the database on NT.

SQLNET.AUTHENTICATION_SERVICES = (NONE)

NAMES.DIRECTORY_PATH = (TNSNAMES, EZCONNECT)

SQLNET.ENCRYPTION_CLIENT = REQUIRED
SQLNET.ENCRYPTION_TYPES_CLIENT = (AES256)

WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="D:/Wallet_DB201904201312")))
SSL_SERVER_DN_MATCH=yes
```

Setting up the connection

With everything properly configured, we can finally set up the connection in dbForge and connect to the server. To create a new connection, go to Database and click New Connection. Specify the necessary information such as connection type, server name, username, and password. Once the connection is created, you will be able to connect to the server by double-clicking the connection in the Database Explorer section situated in dbForge's top left corner.

Summary

As we have seen, connecting Oracle Client to Oracle Autonomous Transaction Processing is not a complicated process, as it only requires a few rather simple steps. However, you should still be diligent when performing them, especially when it comes to client credentials: make sure that all wallet files are stored in a secure place and are only accessible to authorized users. If all steps are followed properly, the process should not cause any difficulties.
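The sqlnet.ora edit described above (pointing WALLET_LOCATION at the extracted wallet instead of the ?/network/admin placeholder) can also be scripted. A minimal Python sketch, assuming the file still contains Oracle's default placeholder text (`point_sqlnet_to_wallet` is an illustrative helper):

```python
from pathlib import Path

def point_sqlnet_to_wallet(sqlnet_path: str, wallet_dir: str) -> None:
    """Replace the ?/network/admin placeholder in sqlnet.ora with the wallet folder."""
    path = Path(sqlnet_path)
    text = path.read_text()
    # Substitute every occurrence of the placeholder with the real wallet directory.
    path.write_text(text.replace("?/network/admin", wallet_dir))

# Usage, with the example paths from above:
# point_sqlnet_to_wallet(r"C:\ORACLE\INSTANT\sqlnet.ora", "D:/Wallet_DB201904201312")
```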
As for [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/download.html) – you can download it and try out its robust database development and maintenance features for free. Tags [dbForge Studio for Oracle](https://blog.devart.com/tag/dbforge-studio-for-oracle) [Oracle](https://blog.devart.com/tag/oracle) [Oracle Tutorial](https://blog.devart.com/tag/oracle-tutorial) [studio for oracle](https://blog.devart.com/tag/studio-for-oracle) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/how-to-connect-to-a-remote-oracle-database-with-the-devart-odbc-driver-using-http-https-tunnel.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [ODBC](https://blog.devart.com/category/odbc) [ODBC Drivers](https://blog.devart.com/category/products/odbc-drivers) How to Connect to a Remote Oracle Database With the Devart ODBC Driver Using HTTP/HTTPS Tunnel By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) August 28, 2024 [0](https://blog.devart.com/how-to-connect-to-a-remote-oracle-database-with-the-devart-odbc-driver-using-http-https-tunnel.html#respond) 894 When there is no Oracle DBA on the development team, connecting to a new remotely hosted Oracle database can present several challenges. These include dependency on the Oracle database version, limitations specific to Oracle databases, and certain customization restrictions. However, despite these obstacles, it is still possible to manage the task without any prior experience by using the [ODBC Driver for Oracle from Devart](https://www.devart.com/odbc/oracle/). With it, you'll be able to establish connectivity and accelerate your work with the Oracle database. Keep reading this article to learn how to connect to a remote Oracle database using the ODBC driver for Oracle from Devart. Table of Contents: The Challenges of Connecting to a Remote Oracle Database; Devart ODBC Driver for Oracle Overview; Connecting to a Remote Oracle Database with the Devart ODBC Driver; Benefits of Connecting to a Remote Oracle Database With the ODBC Driver; Conclusion. The Challenges of Connecting to a Remote Oracle Database Managing a team of developers without a dedicated Oracle DBA can be tricky. For example, the team needs to connect to a newly hosted remote Oracle database but is challenged by the fact that only the HTTP (port 80) and HTTPS (port 443) ports are open.
Or the REST Data Services (ORDS) aren't configured for HTTP connectivity and cannot be used. Setting up Oracle's REST (ORDS) data service via a Remote Systems Engineer can turn out to be quite complex as well. While promising, ORDS is challenging for non-DBAs due to: dependency on the Oracle database version; being limited to Oracle databases; customization limitations; a steep learning curve; complex configurations; security considerations; Java version clashes. These challenges can significantly slow down your progress, so let's look at an alternative solution capable of streamlining the connection process. The Devart ODBC Driver for Oracle Overview The difficulty of setting up ORDS makes it clear that a more straightforward and versatile tool is required. For instance, Devart's ODBC Driver for Oracle can eliminate the need for complex configurations and enable seamless HTTP connectivity. Let's briefly check what this tool has to offer. The [ODBC Driver for Oracle from Devart](https://www.devart.com/odbc/oracle/) is designed to facilitate connections between applications and Oracle databases. It supports a wide range of database versions and provides advanced functionality, including support for HTTP and HTTPS connections for remote databases with restricted port access. This flexibility makes it a perfect fit for scenarios where traditional Oracle database connectivity methods fall short. Connecting to a Remote Oracle Database with the Devart ODBC Driver Follow the steps described below to learn how to connect to a remote Oracle database using this ODBC driver from Devart. Install XAMPP and start the Apache service. Ensure ports 80 and 443 are free for successful startup. Download and install the [Devart ODBC driver for Oracle](https://www.devart.com/odbc/oracle/download.html). Here you can find a detailed [guide on how to set up the driver](https://docs.devart.com/odbc/oracle/installation-dbms.htm). Configure the ODBC Data Source.
To do it, open the ODBC Data Sources configuration dialog and add the Devart ODBC driver for Oracle. Next, configure the driver with the Oracle database’s host, port, service name, username, and password. Place the tunnel.php file in the Apache server’s root directory. As you can see, the ODBC Driver for Oracle from Devart not only resolves immediate connectivity issues but also ensures that the development process remains on track. Learn how to [connect Oracle database to Excel](https://blog.devart.com/connect-oracle-database-to-excel-import-your-data-in-minutes.html) and import your data easily in this article. Benefits of Connecting to a Remote Oracle Database With the ODBC Driver Using the Devart ODBC Driver for Oracle provides several key advantages that can make all the difference in managing remote database connections. One of the most significant benefits is the ease of setup. “The straightforward configuration process saved us a great deal of time and eliminated the need for complex ORDS setups. It was a relief to avoid the steep learning curve and potential security pitfalls associated with other methods.” Ibrahim Khaleel, Project Manager Another major benefit is the flexibility of the ODBC drivers. These tools support HTTP and HTTPS connections, which are essential for environments with limited port availability. Finally, the Devart ODBC Driver for Oracle [works across different Oracle database versions](https://www.devart.com/odbc/oracle/compatibility.htm) without requiring extensive customization or dealing with JAVA version conflicts. Conclusion The Devart ODBC Driver for Oracle is a practical choice for connecting to a remote Oracle database when other options are too complicated or unavailable. It simplifies the connectivity process, enabling the development team to proceed without delays and minimizing the need for DBA resources. Interested in using ODBC protocol to set up connectivity with other databases? 
The [ODBC Universal Bundle from Devart](https://www.devart.com/odbc/universal-bundle/) offers more than 20 drivers for popular DBMSs, helping you optimize time and resource usage. Tags [odbc driver](https://blog.devart.com/tag/odbc-driver) [Oracle](https://blog.devart.com/tag/oracle) [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations."} {"url": "https://blog.devart.com/how-to-connect-to-azure-sql-database-using-azure-private-link.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Connect to Azure SQL Database Using Azure Private Link By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters) July 23, 2024 [0](https://blog.devart.com/how-to-connect-to-azure-sql-database-using-azure-private-link.html#respond) 1057 Azure Private Link is a secure means of accessing Azure PaaS services (including Azure SQL Database and Azure Storage) over a private endpoint in a virtual network. In other words, you can create your own private link service in your virtual network and deliver it to your customers without exposing it to the public internet. And if you need a tool to develop and manage Azure databases in this environment, there's no better option than [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). Contents: The benefits of using Azure Private Link; How to create an Azure Private Link; How to connect to Azure SQL Database in dbForge Studio for SQL Server. The benefits of using Azure Private Link Azure Private Link is best illustrated through the benefits it delivers. Here are some of them. You can use private endpoints to connect your virtual network to all services that can be used as application components in Azure.
- You can easily access services running in Azure from on-premises over ExpressRoute private peering, VPN tunnels, and peered virtual networks, all via private endpoints.
- You are well-protected against possible data leakage, since a private endpoint is mapped to an instance of a PaaS resource instead of the entire service. Thus, unwanted access to any other resource in the service is blocked.
- You can enable the same experience and functionality to render your service privately to consumers in Azure. By placing your service behind a standard Azure Load Balancer, you can enable it for Private Link. The consumer can then connect directly to your service using a private endpoint in their own virtual network.

If you would like to learn more about Azure Private Link, refer to [What is Azure Private Link](https://learn.microsoft.com/en-us/azure/private-link/private-link-overview).

How to create an Azure Private Link

To get started, log in to your Azure Portal account. Note that you can create an Azure Private Link for your existing Azure services as well as during the initiation of a new server or database. Here, we’ll focus on the case where you are using an already existing database server. Now let’s create an Azure Private Link for it.

1. Go to All Services > Networking > Private Link.
2. You will be taken to the Private Link Center page. Scroll down and click Create private endpoint at the bottom.
3. You will be greeted by the Create a private endpoint wizard. On its first page, called Basics, go to Project details and indicate your Subscription and Resource group. Then, under Instance details, enter the Name of your Azure instance; the Network Interface Name will be generated automatically. Make sure your Region is correct and hit Next: Resource.
4. On the Resource page, enter the details about the Azure resource you want to create a Private Link for. Once it’s done, proceed to the next page.
5. On the Virtual Network page, select your Virtual network from the list and move on. Note that if you don’t have a virtual network, you will need to create and configure it via Azure services > Virtual networks. To learn more about it, refer to Microsoft’s [Azure Virtual Network documentation](https://learn.microsoft.com/en-us/azure/virtual-network/).
6. On the DNS page, select Integrate with private DNS zone, make sure you have picked the correct Subscription and Resource group, and proceed to the next page.
7. On the Tags page, configure your Name/Value pairs, if necessary, and go to the final page.
8. The final page is called Review + create. There, you can double-check all the details you have configured in this wizard. If something needs further adjustments, go back to any required page via Previous. If everything is correct, click Create.

That’s how your Azure Private Link is created. Now you can use this Private Link to get connected to Azure SQL Database via Azure VM or Azure VPN.

How to connect to Azure SQL Database in dbForge Studio for SQL Server

Connecting to your Azure SQL Database instance is even easier via [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/), an all-in-one database IDE that perfectly suits SQL Server and Azure SQL alike. When you open the Studio for the first time, it will automatically suggest you connect to a database. Everything takes place in the Database Connection Properties window (which can be accessed at any moment from the Database menu > New Connection), where you simply enter the credentials to your Azure SQL Database instance. No additional tricks required. Optionally, you can click Test Connection. If the credentials are correct, you will be shown a corresponding message. Click Connect, and that will be it! Now you are free to manage your Azure databases from the Studio. Download dbForge Studio for a free 30-day trial today!
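Once the private endpoint and its private DNS zone are in place, any ODBC-capable client running inside the virtual network can reach the database through the `privatelink` host name instead of the public endpoint. Here is a minimal, hedged sketch of what such a connection string might look like (the server, database, and credentials below are placeholders, not values from this walkthrough):

```python
# Placeholder values; the privatelink FQDN resolves to the endpoint's private IP
# only when queried from inside the virtual network (or a peered/VPN-connected one).
server = "mydbserver.privatelink.database.windows.net"

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server=tcp:{server},1433;"
    "Database=mydb;"
    "Uid=azureuser;Pwd=YourStrongPassword;"
    "Encrypt=yes;TrustServerCertificate=no;"
)
print(conn_str)
# With pyodbc installed, and run from inside the VNet:
#   cnxn = pyodbc.connect(conn_str)
```

Traffic still terminates at the same logical server; only the network path changes, which is why nothing beyond the host name needs to change on the application side.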
Thank you for making it this far, and this is where we’d like to suggest you get some firsthand experience with dbForge Studio. After all, it delivers quite a few tools that are simply indispensable when it comes to database development and administration:

- Context-aware SQL code completion, formatting, and refactoring
- Debugging of T-SQL scripts, stored procedures, triggers, and functions
- Visual query building on diagrams that eliminates the need for coding
- Query performance optimization
- Version control of database schemas and static table data
- Comparison and synchronization of databases
- Data aggregation in pivot tables
- Streamlined data migration (including import and export in multiple formats)
- Generation of complete database documentation
- Generation of column-intelligent, realistic test data
- Automated unit testing
- Server performance monitoring in real time
- Automation of recurring operations from the command line

This is not an exhaustive list of the Studio’s features; a truly exhaustive one would take up too much space, so we’d rather invite you to [download the Studio for a free 30-day trial](https://www.devart.com/dbforge/sql/studio/download.html) and start exploring its boundless feature set today. It might as well become your one-in-a-million perfect toolset for Azure SQL Database. Give it a go and see for yourself.

By [Valentine Mostsevoy](https://blog.devart.com/author/valentine-winters)
How to Connect to MySQL in Delphi with MyDAC: A Comprehensive Guide

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), September 29, 2023
Connecting to MySQL databases from Delphi is crucial to many software development projects. This article will explore how to achieve this using MyDAC, a powerful component for Delphi developers. We will provide step-by-step instructions on installing MyDAC, demonstrate its usage with practical examples, and compare it with FireDAC to highlight its advantages.

Installing MyDAC

About the product

[MyDAC](https://www.devart.com/mydac/) is a set of database components for Delphi, C++Builder, and Lazarus that provides native connectivity to MySQL databases. Devart develops MyDAC, which offers a wide range of benefits and features, making it a popular choice for developers working with MySQL databases in the Delphi environment.

Benefits of MyDAC:

- High Performance: MyDAC is optimized for performance, ensuring efficient data access to MySQL databases. It utilizes native MySQL client libraries, which means it can perform better than generic database components.
- Stability and Reliability: MyDAC is known for its stability and reliability. It undergoes rigorous testing to ensure it works seamlessly with MySQL databases, reducing the risk of application crashes or data corruption.
- Cross-Platform Compatibility: MyDAC supports multiple Delphi and C++Builder versions and platforms, including Windows, macOS, iOS, Android, and Linux. This cross-platform compatibility allows developers to create applications for various operating systems.
- Full MySQL Compatibility: MyDAC provides comprehensive support for MySQL-specific features and data types, ensuring you can fully utilize MySQL’s capabilities in your application.
- Advanced Connection Pooling: MyDAC includes built-in connection pooling, which can significantly improve the efficiency of database connections in multi-user applications. Connection pooling helps manage and reuse database connections, reducing overhead and improving performance.
- Unicode Support: MyDAC fully supports Unicode, making it suitable for applications that require internationalization and localization.
- Rich Data Access Components: MyDAC includes many components for working with MySQL databases, including TMyQuery, TMyTable, TMyStoredProc, and more. These components simplify database operations and provide a consistent and intuitive API.
- Visual Query Builder: MyDAC includes a visual query builder that allows you to create SQL queries graphically. This feature is handy for developers who are not SQL experts.
- Detailed Documentation: Devart provides comprehensive documentation, including user guides, tutorials, and reference materials, to help developers get started with MyDAC and make the most of its features.
- Responsive Support: Devart offers responsive customer support to assist developers with any issues or questions they may have while using MyDAC.

Features of MyDAC:

- Direct Connectivity: MyDAC establishes direct connections to MySQL servers, eliminating the need for additional middleware or database drivers.
- SQL Support: MyDAC supports SQL for creating, retrieving, updating, and deleting data in MySQL databases.
- Transaction Support: MyDAC allows you to work with transactions, ensuring data consistency and integrity in your applications.
- BLOB Streaming: MyDAC provides efficient support for working with binary large objects (BLOBs) and allows for streaming of BLOB data.
- Data Compression: MyDAC offers data compression options, reducing the amount of data transferred between the application and the database server, which can lead to improved performance.
- Database Encryption: MyDAC supports MySQL server encryption features, enhancing the security of data stored in the database.
- DataSet Integration: MyDAC seamlessly integrates with Delphi’s TDataSet-based data access architecture, making it easy to work with data-aware components in your user interface.
- Event Handling: MyDAC includes event handlers that allow you to respond to database events, such as data changes or errors.
- Automatic Error Handling: MyDAC provides automatic error handling and reporting, simplifying the debugging process.
- Data Export and Import: MyDAC allows easy export and import between MySQL databases and various data formats.

Installation

To get started with MyDAC, visit the official Devart website and [download the latest version of MyDAC](https://www.devart.com/mydac/download.html) for Delphi. Run the MyDAC installer and follow the installation wizard. During installation, select the Delphi versions you want to integrate MyDAC with.

Integration with Delphi

1. Open the Delphi IDE.
2. Navigate to “Component” > “Install Packages” in the IDE menu.
3. Click the “Add” button and browse to the MyDAC package (e.g., “MyDACXE12.dpk”) located in the installation directory.
4. Click “Open” and then “Compile.”
5. After successful compilation, click “Install.” MyDAC will now be integrated into Delphi.

Connecting to MySQL with MyDAC

Now that we have MyDAC installed, let’s explore how to connect to a MySQL database.

Step 1: Create a New Delphi Application

Launch Delphi and create a new VCL Forms Application.

Step 2: Add MyDAC Components

Go to the “Tool Palette” on the Delphi form and locate the “MyDAC” tab. Drag and drop the TMyConnection component onto the form. This will be used to establish a connection to the MySQL database.

Step 3: Configure the MyDAC Connection

Select the TMyConnection component on the form. In the Object Inspector, set the Server property to the MySQL server’s address. Set the Username and Password properties to your MySQL credentials. Specify the Database you want to connect to.

Step 4: Establish the Connection

Create a button on the form for connecting to MySQL. Double-click the button to open the code editor.
Use the following code to establish the connection:

```pascal
procedure TForm1.ConnectButtonClick(Sender: TObject);
begin
  MyConnection1.Connected := True;
  if MyConnection1.Connected then
    ShowMessage('Connected to MySQL!')
  else
    ShowMessage('Failed to connect.');
end;
```

Step 5: Disconnecting from MySQL

You can also add a button to disconnect from the MySQL server. Here’s an example of how to do it:

```pascal
procedure TForm1.DisconnectButtonClick(Sender: TObject);
begin
  MyConnection1.Connected := False;
  ShowMessage('Disconnected from MySQL.');
end;
```

MyDAC vs. FireDAC: Advantages of MyDAC

- Performance: MyDAC is known for its high performance, making it suitable for demanding applications that require efficient database access.
- Stability: MyDAC offers stable and reliable database connectivity, reducing the risk of unexpected crashes.
- Cross-Platform Support: MyDAC supports various Delphi versions and platforms, ensuring flexibility in development.
- Rich Feature Set: MyDAC provides many features, including advanced connection pooling, data compression, and support for MySQL-specific features.
- Support and Documentation: Devart, the company behind MyDAC, offers excellent support and comprehensive documentation, making it easier for developers to get assistance and learn.

Conclusion

In this article, we’ve explored the numerous benefits and features of MyDAC, a robust and efficient set of database components designed for Delphi, C++Builder, and Lazarus. MyDAC offers native connectivity to MySQL databases, ensuring high performance, stability, and cross-platform compatibility. With advanced features like connection pooling, Unicode support, a visual query builder, and comprehensive documentation, MyDAC is an excellent choice for developers seeking seamless MySQL integration in their applications.
It’s important to note that while MyDAC is a powerful [DAC](https://www.devart.com/dac.html) solution for MySQL, other DAC products are also available in the market, each tailored to specific database systems. When choosing a DAC for your project, it’s essential to consider the particular requirements of your database and development environment. Devart, the company behind MyDAC, offers a range of DAC products for different databases, so you can explore their offerings and select the one that best fits your needs. Whether you’re working with MySQL or other database systems, DAC components can significantly simplify database interactions and enhance the efficiency of your applications. Besides, Devart is also the creator of the dbForge product line comprising professional database tools that allow you to [connect to MySQL](https://blog.devart.com/how-to-connect-to-mysql-server.html) and other major database systems and perform all kinds of tasks on database development, management, and administration.
How to Connect to a MySQL Database

By [dbForge Team](https://blog.devart.com/author/dbforge), October 17, 2024

You can connect to MySQL Server using the MySQL Command-Line Client or tools with graphical user interfaces.
In this article, we consider each method in detail:

- How to connect to MySQL using the MySQL Command-Line Client
- How to connect to MySQL using dbForge Studio for MySQL
- How to connect to MySQL using MySQL Workbench
- How to connect to MySQL using Sequel Ace

How to connect to MySQL using the Command-Line Client

In the first article of our series, we provided a detailed walkthrough outlining how to install MySQL Server on Windows. This guide assumes that your MySQL server is already up and running. [MySQL Command-Line Client](https://blog.devart.com/mysql-command-line-client.html) is the default CLI utility included with every MySQL installation. This solution allows you to perform standard operations such as connecting to the database; creating, editing, and deleting databases and tables; retrieving and filtering data; etc. By default, the MySQL Command-Line Client is installed together with MySQL Server. To check if you have it on the machine, search for it in the Apps section (we are using Windows 11; if you have a different Windows version, search for this utility using the standard methods). Launch the MySQL Command-Line Client. You will be asked to enter the root user password you set during the MySQL installation; it is required to connect to your MySQL server. After successfully connecting to the MySQL server, you can execute commands. Assume that you want to [check the list of databases](https://www.devart.com/dbforge/mysql/studio/how-to-show-all-database-list-in-mysql.html) on the server.
Execute the SHOW DATABASES command. You can [create a new database](https://blog.devart.com/creating-a-new-database-in-mysql-tutorial-with-examples.html) with the help of the CREATE DATABASE command. To connect to a specific MySQL database and work with it, execute the USE command and specify the name of the database you want to access. You can [create a new table](https://blog.devart.com/mysql-create-table-query.html) and then populate it with data using the CREATE TABLE and INSERT INTO commands. Finally, when all the tasks you wanted to perform during this session are complete, type QUIT and press Enter to leave the MySQL client.

How to connect to a MySQL database with a GUI

Graphical user interface (GUI) tools are increasingly replacing command-line tools due to their ease of use. These tools simplify database operations by offering a visual mode, significantly reducing the learning curve. As a result, users can perform essential database tasks without needing to write code, relying on buttons, controls, and drag-and-drop functionality. Even experienced database professionals with strong SQL skills are turning to GUI tools more often. Such tools help them speed up workflows and reduce manual routine tasks. Advanced solutions like dbForge Studio for MySQL, for instance, offer robust scheduling and automation capabilities, saving users’ time and letting them focus on more challenging and essential tasks instead. Can GUI tools connect to MySQL databases? Absolutely. Since GUI tools are designed to handle various database-related tasks, connecting to databases is a fundamental feature. Let’s explore how to connect to databases using popular tools like dbForge Studio for MySQL and MySQL Workbench.

How to connect using dbForge Studio for MySQL

[dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) is a comprehensive integrated development environment (IDE) designed for MySQL and MariaDB databases.
It offers a powerful set of tools that enable users to perform a wide range of tasks, from [writing SQL code](https://www.devart.com/dbforge/mysql/studio/sql-coding.html) to [managing source control](https://www.devart.com/dbforge/mysql/studio/mysql-version-control.html), all within a single solution. The user-friendly, customizable graphical interface allows for quick mastery of the tools and configuring tasks in just a few clicks. Additionally, the ability to automate tasks with precision makes dbForge Studio a preferred choice among MySQL professionals of all skill levels, helping them save both time and resources. To connect to MySQL Server using dbForge Studio for MySQL:

1. Open the Database Connection Properties dialog box in one of the following ways: click New Connection on the Database menu, or click the New Connection button on the Connection toolbar.
2. Fill in the connection details: specify the connection type and enter the host, the port, the user, and the password. Optionally, you can also specify the default database to connect to. The Connection Name will be generated automatically from the Host name. However, you can set a distinctive name for your new connection, if required.
3. In most cases, it is enough to enter the connection details on the General tab to successfully connect to the MySQL server and access the databases. Click Test Connection to verify the details, and dbForge Studio for MySQL will connect to the server.

Optionally, you can adjust connection settings more precisely in the Database Connection Properties window. On the Advanced tab, you can specify Connection timeout and Execute timeout values in seconds. Here, you can also select the Encoding type from a drop-down list and enable the Detect MySQL character set, Use compression, and Keep connection alive options. On the Security tab, you can configure security properties, such as SSL or [SSH](https://blog.devart.com/connecting-to-mysql-with-putty-and-ssh-tunnels.html) security properties.
This tab also allows you to enter the client key, the client certificate, and the authority certificate to increase the security level. On the HTTP tab, you can configure HTTP tunnel properties and provide the additional details needed to establish the connection securely. When all settings are configured, click Test Connection or Connect. As you can see, dbForge Studio for MySQL offers a visual and straightforward method to connect to MySQL Server. It also gives you more control over connection configurations.

How to connect using MySQL Workbench

MySQL Workbench is a popular visual tool for database architects, developers, and DBAs. It is the default IDE for MySQL, a free and highly functional tool, though its [functionality is not as robust](https://www.devart.com/dbforge/mysql/studio/alternative-to-mysql-workbench.html) as that of dbForge Studio. To access MySQL Server using Workbench:

1. Run MySQL Workbench. On the Database menu, click Connect to Database. Alternatively, click the plus icon next to the MySQL Connections label.
2. In the Setup New Connection window, specify the Connection Name and provide the hostname, port, and username.
3. (Optional) Go to the SSL tab to configure SSL connection settings.
4. (Optional) Go to the Advanced tab to configure advanced connection settings.
5. You can click Test Connection to check the parameters that you’ve entered. Once you are sure that all the credentials are correct, click OK. Enter the password. After connecting, you will see the list of databases on the left.

How to connect using Sequel Ace

Sequel Ace is a widely used open-source GUI tool designed for managing MySQL and MariaDB databases on macOS, appreciated by MySQL specialists for its functionality and ease of use. To connect to MySQL using Sequel Ace, follow these steps:

1. Launch Sequel Ace and click Quick Connect.
2. In the connection dialog window, select TCP/IP to establish a standard connection.
3. Provide a name for the connection and enter the following credentials: Host, Username, Password (if applicable), Database (optional), and Port.
4. Click Test Connection to verify the details, or click Connect to proceed directly.

Note: If your environment requires an SSH connection, choose the SSH option in the connection dialog window and provide the following additional details: SSH Host, SSH User, SSH Password, and SSH Port.

Conclusion

In this article, we have presented several ways to connect to MySQL Server, including the MySQL Command-Line Client and the visual tools dbForge Studio for MySQL, MySQL Workbench, and Sequel Ace. The choice of solution depends on your personal preferences and work requirements, though modern GUI tools offer significant benefits far exceeding the task of connecting to databases. In particular, dbForge Studio for MySQL offers an all-embracing toolset to perform all tasks on database development, management, and administration within one piece of software. You can try dbForge Studio for MySQL in your workflow and evaluate its capabilities under a full workload. The [fully functional trial of the Studio](https://www.devart.com/dbforge/mysql/studio/download.html) is available for 30 days.
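Beyond interactive clients, the same connection can also be made programmatically. As a minimal sketch, assuming the third-party mysql-connector-python package and a reachable server (all connection details below are placeholders), the SHOW DATABASES step from the command-line walkthrough looks like this:

```python
# Placeholder connection details for a local MySQL server.
config = {
    "host": "localhost",
    "port": 3306,
    "user": "root",
    "password": "your_password",
}

def show_databases(cfg):
    """Return the list of database names, mirroring SHOW DATABASES in the CLI."""
    # Requires: pip install mysql-connector-python, plus a running server.
    import mysql.connector
    conn = mysql.connector.connect(**cfg)
    try:
        cur = conn.cursor()
        cur.execute("SHOW DATABASES")
        return [name for (name,) in cur.fetchall()]
    finally:
        conn.close()

# With a running server: print(show_databases(config))
```

The connection parameters are the same ones every client in this article asks for (host, port, user, password), which is why switching between a GUI tool and a script requires no server-side changes.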
How to Connect to Oracle in Delphi with Devart ODAC

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam), September 30, 2023

Delphi is a powerful programming language for developing Windows applications, and Oracle is a popular database management system. Connecting Delphi to Oracle databases is a common requirement for many software developers. This article will explore how to connect to Oracle in Delphi using the Devart ODAC library. We’ll cover the installation of Devart ODAC, provide concrete examples of its usage, and even compare it to FireDAC, another popular database access framework for Delphi.

Installing Devart ODAC

[Devart ODAC](https://www.devart.com/odac/) is a set of components and libraries for Delphi and C++Builder that provides native connectivity to Oracle databases. ODAC is designed to simplify database application development, offering a wide range of features and benefits:

- Native Oracle Connectivity: ODAC offers native, direct access to Oracle databases without additional middleware or ODBC drivers. You can take full advantage of Oracle’s features and performance optimizations.
- High Performance: ODAC is optimized for performance, making it suitable for applications that require fast and efficient data access. It leverages Oracle-specific features to achieve high-speed data retrieval and manipulation.
- Broad Compatibility: ODAC is compatible with various versions of Oracle Database, including Oracle 8, 8i, 9i, 10g, 11g, 12c, and 19c. It ensures your applications can connect to both older and newer Oracle database systems.
- Oracle-specific Components: ODAC includes components like TOracleConnection, TOracleQuery, and TOracleStoredProc. These components make it easier to work with Oracle-specific features such as PL/SQL and LOBs (Large Objects).
- Advanced Querying: ODAC supports advanced querying capabilities, including complex SQL queries and stored procedure execution. It also provides features like query parameter binding for enhanced security and performance.
- Unicode Support: ODAC fully supports Unicode, allowing you to work with multilingual data and develop internationalized applications without compatibility issues.
- Data Type Mapping: ODAC automatically maps Delphi data types to Oracle data types, simplifying data type conversions and ensuring data integrity.
- Secure Connectivity: ODAC supports secure connections to Oracle databases using features like SSL/TLS encryption and SSH tunneling, ensuring the confidentiality and integrity of data during transmission.
- IDE Integration: ODAC seamlessly integrates with the Delphi and C++Builder IDEs, providing a familiar and efficient development experience.
- Cross-platform Compatibility: ODAC is aimed primarily at Windows development, though it can also be used in cross-platform projects when targeting Windows.
- Support and Documentation: Devart provides comprehensive documentation, tutorials, and a support forum to help developers get started with ODAC and troubleshoot any issues they encounter.
- Commercial and Free Editions: The free edition includes core features and is suitable for small projects or learning purposes, while the commercial edition provides additional features and support for larger, production-grade applications.

Before you can start using Devart ODAC to connect to Oracle databases in Delphi, you need to install the components. Follow these steps:

1. Download Devart ODAC: Visit the Devart website to [download the ODAC](https://www.devart.com/odac/download.html) package suitable for your Delphi version.
2. Run the Installer: Execute the downloaded installer and follow the installation wizard's instructions.
During installation, select the Delphi versions you want to integrate ODAC with.
3. Verify Installation: After installation, open Delphi and check that the Devart ODAC components are available in the IDE.

Connecting to Oracle Database

Now that Devart ODAC is installed, let's connect to an Oracle database using Delphi.

```delphi
uses
  ..., ODAC.Oracle;

procedure ConnectToOracle;
var
  OracleConnection: TOracleConnection;
begin
  OracleConnection := TOracleConnection.Create(nil);
  try
    OracleConnection.Server := 'YourOracleServerAddress';
    OracleConnection.Username := 'YourUsername';
    OracleConnection.Password := 'YourPassword';
    OracleConnection.Connect;
    if OracleConnection.Connected then
      ShowMessage('Connected to Oracle Database!')
    else
      ShowMessage('Failed to connect to Oracle Database.');
  finally
    OracleConnection.Free;
  end;
end;
```

In the code snippet above, we import the ODAC components and establish a connection to an Oracle database. Replace 'YourOracleServerAddress', 'YourUsername', and 'YourPassword' with the appropriate database server information.

Working with Devart ODAC

Devart ODAC provides many features for working with Oracle databases in Delphi. Here are a few common tasks:

Querying the Database

You can use the TOracleQuery component to execute SQL queries against the Oracle database.

```delphi
uses
  ..., ODAC.Oracle;

// The connection created earlier is passed in explicitly so the
// query can reference it.
procedure ExecuteSQLQuery(OracleConnection: TOracleConnection);
var
  OracleQuery: TOracleQuery;
begin
  OracleQuery := TOracleQuery.Create(nil);
  try
    OracleQuery.Connection := OracleConnection; // use the previously established connection
    OracleQuery.SQL.Text := 'SELECT * FROM YourTable';
    OracleQuery.Open;
    // Process the query results
  finally
    OracleQuery.Free;
  end;
end;
```

Executing Stored Procedures

Devart ODAC supports calling Oracle stored procedures with ease.
```delphi
uses
  ..., ODAC.Oracle;

// The connection created earlier is passed in explicitly.
procedure ExecuteStoredProcedure(OracleConnection: TOracleConnection);
var
  OracleStoredProc: TOracleStoredProc;
begin
  OracleStoredProc := TOracleStoredProc.Create(nil);
  try
    OracleStoredProc.Connection := OracleConnection; // use the established connection
    OracleStoredProc.StoredProcName := 'YourProcedure';
    OracleStoredProc.Prepare;
    OracleStoredProc.ExecProc;
    // Process the stored procedure results or output parameters
  finally
    OracleStoredProc.Free;
  end;
end;
```

Comparing Devart ODAC with FireDAC

FireDAC is a database access framework developed by Embarcadero Technologies for Delphi and C++Builder, two popular integrated development environments (IDEs) for Windows application development. Although its name references Firebird/InterBase Database Access Components, FireDAC is not limited to Firebird and InterBase; it provides access to a wide range of database management systems, making it a versatile and comprehensive tool for database connectivity. While Devart ODAC and FireDAC are both popular choices for database access in Delphi, they differ in focus.

Devart ODAC:
- Offers direct access to Oracle databases
- Is optimized for Oracle-specific features
- Provides a comprehensive set of Oracle-specific components and features
- Is preferred when Oracle database compatibility is essential

FireDAC:
- Offers universal database access, supporting multiple database management systems
- Provides a consistent API for various databases, making it easier to switch between them
- Is ideal for projects that require flexibility in terms of database platforms

The choice between Devart ODAC and FireDAC depends on your specific project requirements. If you primarily work with Oracle databases, Devart ODAC may be the better choice due to its tailored Oracle support.

Conclusion

In this article, we explored how to connect to Oracle databases in Delphi using [Devart ODAC](https://www.devart.com/dac.html).
We covered the installation process, connecting to the database, and working with Devart ODAC components. Additionally, we compared Devart ODAC to FireDAC, highlighting the strengths of each library. Devart ODAC is a robust choice for Delphi developers who need efficient and feature-rich access to Oracle databases. By following the steps and examples provided in this article, you can seamlessly integrate Devart ODAC into your Delphi projects and start working with Oracle databases with ease.

Tags [delphi](https://blog.devart.com/tag/delphi) [MySQL](https://blog.devart.com/tag/mysql) [ODAC](https://blog.devart.com/tag/odac) [SQL Server](https://blog.devart.com/tag/sql-server)

[Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) A true connectivity enthusiast, always on the lookout for smarter ways to link platforms and systems. Passionate about sharing the latest solutions and best practices to help you set up seamless, efficient integrations.

[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to)

How to Connect to PostgreSQL in Delphi with Devart PgDAC

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) November 23, 2023

PostgreSQL is a popular open-source relational database management system (RDBMS) that is widely used for building robust and scalable applications. When developing applications in Delphi, a powerful tool like Devart PgDAC (PostgreSQL Data Access Components) can greatly simplify the process of connecting to and interacting with PostgreSQL databases. In this article, we will guide you through the steps to connect to PostgreSQL in Delphi using Devart PgDAC. We will also compare PgDAC with a similar product from another brand to highlight its advantages.

Installing Devart PgDAC

[Devart PgDAC](https://www.devart.com/pgdac/) is a set of Delphi components that facilitate seamless connectivity and interaction with PostgreSQL databases in Delphi applications. It is designed to simplify database development and offer advanced features for developers who work with PostgreSQL.

Key Benefits and Features of PgDAC

- Direct Mode: PgDAC provides a Direct Mode that allows applications to work with PostgreSQL databases without using the PostgreSQL client library.
This results in better performance and reduced dependencies.
- Cross-Platform Support: PgDAC supports various versions of Delphi and C++Builder and is compatible with both 32-bit and 64-bit Windows platforms, so developers can work on different systems and target platforms.
- Visual Query Builder: PgDAC includes a visual query builder, which enables developers to design SQL queries visually without writing complex code. This feature is particularly useful for those who are new to SQL or prefer a more intuitive approach.
- Advanced Connection Pooling: PgDAC offers advanced connection pooling, which helps manage database connections efficiently. Connection pooling can significantly enhance application performance, especially in multi-user environments.
- [Batch Updates](https://blog.devart.com/using-batch-operations-in-delphi-data-access-components.html): Batch operations dramatically increase the speed of data operations. Moreover, in contrast to the Loader, batch operations can be used not only for insertion but also for modification and deletion.
- Data Type Mapping: PgDAC provides comprehensive data type mapping between PostgreSQL and Delphi data types, ensuring that data is handled correctly when transferred between the database and the application.
- SSL Support: PgDAC supports SSL encryption for secure database connections, which is essential for protecting sensitive data in applications.
- BLOB Streaming: PgDAC allows developers to work with BLOB (Binary Large Object) data using streaming, which is efficient for handling large binary data such as images or documents.
- Unicode Support: PgDAC fully supports Unicode, ensuring compatibility with international character sets and languages.
- Automatic Query Execution: Developers can use PgDAC's automatic query execution feature, simplifying the process of running SQL statements and retrieving results.
- Support for PostgreSQL-Specific Features: PgDAC is designed specifically for PostgreSQL, so it provides easy access to PostgreSQL-specific features and functions, including JSONB, hstore, and more.
- Documentation and Support: Devart offers comprehensive documentation, examples, and support resources to assist developers in using PgDAC effectively.

Primary Consumers of PgDAC

- Delphi and C++Builder Developers: PgDAC is primarily targeted at developers who use Embarcadero Delphi or C++Builder for Windows application development and need to work with PostgreSQL databases. It simplifies database connectivity and management within these development environments.
- Business and Enterprise Application Developers: Developers working on business and enterprise-level applications that rely on PostgreSQL as the backend database can benefit from PgDAC's features and optimizations. These applications often require secure, efficient, and feature-rich database connectivity.
- ISVs (Independent Software Vendors): ISVs who build software products for a broad customer base may choose PgDAC to ensure that their applications can connect to PostgreSQL seamlessly. This component helps ISVs maintain database compatibility and performance across various customer environments.
- Database Administrators: Database administrators responsible for maintaining PostgreSQL databases may find PgDAC useful for developing custom database management tools and utilities.
- Startups and Small Businesses: PgDAC can also be valuable for startups and small businesses looking to build cost-effective, high-performance applications that use PostgreSQL as the database backend.

To start working with PostgreSQL in Delphi using PgDAC, you first need to install the component. Follow these steps:

1. Download and Install Devart PgDAC: Visit the Devart website to [download the latest version of PgDAC](https://www.devart.com/pgdac/download.html).
Once downloaded, run the installer and follow the on-screen instructions to complete the installation.
2. Launch Delphi: Open Delphi with an existing project, or create a new one.
3. Add PgDAC to Your Project: Click "Component" in the Delphi menu and select "Install Packages." Click "Add" and browse to the location where PgDAC was installed, usually "C:\Program Files (x86)\Devart\PgDAC for RAD Studio XE\xx\Bin." Select "PgDACxxx.bpl" (where "xxx" is the version number), click "Open," and then "OK."
4. Add PgDAC Components to Your Form: Open the form you want to work with. In the "Tool Palette" on the left, you will find a list of PgDAC components. Drag and drop the TPgConnection component onto your form.
5. Configure the PgDAC Connection: Double-click the TPgConnection component to open its properties. Set the Database property to the name of your PostgreSQL database and the Server property to the hostname or IP address of your PostgreSQL server. Enter the User and Password properties with the appropriate credentials. You can also specify other connection properties, such as Port and Protocol, if needed. Click "OK" to save the settings.

Connecting to PostgreSQL

Now that you have PgDAC installed and configured, let's see how to connect to a PostgreSQL database:

```delphi
procedure TForm1.ConnectToPostgreSQL;
begin
  PgConnection1.Connected := True; // Connect to PostgreSQL
  if PgConnection1.Connected then
    ShowMessage('Connected to PostgreSQL!')
  else
    ShowMessage('Failed to connect.');
end;
```

In the above code, PgConnection1 is the instance of the TPgConnection component you added to your form. By setting the Connected property to True, you establish a connection to the PostgreSQL database.

Working with PgDAC

Once connected, you can perform various database operations using PgDAC, such as querying data, inserting records, updating, and deleting data.
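For instance, inserting a record with bound parameters might look like the following sketch. The TPgQuery component and :name-style parameter markers follow PgDAC's component naming conventions, and the table and column names are hypothetical; verify the exact API against the PgDAC documentation for your version.

```delphi
// Hypothetical sketch: an INSERT with bound parameters instead of
// string concatenation. Table and column names are made up.
procedure InsertCustomer(const AName, AEmail: string);
var
  Query: TPgQuery;
begin
  Query := TPgQuery.Create(nil);
  try
    Query.Connection := PgConnection1; // the TPgConnection from the form
    Query.SQL.Text := 'INSERT INTO customers (name, email) VALUES (:name, :email)';
    Query.ParamByName('name').AsString := AName;   // values are bound,
    Query.ParamByName('email').AsString := AEmail; // not concatenated
    Query.Execute;
  finally
    Query.Free;
  end;
end;
```

Binding values as parameters avoids manual quoting and protects against SQL injection.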
Here's a simple example of executing an SQL query:

```delphi
procedure TForm1.ExecuteQuery;
var
  Query: TPgQuery;
begin
  Query := TPgQuery.Create(nil);
  try
    Query.Connection := PgConnection1; // Set the connection
    Query.SQL.Text := 'SELECT * FROM your_table';
    Query.Open; // Execute the query
    // Process the query results here
  finally
    Query.Free; // Release resources
  end;
end;
```

In this code, we create a TPgQuery object, associate it with PgConnection1, set the SQL query, and open it to fetch the data.

Comparing PgDAC with a Similar Product

Now, let's compare Devart PgDAC with a similar product from another brand to highlight its advantages.

PgDAC vs. FireDAC: Devart PgDAC is a specialized component optimized for PostgreSQL database interactions, while FireDAC is a more generic component that supports multiple database systems, making it less optimized for PostgreSQL-specific features. PgDAC provides better integration and performance for PostgreSQL, as well as lower memory consumption. A direct performance comparison between PgDAC and FireDAC has not been carried out, but there is an [article](https://blog.devart.com/unidac-vs-firedac-performance-and-memory-consumption-comparison.html#PostgreSQL) that compares the performance and memory consumption of UniDAC and FireDAC in detail, showing that UniDAC is considerably faster and consumes much less memory than FireDAC. Since PgDAC and UniDAC share a common code base, and therefore have comparable performance and memory consumption, PgDAC is likewise much faster than FireDAC and consumes significantly less memory.

Conclusion

[Devart PgDAC](https://www.devart.com/dac.html) simplifies the process of connecting to and working with PostgreSQL databases in Delphi. By following the steps outlined in this article, you can quickly set up your development environment and leverage PgDAC's capabilities for efficient database interactions.
When compared to similar products, PgDAC stands out with its PostgreSQL-specific optimizations and advanced features, making it a valuable tool for Delphi developers working with PostgreSQL.

Tags [dac](https://blog.devart.com/tag/dac) [delphi](https://blog.devart.com/tag/delphi) [pgdac](https://blog.devart.com/tag/pgdac) [PostgreSQL](https://blog.devart.com/tag/postgresql)

[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to)

How to Connect to SQL Server in Delphi with Devart SDAC

By [Anastasiia Lijnis Huffenreuter](https://blog.devart.com/author/anastasiiam) November 18, 2023

Delphi is a popular programming language for developing Windows applications, and connecting to databases like SQL Server is a common requirement for many software projects. Devart SDAC is a powerful set of components that simplifies database connectivity in Delphi applications. In this article, we will walk you through the process of connecting to SQL Server using Devart SDAC, including installation and examples of how to interact with the components. We will also briefly compare SDAC with FireDAC to help you choose the right tool for your project.

Installing Devart SDAC

Devart SDAC is a comprehensive set of [Delphi components](https://www.devart.com/dac.html) that provides native connectivity to Microsoft SQL Server databases. It simplifies and enhances database development in Delphi applications by offering a wide range of features and benefits.

SDAC Features:

- Direct Connectivity: Devart SDAC offers direct and native access to Microsoft SQL Server databases without the need for additional libraries or drivers. This direct connectivity results in faster and more efficient data access.
- Cross-Version Compatibility: SDAC supports various versions of Microsoft SQL Server, ensuring that your applications can connect to older and newer database servers seamlessly.
- Wide Range of Data Types: SDAC provides support for a wide range of SQL Server data types, including user-defined types and table-valued parameters, making it suitable for applications dealing with complex data structures.
- Advanced SQL Support: Devart SDAC allows you to execute SQL queries with ease. It supports complex SQL statements, stored procedures, and functions, enabling you to work with your SQL Server databases efficiently.
- Performance Optimization: SDAC includes features such as connection pooling, query caching, and asynchronous queries to optimize the performance of your database operations, resulting in faster data retrieval and processing.
- Unified Data Access: SDAC offers a unified approach to data access, allowing you to work with SQL Server data using the same components and code structure, regardless of the Delphi version you're using.
- Visual Query Builder: Devart SDAC includes a visual query builder tool that simplifies the process of creating SQL queries, reducing the need for manual coding and potential errors.
- Secure Data Transmission: SDAC supports data encryption and Secure Sockets Layer (SSL) connections, ensuring that your data remains secure during transmission between your Delphi application and the SQL Server database.
- BLOB Data Handling: Devart SDAC provides efficient handling of Binary Large Object (BLOB) data, making it suitable for applications that store and retrieve large files or multimedia content.

SDAC Benefits:

- Improved Productivity: Devart SDAC streamlines database development in Delphi, allowing developers to focus on application logic rather than dealing with low-level database connectivity details.
- High Performance: The direct connectivity and optimization features of SDAC result in faster database operations, which is crucial for applications that require quick data retrieval and processing.
- Cross-Platform Compatibility: SDAC is compatible with multiple Delphi versions and works on both Windows and macOS, making it suitable for cross-platform development.
- Secure Data Handling: With support for data encryption and secure connections, Devart SDAC ensures that sensitive data remains protected throughout the data transmission process.
- Flexibility: Devart SDAC's support for a wide range of SQL Server data types and features allows developers to build versatile applications that can handle various data scenarios.
- Visual Query Building: The visual query builder simplifies SQL query creation, making it accessible to developers of varying skill levels and reducing the chances of SQL syntax errors.
- Vendor Support: Devart offers excellent customer support and regular updates for SDAC, ensuring that your database connectivity remains reliable and up to date.

Before you can start using Devart SDAC, you need to install it on your development machine. Follow these steps:

1. Visit the Devart website and [download the SDAC package](https://www.devart.com/sdac/download.html) suitable for your Delphi version.
2. Run the installer and follow the on-screen instructions.
3. After installation, open the Delphi IDE.

Creating a Connection

Now that you have Devart SDAC installed, let's establish a connection to your SQL Server database.
```delphi
uses
  MSAccess; // SDAC unit that declares TMSConnection and TMSQuery

procedure ConnectToSQLServer;
var
  Connection: TMSConnection;
begin
  Connection := TMSConnection.Create(nil);
  try
    Connection.Server := 'YourServerName';
    Connection.Database := 'YourDatabaseName';
    Connection.Username := 'YourUsername';
    Connection.Password := 'YourPassword';
    Connection.Connect;

    // Connection established successfully; it is kept open
    // so that queries can use it later.
  except
    on E: Exception do
    begin
      ShowMessage('Connection error: ' + E.Message);
      Connection.Free;
    end;
  end;
end;
```

Replace 'YourServerName', 'YourDatabaseName', 'YourUsername', and 'YourPassword' with your SQL Server credentials.

Executing SQL Queries

Once connected, you can execute SQL queries using SDAC. Here's an example of how to execute a simple query:

```delphi
uses
  MSAccess;

// The connection created earlier is passed in explicitly.
procedure ExecuteSQLQuery(Connection: TMSConnection);
var
  Query: TMSQuery;
begin
  Query := TMSQuery.Create(nil);
  try
    Query.Connection := Connection; // Use the previously created connection
    Query.SQL.Text := 'SELECT * FROM YourTable';
    Query.Open;

    // Process the query results here

    Query.Close;
  finally
    Query.Free;
  end;
end;
```

Fetching Data

To fetch data from the query result, you can use a loop. Here's how you can retrieve records:

```delphi
while not Query.Eof do
begin
  // Access fields using Query.FieldByName('ColumnName').Value
  ShowMessage('Name: ' + Query.FieldByName('Name').AsString);
  Query.Next;
end;
```

Inserting, Updating, and Deleting Records

Devart SDAC also supports data manipulation.
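Embedding string literals directly in DML statements is easy to get wrong; as an alternative sketch, the same kind of statement can use bound parameters via ParamByName, a standard pattern across Devart DAC components (verify the exact API against the SDAC documentation; the table and column names here are the article's placeholders):

```delphi
// Sketch, assuming the TMSQuery component from the examples above.
procedure InsertWithParams(Query: TMSQuery);
begin
  Query.SQL.Text :=
    'INSERT INTO YourTable (Name, Age) VALUES (:Name, :Age)';
  Query.ParamByName('Name').AsString := 'John Doe'; // no quote doubling needed
  Query.ParamByName('Age').AsInteger := 30;
  Query.ExecSQL;
end;
```

Parameters also let SQL Server reuse the execution plan when the statement runs repeatedly with different values.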
Here's an example of how to insert, update, and delete records (note that single quotes inside a Delphi string literal must be doubled):

```delphi
procedure InsertUpdateDeleteRecords(Query: TMSQuery);
begin
  // Insert
  Query.SQL.Text := 'INSERT INTO YourTable (Name, Age) VALUES (''John Doe'', 30)';
  Query.ExecSQL;

  // Update
  Query.SQL.Text := 'UPDATE YourTable SET Age = 31 WHERE Name = ''John Doe''';
  Query.ExecSQL;

  // Delete
  Query.SQL.Text := 'DELETE FROM YourTable WHERE Name = ''John Doe''';
  Query.ExecSQL;
end;
```

Comparing Devart SDAC with FireDAC

FireDAC is a powerful and flexible database access framework developed by Embarcadero Technologies. It is an integral part of the Embarcadero RAD Studio (formerly Borland Delphi) and C++Builder integrated development environments. FireDAC simplifies database connectivity and provides a unified, high-performance access layer for a wide range of DBMS, making it a popular choice among Delphi and C++Builder developers. Devart SDAC and FireDAC are both excellent database connectivity components for Delphi. Here's a brief comparison to help you choose the right one for your project:

Devart SDAC:
- Specialized for Microsoft SQL Server
- Rich set of features, including advanced data type support
- Optimized for high-performance database access
- May require a separate license

FireDAC:
- Part of Delphi RAD Studio; no separate installation needed
- Supports a wide range of databases, including SQL Server
- Offers advanced features like data encryption and cross-platform development
- Included with RAD Studio, reducing licensing costs for some users

Ultimately, the choice between Devart SDAC and FireDAC depends on your specific project requirements and licensing considerations.

Conclusion

We've demonstrated how to connect to SQL Server in Delphi using [Devart SDAC](https://www.devart.com/sdac/). You've learned how to install the components, establish a database connection, execute SQL queries, fetch data, and manipulate records.
Additionally, we provided a brief comparison between Devart SDAC and FireDAC to help you make an informed choice for your database connectivity needs in Delphi projects. With these skills, you can confidently integrate SQL Server databases into your Delphi applications using Devart SDAC. Good luck, and feel free to contact our support team if you need any help with the settings!

Tags [delphi](https://blog.devart.com/tag/delphi) [MySQL](https://blog.devart.com/tag/mysql) [sdac](https://blog.devart.com/tag/sdac) [SQL Server](https://blog.devart.com/tag/sql-server)

[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools)

How to Connect to SQL Server Using SSMS, sqlcmd Utility, and dbForge Studio for SQL Server

By [dbForge Team](https://blog.devart.com/author/dbforge) December 29, 2021

In this article, we provide a step-by-step guide on how to connect to SQL Server and execute SQL queries using the sqlcmd utility, SQL Server Management Studio, and dbForge Studio for SQL Server.
Contents

- Connecting to MS SQL Server using SSMS
  - Connect to SQL Server using Windows Authentication
  - Connect to SQL Server using SQL Server Authentication
  - View a list of databases available on the server in SSMS
  - View a list of tables available in the selected database in SSMS
  - Retrieve data from the table in SSMS
- Connecting to SQL Server using the sqlcmd utility
  - Connect to a SQL Server instance using Windows Authentication
  - Connect to a SQL Server instance using SQL Server Authentication
  - View a list of databases available on the server using the sqlcmd utility
  - View a list of tables available in the selected database using the sqlcmd utility
  - Retrieve data from the table using the sqlcmd utility
- Connecting to SQL Server using dbForge Studio for SQL Server
  - View a list of databases available on the server using dbForge Studio for SQL Server
  - View a list of tables available in the selected database using dbForge Studio for SQL Server
  - Retrieve data from the table using dbForge Studio for SQL Server

Connecting to MS SQL Server using SSMS

SQL Server Management Studio (SSMS) is a powerful IDE with a set of tools for the management, configuration, administration, monitoring, and development of SQL Server and Azure SQL databases. In this section, we'll describe how to connect to SQL Server using Windows Authentication and SQL Server Authentication.

Prerequisites

To start working with SSMS, it should be installed on your computer. If it is not, [download](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms) and install it.

Connect to SQL Server using Windows Authentication

The Windows Authentication mode allows you to connect with the credentials of your Windows user account. To connect to the server, open SSMS. In Object Explorer, click Connect and select Database Engine from the drop-down list. Alternatively, click Connect Object Explorer.
In the Connect to Server dialog that opens, enter the following: Server name : Specify the name of the server to which you want to connect. If you don’t use the default instance MSSQLSERVER , specify the server name and instance name separated with a backslash. Authentication : Select Windows Authentication from the drop-down list. Your Windows domain login and password will be pulled up automatically. If you want to change additional connection properties, such as a network protocol, column encryption, connection string parameters, or connection and execution timeouts, click Options . After the connection properties are entered, click Connect . A new connection will be displayed in Object Explorer . Connect to SQL Server using SQL Server Authentication The SQL Server Authentication mode allows you to connect to the server by providing a SQL Server login and password. To connect to the server, open the connection manager by selecting Object Explorer > Connect > Database Engine from the drop-down list or by clicking Connect Object Explorer. In the Connect to Server dialog that opens, enter the following information: Server name : Specify the server name and instance name separated with a backslash. Authentication : Select SQL Server Authentication from the drop-down list. Login : Enter the SQL Server login. Password : Enter the password for that login. To save the password for later use, select the Remember password checkbox. To modify connection properties, such as a network protocol, column encryption, connection string parameters, or connection and execution timeouts, click Options . Once done, click Connect . The new connection will be displayed and set as active in Object Explorer .
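The dialog fields above correspond directly to the keywords of an ODBC connection string. As a rough illustration only (the function name and driver version are my assumptions, not part of the original article), here is how the two authentication modes differ when such a string is built in Python:

```python
def build_connection_string(server, instance=None, database="master",
                            user=None, password=None):
    """Build an ODBC connection string for SQL Server.

    With no user/password, Trusted_Connection=yes requests Windows
    Authentication; otherwise UID/PWD select SQL Server Authentication.
    The server\\instance form matches the dialog's Server name field.
    """
    host = f"{server}\\{instance}" if instance else server
    parts = [
        "Driver={ODBC Driver 17 for SQL Server}",
        f"Server={host}",
        f"Database={database}",
    ]
    if user is None:
        parts.append("Trusted_Connection=yes")            # Windows Authentication
    else:
        parts.extend([f"UID={user}", f"PWD={password}"])  # SQL Server Authentication
    return ";".join(parts)

# Windows Authentication against a named instance
print(build_connection_string("MYSERVER", "SQLEXPRESS"))
# SQL Server Authentication against the default instance
print(build_connection_string("MYSERVER", user="sa", password="secret"))
```

Such a string could then be passed to an ODBC client library; the sketch only shows how the dialog choices map to connection parameters.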
After the connection is created, let’s do the following:
- View a list of databases available on the server
- View a list of tables available in the selected database
- Retrieve data from the table

View a list of databases available on the server In SSMS, you can easily view a list of databases located on the server in Object Explorer . Select the server and expand the Databases node. View a list of tables available in the selected database To see the tables of a specific database, in Object Explorer , expand the selected database and then expand the Tables node. Retrieve data from the table To move on, let’s run a SQL query to check that everything works properly. For example, we want to retrieve only the ‘Quality Assurance’ department from the HumanResources.Department and HumanResources.EmployeeDepartmentHistory tables. To do so, on the toolbar, click New Query . In the SQL document that opens, enter the following SQL statement and then click Execute .

SELECT
	*
FROM HumanResources.Department d
JOIN HumanResources.EmployeeDepartmentHistory edh
	ON d.DepartmentID = edh.DepartmentID
WHERE d.GroupName = 'Quality Assurance';

The output is as follows: Connecting to SQL Server using the sqlcmd utility The sqlcmd utility allows you to execute SQL queries, T-SQL statements, system procedures, and script files from the command line. The utility uses the ODBC driver to connect to the server. In this section, we’ll describe how to connect to SQL Server using Windows Authentication and SQL Server Authentication, and how to execute queries. Note: Prior to working with the sqlcmd utility, make sure that it has been [downloaded](https://docs.microsoft.com/en-us/sql/tools/sqlcmd-utility) and installed on your Windows machine. The default location of the sqlcmd utility is ‘C:\\Program Files\\Microsoft SQL Server\\150\\Tools\\Binn’. Connect to a SQL Server instance using Windows Authentication Open the Command Prompt and switch to the location of the sqlcmd utility.
Then, execute the following command, replacing the connection parameters with the server ( server_name ) and instance ( instance_name ) names to which you want to connect. sqlcmd -S server_name\\instance_name -E -E indicates a trusted connection. If you have connected successfully, you will see 1> . This confirms that you are connected to the SQL Server and can execute SQL statements. To close the current sqlcmd session, type exit and press Enter . Connect to a SQL Server instance using SQL Server Authentication Now, let’s see how to connect to SQL Server using SQL Server Authentication. In addition to the server and instance names to which you want to connect, you also need to specify a user name: sqlcmd -S server_name\\instance_name -U user_name After that, you will be prompted to enter the password. If you have connected to the server, a new line starts with 1> . This means that the connection has been successfully created and you can start working with queries. After we have connected to the SQL Server, we are going to verify that it works properly by executing SQL queries that allow us to: View a list of databases available on the server View a list of tables available in the selected database Retrieve data from the table View a list of databases available on the server using the sqlcmd utility To get the list of databases located on the server, execute the following query and press Enter : SELECT name, database_id, create_date FROM sys.databases; Then, type GO and press Enter . In the output, you will see all databases on the SQL Server instance, including each database name, its ID, and its creation date, as well as the total number of databases.
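The two sqlcmd invocations above differ only in their authentication switch. As a small, purely hypothetical helper (the function name is my own, not part of the article or of sqlcmd), this Python sketch assembles the argument list you would hand to the command line:

```python
def sqlcmd_args(server, instance=None, user=None):
    """Assemble the argument list for a sqlcmd invocation.

    -E requests a trusted (Windows Authentication) connection;
    -U switches to SQL Server Authentication, in which case sqlcmd
    prompts for the password interactively.
    """
    target = f"{server}\\{instance}" if instance else server
    args = ["sqlcmd", "-S", target]
    if user is None:
        args.append("-E")            # Windows Authentication
    else:
        args.extend(["-U", user])    # SQL Server Authentication
    return args

print(" ".join(sqlcmd_args("MYSERVER", "SQLEXPRESS")))
```

On a machine where sqlcmd is installed, such a list could be passed to `subprocess.run`; here it only illustrates how the switches combine.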
View a list of tables available in the selected database using the sqlcmd utility To view a list of tables located in the database, execute the following query and press Enter : SELECT name, crdate FROM sysobjects WHERE xtype = 'U'; where name is the name of the table, crdate is its creation date, xtype is the object type of the row, and U stands for user table. Then, type GO and press Enter . As you can see, the query returns the tables, their creation dates, and the total number of tables available in the BicycleStoreDev database. Retrieve data from the table using the sqlcmd utility As mentioned, we will retrieve data from the table. For this, we are going to use the AdventureWorks2019 database and execute a SELECT statement to get data from the Person.Person and HumanResources.Employee tables with the condition that BusinessEntityID is higher than 200. To keep the list short for demo purposes, we output only the first 20 rows of the result.

SELECT TOP 20
  p.BusinessEntityID
  ,p.FirstName
  ,p.LastName
  ,p.EmailPromotion
FROM Person.Person p
INNER JOIN HumanResources.Employee e
  ON p.BusinessEntityID = e.BusinessEntityID
WHERE e.BusinessEntityID > 200;

In the Command Prompt, switch to the AdventureWorks2019 database to use it, type the query, and press Enter . Then, type GO and press Enter again. The output will display the required data. Now, you can close the utility by typing exit and pressing Enter . Connecting to SQL Server using dbForge Studio for SQL Server [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/) is a cutting-edge IDE that comes with a rich set of built-in tools, features, and capabilities that let you enjoy the process of SQL Server database development, management, administration, and deployment while boosting your productivity.
To get started, [download](https://www.devart.com/dbforge/sql/studio/download.html) and [install](https://docs.devart.com/studio-for-sql-server/getting-started/installing.html) dbForge Studio for SQL Server. Keep in mind that you can use a free fully functional 30-day trial version of the tool to evaluate all its remarkable features and functionalities. After the trial expires, you will be prompted to [purchase](https://www.devart.com/dbforge/sql/studio/ordering.html) the tool. To connect to SQL Server, launch the Studio and open the Connection Manager in one of the following ways: On the Database Explorer toolbar, click New Connection . On the Database main menu, select New Connection . In the Database Connection Properties dialog that opens, do the following: Select the server to which you want to connect. Choose the authentication mode from the drop-down list. Depending on the authentication mode you chose, enter the login and password . Select the database you want to connect to. Optional step: Assign the environment category (development, production, sandbox, or test). If you want to configure additional connection details such as connection and execution timeouts or connection encryption, switch to the Advanced tab. After the connection details are added, click Connect . The Database Explorer will display the new connection with a green connection icon that indicates an active connection. For more information about how to connect to SQL Server with dbForge Studio for SQL Server, see the product documentation. You can also watch this [step-by-step tutorial](https://youtu.be/VeihmrFe_dQ) to see how to connect to a SQL Server instance in dbForge Studio for SQL Server.
After the connection is created, let’s do the following: View a list of databases available on the server View a list of tables available in the selected database Retrieve data from the table View a list of databases available on the server using dbForge Studio for SQL Server In dbForge Studio for SQL Server, you can easily view a list of databases located on the server without executing a query. To do so, go to the Database Explorer and expand the server connection. View a list of tables available in the selected database using dbForge Studio for SQL Server If you want to see the tables located in a specific database, simply navigate to the Database Explorer and expand the selected database and then the Tables node. The list of tables available in the database will be displayed. The Tables folder will also show the total number of tables in brackets. Retrieve data from the table using dbForge Studio for SQL Server Now, let’s execute a SQL statement to retrieve data from the Production.Product table based on the condition that the ProductID value is higher than 300. To do so, click New SQL on the toolbar. In the SQL document that opens, type the query and click Execute on the toolbar. As you can see, dbForge Studio for SQL Server offers an easy, visual way to connect to SQL Server and quickly tailor the connection configuration to your needs. Conclusion In this article, we described several ways to connect to SQL Server using the sqlcmd utility, SQL Server Management Studio, and dbForge Studio for SQL Server. With SSMS and dbForge Studio for SQL Server, you can easily set up connection details visually in the wizards. However, dbForge Studio for SQL Server stands out as a strong alternative to SSMS thanks to its rich set of features and capabilities delivered in an intuitive, user-friendly GUI.
Tags [dbForge Studio for SQL Server](https://blog.devart.com/tag/dbforge-studio-for-sql-server) [how to connect to sql server](https://blog.devart.com/tag/how-to-connect-to-sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [sqlcmd](https://blog.devart.com/tag/sqlcmd) [ssms](https://blog.devart.com/tag/ssms) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-consolidate-customer-data-into-excel-using-powerful-add-ins.html", "product_name": "Unknown",
"content_type": "Blog", "content": "[Excel Add-ins](https://blog.devart.com/category/products/excel-addins) How to Consolidate Customer Data Into Excel Using Powerful Add-ins By [Victoria Shyrokova](https://blog.devart.com/author/victorias) December 27, 2024 [0](https://blog.devart.com/how-to-consolidate-customer-data-into-excel-using-powerful-add-ins.html#respond) 566 Data consolidation in Excel is what tames the wild beast of disorganized customer data. Businesses generate a lot of information on many different platforms. Think of customer data in CRMs, sales figures across ERPs, inventory sitting in cloud databases, or social media interactions tracked by various analytics tools. Without consolidation, there is just no easy way to turn this information into strategic assets. Excel makes the process much easier, allowing you to unify customer data across multiple sources. Equipped with the right add-ins, you can easily integrate, manage, and update customer data in real time. This lets you study customer behavior, find high-value segments, create targeted marketing campaigns, automate reporting, and more.

Table of contents
- The practical approach to customer data integration in Excel
- Types of data integration in Excel
- Preparing your data for consolidation
- Using Excel Add-ins for data consolidation
- Step-by-step guide to data consolidation in Excel
- Conclusion

The practical approach to customer data integration in Excel Excel is much simpler and more affordable than full-blown ETL software solutions, especially for smaller organizations that don’t need all that heavy functionality. Let’s say you run an e-commerce business selling through Shopify and tracking customer interactions in PostgreSQL. Using tools like [Devart Excel Add-ins](https://www.devart.com/excel-addins/) , you can connect directly to both and import all relevant data into a single spreadsheet by following a simple wizard.
Once inside Excel, you can find current purchase patterns and inventory turnover rates in real time. Similarly, if you are an inventory manager for a retail chain using Magento and SQL Server, you can use the same add-in to pull your point-of-sale and inventory information into one unified Excel dashboard. After making changes, you can push the updated data back to Magento and SQL Server without complicated interfaces or long setup times.

Hands-on benefits of customer data integration in Excel
- Ease of use: Excel allows quick and straightforward data handling, ideal for users who prefer a manual approach.
- Flexibility: Integrate data from various sources, perform custom calculations, and then re-upload the updated data back to the original or new sources.
- Speed: Excel’s familiar environment speeds up the data integration process, reducing the time needed to combine and analyze customer information.
- Data accuracy: You can easily clean, validate, and standardize various formats using straightforward functions and formulas right in Excel.
- Customer behavior analysis: Excel makes analyzing customer data and patterns easy in one clear view for faster reporting.
- Improved decision-making: Having all your data in one Excel sheet allows you to discover new opportunities, predict future sales trends, and stay at the forefront of industry developments.
- Targeted outreach: Excel lets you segment your audience based on their preferences so you can create personalized marketing campaigns.

Types of data integration in Excel There are three types of data integration available in Excel: consolidation, propagation, and federation. Consolidation: Bringing data together from different sources into one place, like a single Excel sheet or workbook. This is the most common type and the one we will be covering. It’s ideal for streamlining your data storage, so you don’t have to dig through multiple sources. Propagation: Automatically copying data from one place to another.
For example, between different Excel sheets or workbooks. If you refresh the source data, the linked cells will update automatically. It’s great if you have a few key data sources — you can update quickly without having to merge everything. Federation: Accessing and querying data from multiple sources without physically combining them. Excel does not natively support this, but it’s possible with the help of an appropriate add-in. It works well for companies managing vast datasets that need real-time insights. Whichever you go with, Excel lets you integrate various types of customer data, including names, contact details, demographics, purchase history, order timestamps, and email or website interactions. These can be in text, number, date, JSON, or XML format. Manual data consolidation in Excel Instead of relying on cumbersome ETL processes and coding, you can achieve data consolidation in Excel with just a few clicks. Simply follow these steps:
1. Open a new Excel workbook and import your data from CSV or Excel files.
2. Create a new sheet for consolidation, such as Consolidated Data, and add headers for the combined data.
3. Copy the relevant data from each source sheet into the consolidated sheet and remove duplicates.
4. Use the VLOOKUP formula to retrieve and calculate the values you need from the original datasets.
The problem with the manual method is that any changes in the original data source will not be reflected in Excel. If your underlying data refreshes, your consolidated sheet won’t update automatically. That’s why it’s better to use Excel Add-ins. They let you pull data into Excel in real time, so you don’t have to worry about it being outdated by the time you upload it. Fetching and uploading data between sources There are other ways you can use Excel to fetch, alter, and upload data. Excel allows you to import data directly from various databases such as SQL Server and Microsoft Access.
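The copy-and-deduplicate part of the manual steps above can be sketched in code. This is a hedged illustration only: the sample data and the function name are invented, and real source sheets would come from CSV or Excel exports.

```python
# Two source "sheets" as lists of rows (dicts), e.g. loaded from CSV exports.
sheet1 = [
    {"CustomerID": 1, "Name": "Alice"},
    {"CustomerID": 2, "Name": "Bob"},
]
sheet2 = [
    {"CustomerID": 2, "Name": "Bob"},      # duplicate of a sheet1 row
    {"CustomerID": 3, "Name": "Carol"},
]

def consolidate(*sheets):
    """Copy rows from every source sheet into one list, dropping exact duplicates."""
    seen, combined = set(), []
    for sheet in sheets:
        for row in sheet:
            key = tuple(sorted(row.items()))  # hashable fingerprint of the row
            if key not in seen:
                seen.add(key)
                combined.append(row)
    return combined

merged = consolidate(sheet1, sheet2)
print(len(merged))  # 3 unique customers
```

The same idea is what "copy relevant data into the consolidated sheet and remove duplicates" amounts to, just performed by hand in Excel.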
You can pull in data from other sources like CRM systems and e-commerce platforms too. However, you’ll have to export your data as a CSV or Excel file first, unlike when using add-ins. Excel also contains plenty of built-in tools and formulas for cleaning and standardizing your data. What’s more, turning your data into meaningful insights is pretty easy using features such as PivotTables. When you’re finished, you can combine datasets based on common fields and refresh the updated data to its source in just a few clicks. Preparing your data for consolidation Before consolidating, you first have to identify, gather, and clean all the data you need from the relevant sources. Identifying data sources Determining what type of data sources you have to work with is essential to achieving your goals. For example, you may need customer data from a CRM (like Salesforce) and transaction records from a database (like SQL Server) to get useful insights about customer behavior and conversion rates. Other common data sources include:
- CRM systems: HubSpot, Zoho, Microsoft Dynamics.
- E-commerce platforms: Shopify, Magento, BigCommerce.
- Accounting software: QuickBooks, NetSuite, FreshBooks.
- Marketing platforms: Mailchimp, Marketo, Salesforce Marketing Cloud.
- Cloud databases: DB2, PostgreSQL, MySQL, Oracle Cloud.
- Analytics tools: Zendesk, Freshdesk.
Cleaning and standardizing data Data from different sources often comes in inconsistent formats, leading to errors. To make sure your data is accurate, here are some best practices to follow:
- Check for and remove any duplicate entries in your dataset.
- Make sure that data formats are consistent, such as dates being in the same format, like MM/DD/YYYY.
- Handle missing values by filling in gaps with default values, interpolating, or removing incomplete records.
- Apply Excel functions such as TRIM to remove extra spaces, CLEAN to remove non-printable characters, and UPPER/LOWER to standardize text case.
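The TRIM/CLEAN/UPPER advice above can be approximated outside Excel as well. The sketch below is a rough Python analog (the function names are mine, and the behavior is approximate, not an exact reimplementation of Excel's functions):

```python
import re

def trim(text):
    """Rough analog of Excel's TRIM: strip ends and collapse inner runs of spaces."""
    return re.sub(r" +", " ", text.strip())

def clean(text):
    """Rough analog of Excel's CLEAN: drop non-printable control characters."""
    return "".join(ch for ch in text if ch.isprintable())

def standardize(name):
    """Clean, trim, then uppercase a customer name, like CLEAN + TRIM + UPPER."""
    return trim(clean(name)).upper()

print(standardize("  alice\x07  smith "))  # "ALICE SMITH"
```

Chaining the three steps mirrors how you would nest the formulas in a worksheet, e.g. `=UPPER(TRIM(CLEAN(A2)))`.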
Use conditional formatting to highlight any anomalies or outliers. Using Excel Add-ins for data consolidation Excel Add-ins are handy tools that extend what you can do in Excel, letting you tackle tasks that the standard features can’t handle. Overview of Excel Add-ins Excel Add-ins can make data consolidation and other processes easier, connect with more external data systems, and improve how you import, analyze, and visualize data. They also help with complicated calculations and data transformations, simplifying data management and reporting. Popular Excel Add-ins for data integration There are plenty of add-ins for Excel data consolidation out there. The right one depends on your specific needs and the complexity of your data. Take a look at the most popular options below.

| Add-in | Description |
| --- | --- |
| Devart Excel Add-ins | Connects Excel to popular cloud platforms like QuickBooks, HubSpot, and Mailchimp, as well as major databases, allowing for instant data integration, management, and analysis. |
| XLTools | Offers features for merging, cleaning, and analyzing data directly within Excel. |
| Ablebits Data Tools | Provides a suite of tools for data cleaning, merging duplicates, and advanced filtering. |
| Fuzzy Lookup Add-in | Helps match and merge records from different datasets based on similar text values. |

Step-by-step guide to data consolidation in Excel Now, suppose you want to consolidate sales data from SQL Server with customer interaction data from [HubSpot](https://blog.devart.com/how-the-integration-of-hubspot-and-excel-powers-business-growth.html) to create targeted email campaigns. We have already covered how to do it manually, but let’s find out how to make the process more efficient. Here’s [how to add Devart Excel Add-ins](https://www.devart.com/excel-addins/how-to-add-excel-add-ins.html) to consolidate customer data: Note: The steps below can be applied to work with various cloud platforms, databases, or any other data sources you use that the Devart Add-ins support.
Connecting to data sources Install the Devart Excel Add-ins Download the [Devart Excel Add-ins Universal Package](https://www.devart.com/excel-addins/universal-pack/) . Then, run the installer and follow the instructions in the Setup Wizard. Open Excel After installation, launch Excel and navigate to the “Devart” tab. Connect to your database For this example, select “ Get Data. ” This will open the Import Data Wizard. In the Data Source dropdown, select SQL Server Database . Enter your server name and database credentials, then click “ Next .” Follow the instructions to import the data. Connect to the cloud platform Click on “Get Data” again, but this time select HubSpot as your data source. Enter your HubSpot API key, and follow the instructions to import your data. Merging data sets With both datasets in different worksheets, follow the steps to manually consolidate data in Excel, as mentioned above. That is, create a new sheet, copy and paste the relevant columns from both sheets, and then clean up the data by removing duplicates and inconsistencies. Then, you can merge them using either Excel’s VLOOKUP function or the Consolidate tool. Using Excel’s Consolidate Suppose you want to consolidate multiple rows in a spreadsheet , such as customer data with repeated IDs but different purchase histories. Here’s how to combine them to get the total purchases per customer: Head over to the Data tab and choose Consolidate. In the dialog box, pick the SUM function and select your data ranges. Hit the Add button and check the Top row or Left column box depending on where your labels are located. In this case, select Left column to sum the purchase amounts that share the same customer ID, and click OK . If your data spans multiple ranges (e.g., Q1 sales data in one range and Q2 sales data in another), the consolidation process is similar. However, make sure to select both Top row and Left column to ensure accurate aggregation.
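The Consolidate-with-SUM behavior described above amounts to grouping values by their row label and totaling them. A minimal sketch of that idea (the sample data and function name are invented for illustration):

```python
# Rows with repeated customer IDs, as in the Consolidate example above.
rows = [
    ("C001", 120.0),
    ("C002", 75.5),
    ("C001", 30.0),
]

def consolidate_sum(rows):
    """Total the values per label, like Consolidate with SUM and 'Left column'."""
    totals = {}
    for label, value in rows:
        totals[label] = totals.get(label, 0.0) + value
    return totals

print(consolidate_sum(rows))  # {'C001': 150.0, 'C002': 75.5}
```

Checking Top row instead (or as well) would correspond to grouping by column header rather than, or in addition to, the row label.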
When your sales data is in different spreadsheets , just switch to each sheet, highlight the ranges you want to include, and click Add in the Consolidate dialog box. Want to consolidate data from multiple workbooks instead? It’s pretty much the same process. Just open both workbooks (for example, Q1 Sales.xlsx and Q2 Sales.xlsx ). Then, follow the steps outlined earlier, but switch between the two workbooks to highlight the ranges you want to consolidate. Using the VLOOKUP function Excel’s VLOOKUP formula is a better fit for those cases when you need to get specific information from different tables, especially when dealing with large datasets. It’s important to note, though, that it can only return values from one column at a time, so it won’t work for multiple-row and multiple-range consolidation. Now, let’s say you want to consolidate sales data from Sheet1 and customer interaction data from Sheet2 . You can use the following formula: =VLOOKUP(A2, Sheet2!$A:$D, 2, FALSE) In this formula: A2 is the value you want to search for, such as a customer ID. Sheet2!$A:$D is the range you want to search in. 2 tells Excel that you want to return the second column of that range. FALSE indicates an exact match. In this example, if you want to retrieve only the “Converted” status, you’d have to modify the formula to: =IF(VLOOKUP(A2, Sheet2!$A:$D, 3, FALSE)="Converted", "Converted", "") In the next empty column in your consolidated sheet, use: =IFERROR(SUMIF(Sheet2!$B:$B, A2, Sheet2!$E:$E), 0) Learn more methods to set up an [Excel Oracle database connection](https://blog.devart.com/connect-oracle-database-to-excel-import-your-data-in-minutes.html) in this article. Conclusion Data consolidation in Excel is ideal for anyone managing diverse datasets, especially when it comes to customer information across multiple platforms. With the right add-in, you can simplify your data handling and make your workflow much smoother.
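For readers who think in code: VLOOKUP with FALSE is simply an exact-match scan down the first column of a range. A hypothetical Python analog (the function name and sample data are mine, not Excel's or the article's):

```python
def vlookup(value, table, col_index):
    """Mimic VLOOKUP(value, table, col_index, FALSE): scan the first column
    for an exact match and return the cell in the requested 1-based column."""
    for row in table:
        if row[0] == value:
            return row[col_index - 1]
    return None  # Excel would show #N/A instead

# A stand-in for Sheet2: customer ID, status, purchase total.
interactions = [
    ("CUST-1", "Converted", 250.0),
    ("CUST-2", "Lead", 0.0),
]
print(vlookup("CUST-1", interactions, 2))  # "Converted"
```

This also makes the one-column-at-a-time limitation obvious: each call returns a single cell, so consolidating several columns means one lookup per column.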
[Try Devart Excel Add-ins](https://www.devart.com/excel-addins/universal-pack/) to easily connect to various databases and cloud services, streamline your data import processes, and refresh your data in real time with just a click. Tags [data consolidation](https://blog.devart.com/tag/data-consolidation) [excel add-ins](https://blog.devart.com/tag/excel-add-ins) [Victoria Shyrokova](https://blog.devart.com/author/victorias) I'm a content manager with a huge passion for SQL coding, database development, connectivity, and making complex stuff simpler. Check out my articles for hands-on tips, real-world use cases, and ideas to boost your workflow. "} {"url": "https://blog.devart.com/how-to-convert-a-database-from-microsoft-access-to-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Easily Convert Your MS Access Data to MySQL By [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) March 1, 2024 [0](https://blog.devart.com/how-to-convert-a-database-from-microsoft-access-to-mysql.html#respond) 16360 Microsoft Access is a relational database management system used to create small-scale databases for a single user or small teams. MySQL is a robust open-source relational database management system for larger data volumes and web applications. With the help of dbForge Studio for MySQL, you can easily [migrate data](https://www.devart.com/dbforge/mysql/studio/migrate-database.html) from Microsoft Access to MySQL while preserving data and functional integrity. This allows you to use the more scalable and powerful MySQL infrastructure for convenient management of your database contents. Contents Why migrate from Microsoft Access to a MySQL database? Prerequisites How to import data from Microsoft Access into a MySQL database Import Microsoft Access data Configure constraints Create or edit a foreign key Create or edit a primary key Conclusion Why migrate from Microsoft Access to a MySQL database? Migrating from Microsoft Access to MySQL can be useful for several reasons. Let’s review each of them. Scalability: MySQL provides better scalability for big data and more complicated projects in comparison with Microsoft Access.
Performance : MySQL can deliver greater speed and productivity for database operations, especially under conditions involving a large volume of concurrent queries. Reliability and security: MySQL has advanced functionality to guarantee data protection and consistency, including backup and replication, to ensure uninterrupted operation. Support for web applications : MySQL is more commonly used in web development, facilitating integration with web servers and the development of web-oriented applications. Cross-platform compatibility : MySQL is available across multiple platforms (Windows, Linux, macOS), offering more flexibility in deployment options compared to the Windows-centric Microsoft Access. Prerequisites Before proceeding with the steps outlined in this article, make sure you have the following tools and components ready: Exported data from Microsoft Access [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) [Microsoft Access Database Engine](https://www.microsoft.com/en-us/download/details.aspx?id=13255) . It installs components that facilitate data transfer between Microsoft Access files and non-Microsoft Office applications. How to import data from Microsoft Access into a MySQL database Importing a database from Microsoft Access into MySQL involves a series of steps to transfer data seamlessly. This process is essential for migrating from Access’s limited capacity to MySQL’s robust database management system. This guide provides a comprehensive walkthrough detailing the necessary steps and considerations to successfully execute the migration and harness the advanced functionality offered by MySQL. Import Microsoft Access data We’re going to show you how to import the film_actor table exported from Microsoft Access to the sakila database in dbForge Studio for MySQL with the help of the Data Import Wizard. 1. In Database Explorer , right-click the required database and navigate to Tasks > Import Data . 2.
Select MS Access and click the ellipsis button to specify the path to the file you want to import. Note that you can save the import settings as a template for future use. To do this, click Save Template on any Wizard page. 3. Select the file and click Next. If the file is protected with a password, the Open MS Access Database dialog appears, where you should enter the password. 4. If you want data to be imported to a new table, select New table and provide a name for it. To [import data](https://www.devart.com/dbforge/mysql/studio/data-export-import.html) to an existing table, select Existing table and choose the desired one. After that, click Next. 5. Configure data formats for the source data and click Next. 6. Map the Source columns to the Target ones. If you are importing the data into a new table, dbForge Studio will automatically create and map all the columns. If you are importing into an existing table, only columns with the same names will be mapped; the rest should be mapped manually. (If no columns with the same name are found, they are mapped in succession: the 1st column in Source with the 1st column in Target, etc.) See the Target columns at the top and the Source columns at the bottom of the wizard page. Click the Source column fields and select the required columns from the drop-down list. To cancel the mapping of all the columns, click Clear Mappings on the toolbar. To restore it, click Fill Mapping. If you are importing to a new table, you can edit the Target column properties by double-clicking them in the top grid. Select the Key check box for a column with a primary key and click Next. You should select at least one column with a primary key; otherwise, some of the import modes on the Modes wizard page will be disabled. 7. Select an import mode to define how dbForge Studio should import the data. Click Next. 8.
Select an output option and click Next: Open the data import script in the internal editor: The script will open in dbForge Studio for MySQL after the import process. Save the data import script to a file: The script will be saved to the specified file. Import data directly to the database: Data will be added to the selected database. 9. Select how dbForge Studio should handle errors during import and whether you want to get a log file with details about the import session. 10. Click Import. 11. After the import process is completed, click Finish. The imported table will be visible in the database. Configure constraints After importing all the necessary tables, you can set up or correct relations between the converted tables by creating or editing foreign keys if required. Create or edit a foreign key 1. Right-click the necessary table and select Open Editor. 2. Click Constraints. 3. To create a new foreign key, right-click and select Add Foreign Key. Learn [how to add, show, and drop foreign keys in MySQL](https://blog.devart.com/mysql-foreign-key.html) to level up your workflows. Type a name for the key and choose the desired constraint column. Choose the required referenced table and its column, and then click Apply Changes. To modify a foreign key, click it, make the required changes, and click Apply Changes. Create or edit a primary key To create a new primary key, on the Constraints tab, right-click and select Add Primary Key. Choose the required column and click Apply Changes. To modify the primary key, just click it, add or delete a column, and click Apply Changes. Learn [how to create and alter table statements with MySQL primary keys](https://blog.devart.com/mysql-primary-key-create-table-and-alter-table-statements.html) to improve database design and ensure the quality of data. Conclusion Overall, dbForge Studio for MySQL stands as a comprehensive and reliable solution for migrating data from Microsoft Access to MySQL.
The tool ensures data integrity during migration, minimizing the risk of data loss or corruption throughout the import process. Also, dbForge Studio efficiently handles large volumes of data, enabling the seamless transfer of substantial information from Access to MySQL databases. [Download dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/download.html) and unlock the power of efficient database management and seamless migration! Additionally, you may be interested in other related topics: [How to Migrate Data from Oracle to MySQL: Step-by-Step Guide](https://blog.devart.com/migrate-from-oracle-to-mysql.html) [How to Convert MySQL Data to PostgreSQL](https://blog.devart.com/convert-from-mysql-to-postgresql.html) [Data migration from MySQL to Oracle server](https://blog.devart.com/data-migration-from-mysql-to-other-dbms.html) Tags [data export](https://blog.devart.com/tag/data-export) [data import](https://blog.devart.com/tag/data-import) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) When writing articles, Hanna Khyzhnia follows two main rules: any technical information should be presented in a way that even a child could understand it, and the language of the text must be as simple and accessible as possible for users. She aims to help readers dive into details, make decisions, and find answers without unnecessary confusion. 
"} {"url": "https://blog.devart.com/how-to-create-a-database-diagram-using-a-sketch-image.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How To: Create a MySQL Database Diagram Using a Sketch Image By [dbForge Team](https://blog.devart.com/author/dbforge) October 5, 2010 [2](https://blog.devart.com/how-to-create-a-database-diagram-using-a-sketch-image.html#comments) 6253 Often we make some sketches of the database we plan to
create on a blackboard or a sheet of paper before we actually design its structure on a computer. After that, we discuss the entities we've got, normalize them, and repeat these actions several times. As a result, we get a completely approved database structure in the form of an image file in the project documentation. Question: how do we create a database diagram based on the available image? Let's try to do this using the Database Designer of dbForge Studio for MySQL. Let us suppose that you have a sketch of the future database: Database Structure To place this picture onto an empty diagram, create an empty document, for example, Diagram1.dbd, by pressing New Database Diagram on the Standard toolbar. After that, press the New Image button on the Database Diagram toolbar. The mouse pointer will change to an icon with a picture. Click on any part of the diagram. In the Open dialog window that appears, select the image with the database structure sketch. Database Designer: Open New Image Now that you can see the database sketch, you can recreate the database from it. Let's create the necessary tables with primary keys and indexes one by one. For example, to create the Sessions table, press the New Table button on the Database Diagram toolbar. The mouse pointer should change to an icon with a table. Click on any part of the diagram. A window for editing the Table1 table should appear. Database Designer: Create New Table In the table editor window, do the following: On the General tab, edit the table name; add a key column (edit its name and datatype, and set the Primary option); add all other columns (uncheck the additional Allow nulls(*) option). On the Indexes tab, create indexes for all key columns and uncheck the Unique option. As a result, we've got a new entity on the diagram: the Sessions table. Database Designer: Design New Table Move the table on the diagram close to its position on the sketch.
Then create the next table, for example, Hits, in the same way and move it close to its position on the sketch. Now we can add a relation between the Hits and Sessions tables. To do this: press the New Relation button on the Database Diagram toolbar. The mouse pointer should change to an icon with an arrow. Then click the Hits table and, without releasing the mouse button, drag the cursor to any part of the Sessions table and release the mouse button (**). In the Foreign Key Properties window that appears, select the SessionID column from the "Table Columns" list and press the [→] button. The SessionID column is moved to the "Constraints Columns" list. Save these changes by pressing OK. Database Designer: Create New Relation As a result, we've bound the two tables, "Hits" and "Sessions", with the foreign key "hits_FK". Database Designer: Display Relation Now we repeat the same operations for the remaining tables: creating and designing tables, and creating indexes and relations between them. An important part of the database design process is the logical division of database objects into groups. [Database Designer](https://www.devart.com/dbforge/mysql/studio/database-designer.html) available in dbForge Studio for MySQL has a special Container component for this purpose. To create a new container and move the necessary objects into it: Press the New Container button on the Database Diagram toolbar. The mouse pointer should change to an icon with three squares. Click on an empty place on the diagram. A container with the name Group1 appears; change the container name. Select the tables you want to move to the container. For example, let's select the Users, Registrars, Products, and OrderLinks tables. Move the selected tables onto the container. Database Designer: New Container And the final step in the process of database creation using a sketch is optimizing the location of database objects on the diagram.
The Layout Diagram algorithm redraws the relations between tables so that they do not intersect each other. This saves space on the diagram and keeps it readable. Database Designer: Layout Diagram As a result of the actions described above, we've created a database from a sketch without switching over to other applications to view the diagram image via Alt+Tab, and without printing the sketch, owing to the functionality of [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/). (*) On the diagram, columns with the Not Null property enabled are displayed in bold (for example, the HitDate column of the SpiderHits table), unlike other columns (for example, the HitUrl column of the SpiderHits table). (**) To create a foreign key between tables, both tables must be created with Engine=InnoDB. You can [download](https://www.devart.com/dbforge/mysql/studio/download.html) a free 30-day evaluation of dbForge Studio for MySQL.
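For reference, the Hits/Sessions relation built above corresponds to MySQL DDL along these lines. This is only a sketch: the article does not list the tables' full columns, so everything except SessionID and the hits_FK key name is a hypothetical illustration; note the ENGINE=InnoDB requirement from footnote (**).

```sql
-- Hypothetical sketch of the diagram's DDL; only SessionID and hits_FK come from the article.
CREATE TABLE Sessions (
  SessionID    INT NOT NULL AUTO_INCREMENT,
  SessionStart DATETIME NOT NULL,          -- illustrative column
  PRIMARY KEY (SessionID)
) ENGINE=InnoDB;

CREATE TABLE Hits (
  HitID     INT NOT NULL AUTO_INCREMENT,   -- illustrative column
  SessionID INT NOT NULL,
  HitTime   DATETIME,                      -- illustrative column
  PRIMARY KEY (HitID),
  CONSTRAINT hits_FK FOREIGN KEY (SessionID) REFERENCES Sessions (SessionID)
) ENGINE=InnoDB;
```

Both tables use the InnoDB storage engine, since (per footnote (**)) foreign keys cannot be created between tables stored with other engines.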
Tags [database diagram](https://blog.devart.com/tag/database-diagram) [MySQL](https://blog.devart.com/tag/mysql) [studio for mysql](https://blog.devart.com/tag/studio-for-mysql) [dbForge Team](https://blog.devart.com/author/dbforge) 2 COMMENTS Justin Swanhart October 6, 2010 at 7:21 am I don't understand why you wouldn't just make the model in dbForge (or MySQL Workbench)? You really build your schema in an image application before you build it in a modeling application?
.jp October 6, 2010 at 10:35 am Thank you for the good question. Really, if one developer works on the database structure and its further implementation, then it is not necessary for him to create a database sketch with the help of another tool and to repeat the same actions in a database designer. It's easier for him to start creating the database with the tool right away. But in most cases, a sketch of the future database is drawn on a blackboard or paper before database implementation. After that, it is discussed, corrected, and approved. And if the project is a distributed one, and analysts, managers, and developers work in different offices, tasks from a database designer come to a developer as a set of documents with requirements, among which can be this database sketch as a set of entities drawn on paper (or prepared as a digital document, for example, in MS Visio). And it is more convenient for a developer to have such a sketch in front of him directly in the tool, as described in our article."} {"url": "https://blog.devart.com/how-to-create-a-new-user-and-grant-privileges.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Create a New User Account in MySQL and Grant Permissions on a Database By [dbForge Team](https://blog.devart.com/author/dbforge) May 19, 2021 [0](https://blog.devart.com/how-to-create-a-new-user-and-grant-privileges.html#respond) 19260 This article provides a complete overview of how to create a new user account in MySQL and grant different types of privileges on a MySQL database. Learn the basics of user account management and find useful hints. Introduction First, let's figure out why we need users and privileges.
When you install MySQL Server on your system and [create a MySQL database](https://blog.devart.com/creating-a-new-database-in-mysql-tutorial-with-examples.html) , you run the required commands in the MySQL shell as root or administrative user, meaning that you have the full power to control every aspect of your database. However, once you decide to let more people access and modify your MySQL database, you need to create users and grant them privileges. By doing so, you can give permissions or introduce restrictions within the databases or certain database objects. To put it simply, you will decide on who can do what and with what object types. This article provides insight into how to create a new user and grant appropriate privileges on a MySQL database. But not only that. You will also find out more about the creation of custom privileges and learn what types of privileges exist in MySQL. Moreover, you will have extra practice on how to revoke privileges and remove users from MySQL. Contents On top of that, you will have an opportunity to view examples within MySQL shell (MySQL command-line client) as well as within dbForge Studio for MySQL, a [GUI tool for MySQL and MariaDB](https://www.devart.com/dbforge/mysql/studio/) database development, management, and administration. 1. Create a new MySQL user account via MySQL Shell 2. Grant privileges and add permissions to user 3. Show all MySQL user account privileges 4. Revoke all privileges from user 5. Remove user from MySQL 6. Manage users and privileges via dbForge Studio for MySQL 7. Create a user account via Security Manager 8. 
Grant and revoke privileges via Security Manager How to Create a New MySQL User Account via MySQL Shell To get started, you need to connect to your MySQL Server instance and log in as the root user via the MySQL command-line interface: mysql -u root -p When you do, you also need to type the password for the root account and press Enter: Enter password: ******** To create a new MySQL user account via the MySQL shell, you need to execute the CREATE USER statement. Let's have a look at its basic syntax: CREATE USER [IF NOT EXISTS] 'new_user_name'@'host_name'\nIDENTIFIED BY 'user_password' In the syntax above, replace new_user_name with the name of the new user and host_name with the name of the host from which the user connects to the MySQL Server. Optionally, set the host_name to 'localhost' if you want the user to be able to connect to MySQL Server only from the localhost, which means "this computer". If that's not the case, you can use the remote machine's IP address as the host name, for instance: CREATE USER 'new_user_name'@'10.8.0.5'\nIDENTIFIED BY 'user_password'; If you want the user to be able to connect from any host, use the '%' wildcard as host_name. Finally, set a password for the new user after the IDENTIFIED BY keywords. Note that the IF NOT EXISTS option allows you to avoid an error if a user with the same name has already been created. Once you are done with the new user creation, remember to grant privileges to the user to let them access the MySQL database. Otherwise, the user will not have any permissions to reach or manipulate the database in any way.
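Putting the pieces together, a minimal sketch (the user name and password here are placeholders; IF NOT EXISTS requires MySQL 5.7.8 or later):

```sql
-- Create a local-only account; IF NOT EXISTS avoids an error if it already exists
CREATE USER IF NOT EXISTS 'report_user'@'localhost'
  IDENTIFIED BY 'Str0ng_P@ssw0rd';

-- Or allow the same account name to connect from a specific remote machine
CREATE USER IF NOT EXISTS 'report_user'@'10.8.0.5'
  IDENTIFIED BY 'Str0ng_P@ssw0rd';
```

Remember that `'report_user'@'localhost'` and `'report_user'@'10.8.0.5'` are two distinct accounts in MySQL, each with its own password and privileges.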
How to Grant Privileges and Add Permissions to User To provide a user with access to the database and give permissions, you generally need to use the following GRANT statement: GRANT permission_type\nON privilege_level\nTO 'new_user_name'@'host_name'; Although the above-mentioned syntax is rather basic and doesn't show all the subtleties, it makes clear that there are multiple types of privileges (or permissions) that can be granted to a new user. Hence, let's illustrate the most common privileges that can be used with the GRANT and REVOKE statements: ALL PRIVILEGES – The user gains all privileges at the specified access level. CREATE – The user gains permission to create databases and tables. DROP – The user gains permission to drop databases and tables. DELETE – The user gains permission to delete rows from a specific table. INSERT – The user gains permission to insert rows into a specific table. SELECT – The user gains permission to read a database. UPDATE – The user gains permission to update table rows. Thus, we have clarified what types of permissions exist and defined what to put in the first part of the GRANT command. Now, let's talk about the second part that follows the ON keyword, namely, the privilege level. By means of the privilege level, you can determine which MySQL objects can be manipulated by the user account: all databases, a specified database, specified tables, certain columns, or certain stored routines within a database. In the following paragraph, we will talk about that in greater detail and provide examples.
Grant Privileges on Database to User To grant all privileges to a user account on all databases via the MySQL command prompt, you need to assign global privileges and use the *.* syntax after the ON keyword: GRANT ALL PRIVILEGES \nON *.* \nTO new_user_name@host_name; In this example, the new user is granted the maximum privilege level possible: they gain the permission to read, modify, execute commands, and perform any task across all databases and tables. Be careful, as this can compromise your database security and lead to negative consequences. Instead, you might want to grant limited permissions. For instance, you might want to allow your new user to access only a certain table within the database: GRANT ALL PRIVILEGES \nON database_name.table_name \nTO user_name@host_name; In this case, the user is granted table-level privileges, which apply to all columns within the table. Hence, they gain permission to read, edit, and modify the table as required. However, it may also be necessary to restrict such access and allow only certain operations within specified database objects. For example, below, you can see that the user is granted multiple permissions: they have permission to use the SELECT statement across two columns in the database, execute UPDATE on a third column, and run INSERT on a fourth column within the same database: GRANT \nSELECT (column1,column2), \nUPDATE(column3),\nINSERT (column4) \nON database_name\nTO user_name@host_name; Once you have finished granting database access to the new users, make sure to reload all the privileges by running: FLUSH PRIVILEGES; After that, your changes will take effect.
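As a concrete illustration of a limited, table-level grant, here is a hedged sketch that gives a reporting account read-only access to a single table (the account and the sakila.film_actor table are borrowed as placeholder names; adjust both to your schema):

```sql
-- Table-level, read-only access: the user can SELECT but not modify anything
GRANT SELECT
  ON sakila.film_actor
  TO 'report_user'@'localhost';

-- Reload the grant tables so the change takes effect immediately
FLUSH PRIVILEGES;
```

This follows the least-privilege principle mentioned above: start from no permissions and grant only what the account actually needs.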
Show All MySQL User Account Privileges To display the privileges granted to MySQL user accounts, you need to apply the SHOW GRANTS command: SHOW GRANTS FOR user_name@host_name; The output of the command looks similar to the following: +---------------------------------------------------------------------------+\n| Grants for user_name@host_name |\n+---------------------------------------------------------------------------+\n| GRANT USAGE ON *.* TO 'user_name'@'host_name' |\n| GRANT ALL PRIVILEGES ON `database_name`.* TO 'user_name'@'host_name' |\n+---------------------------------------------------------------------------+\n2 rows in set (0.00 sec) Also, you can learn more about [how to create a list of privileges of MySQL database users](https://blog.devart.com/how-to-get-a-list-of-permissions-of-mysql-users.html) . How to Revoke All Privileges from User If you need to revoke privileges from the user account on a database, apply the syntax that is similar to the one you used when granting permissions: REVOKE permission_type\nON privilege_level\nFROM 'user_name'@'host_name'; So, if you, for example, intend to revoke all permissions from the user on a specific database, use the following: REVOKE ALL PRIVILEGES\nON database_name.*\nFROM 'user_name'@'host_name'; Remove User from MySQL If you decide to remove a MySQL user account, execute the DROP USER command through the command line: DROP USER [IF EXISTS] 'user_name'@'host_name' The command above will remove the user account together with all of its privileges. Manage Users and Privileges via dbForge Studio for MySQL While it is feasible to manage users and their privileges via a command-line interface of MySQL Server, it is a lot more convenient and secure to accomplish the same tasks with the help of a reliable database administration tool. 
The reason is that accurate management of user accounts plays an important role in database security, and since the number of such accounts can be large, it can get difficult for a DBA to keep stable control over them. Therefore, if you are looking to strengthen your database security and improve administration, you can use a professional [tool for easy management of MySQL user accounts](https://www.devart.com/dbforge/mysql/studio/securitymanager.html). dbForge Studio for MySQL is a universal solution for database development, management, and administration that has an integrated Security Manager tool. The Security Manager tool is designed to simplify administration and avoid errors. Within a handy graphical interface, you get the possibility to create, edit, and delete user accounts as well as grant or revoke privileges either at the global or object level with just a few clicks. To open Security Manager, go to the Administration tab of the Start page and select Manage Server Security: After that, the Security Manager window opens. It consists of two parts — the left one displays a list of all user accounts, and the right one allows you to enter and modify the user account's data. As can be seen above, the user account's data is divided into six groups placed on separate tabs. They are as follows: General — contains the name, host, password, and the maximum number of connections/queries/updates per hour related to the user account. Roles — contains roles that may be assigned to a user account (this refers to MariaDB connections only). Users — allows applying granted privileges of other users to the user account. Global Privileges — allows setting global privileges for the user account. Object Privileges — allows setting object privileges for the user account. SSL — contains SSL-connection options related to the user account. Create a User Account via Security Manager Creating a new user account within Security Manager is as easy as ABC.
Click the Create User button above the list of user accounts, and immediately start entering the necessary information and setting options on the corresponding tabs: On the General tab, enter the aforementioned user account parameters into the corresponding fields and click Save on the toolbar. That's all; you can now see the newly created account in the left part of the Security Manager window. Next time you need to modify any user's parameters, select the required user in the left part of the Security Manager window and effortlessly edit its parameters in the right part. Grant and Revoke Privileges via Security Manager With Security Manager, the management of user accounts becomes time-saving and efficient. Let's consider an example. Suppose you would like to grant some global privileges (e.g., Create and Insert) and object ones to the Justin@% user. Instead of going into the details of the GRANT statement and running the command via the command prompt, you can simply navigate to the tabs inside dbForge Studio for MySQL and select the necessary options. On the Users tab of Security Manager, you can choose to apply the privileges granted to other users with a simple mouse click in the corresponding checkbox: On the Global Privileges tab, select the appropriate checkboxes to grant the required global privileges, and clear the selection to revoke the privileges from the user account. In the example below, we choose to provide the Justin@% user with the CREATE and INSERT privileges that apply to all databases: Next, switch to the Object Privileges tab. Here, you can grant privileges at the object level. First, select the necessary schema on the left and expand the schema tree. Select the objects (such as tables, views, etc.) and specify the required privileges by selecting the checkboxes. At the bottom of the window, you can see a list of object privileges for the current user account.
Besides, you can click the Cancel icon to revoke the selected privileges or revoke all object privileges. As you can see in the screenshot below, the Justin@% user account obtains permission to run SELECT, INSERT, and UPDATE statements against the Actor table. Also, the enabled GRANT OPTION allows the user to grant permissions on this table to other users: You can save the changes immediately by clicking Save, or preview the script by clicking Script Changes to review all the changes: What's more, dbForge Studio for MySQL allows you to create a new user based on an existing one, thus saving you time and effort. To do this, right-click the user account and select Duplicate Object on the shortcut menu: Conclusion To sum up, we have provided a detailed overview of how to best manage users and privileges in MySQL. This information should be useful for beginners as well as experienced DBAs, as it contains basic information and useful tips on the efficient management of user accounts in MySQL databases. Use dbForge Studio for MySQL to bring your database security to a whole new level and significantly boost your [database administration](https://www.devart.com/dbforge/mysql/studio/database-administration.html).
Also, you can watch this video tutorial: Tags [create user account](https://blog.devart.com/tag/create-user-account) [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [grant privilege](https://blog.devart.com/tag/grant-privilege) [MySQL](https://blog.devart.com/tag/mysql) [revoke privilege](https://blog.devart.com/tag/revoke-privilege) [user account](https://blog.devart.com/tag/user-account) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url":
"https://blog.devart.com/how-to-create-a-pivot-table-in-sql-server.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) How to Create a Pivot Table in SQL Server By [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) November 23, 2023 [0](https://blog.devart.com/how-to-create-a-pivot-table-in-sql-server.html#respond) 1863 Working with databases involves dealing with a substantial amount of data. Every day, database administrators face the challenge of how to analyze and summarize this data. Pivot tables can assist in solving this “brainteaser” and improve the efficiency of data analysis and reporting. In this article, we are going to demonstrate how to easily create a pivot table with the help of the Pivot Table Designer feature delivered by dbForge Studio for SQL Server. Additionally, we'll show how to filter results and create a pivot chart. Contents What are the advantages of using pivot tables? Build a pivot table with Pivot Table Designer Filter results in a pivot table Create a chart based on values from a pivot table Conclusion What are the advantages of using pivot tables? A pivot table is a powerful tool for aggregating, organizing, grouping, and extracting insights from complex datasets. It allows you to structure large volumes of data and display them in a plain and easily understandable form. Let's review cases where pivot tables can be very efficient: Summarize data: With pivot tables, you're able to summarize data through SUM, COUNT, AVERAGE, etc. Restructure data: Pivot tables convert data from a long format (rows) into a wide format (columns). Facilitate cross-tabulation: You can compare data across different variables. Reduce manual work: Pivot tables automate the process of data reorganization and reduce the need for manual data manipulation. Establish consistent reporting: Pivot tables help arrange a standardized reporting format for different team members.
Interact directly: You can click, drag, and drop elements while creating a pivot table. The process is dynamic and simple. Filter data: For quick navigation through data values, pivot tables offer different types of filters. Present results in an organized manner: Pivot tables can be formatted and customized in appearance, which enhances the clarity of the displayed information. In summary, pivot tables come in handy when you work with extensive datasets and need to analyze and visualize this information more efficiently. These characteristics make pivot tables an indispensable tool for specialists in various fields. Build a pivot table with Pivot Table Designer Imagine we have a query that returns the names of salespeople, goods, brands, product categories, and total sales numbers for each salesperson.
SELECT
    CONCAT(s.first_name, ' ', s.last_name) AS sales_person
    ,c.category_name
    ,b.brand_name
    ,p.product_name
    ,o.order_status
    ,o.order_date
    ,FORMAT(oi.quantity * oi.list_price - oi.quantity * oi.list_price * oi.discount, '0.#0') AS total_sale
FROM sales.order_items oi
INNER JOIN sales.[orders] o
    ON oi.order_id = o.order_id
INNER JOIN production.products p
    ON oi.product_id = p.product_id
INNER JOIN production.brands b
    ON p.brand_id = b.brand_id
INNER JOIN production.categories c
    ON p.category_id = c.category_id
INNER JOIN sales.staffs s
    ON o.staff_id = s.staff_id;
However, there is one caveat: data in this raw form is hard to grasp and analyze. For a better understanding and visualization of the data, we need to convert it into a pivot table. To build the table, we’ll use the Pivot Table Designer provided in dbForge Studio for SQL Server. This feature offers a wide range of options for quick and easy table creation. Click the plus icon and select Pivot Table. The Pivot Table Designer will open. Click Refresh for the query columns to appear on the right. 
Drag the total_sale column and drop it into the Designer. As you can see, the total sales amount for all goods over the entire period has been calculated. Next, drag and drop the order_date (Year) column in a similar manner. As a result, we’ll see a breakdown of the total sales amount by year. Let’s make the data more informative by adding details about quarterly sales. Group all sales per quarter, and then drag the order_date (Quarter) column onto the Pivot Table Designer. Do you agree that it’s much easier to perceive information in such a configuration? If you want to see the sales details for a specific year, you can collapse the others. For this, click the icon shown in the screenshot. To add a breakdown by product categories, drag the category_name column into the Designer. We can also group results by brands. To do this, drag and drop the brand_name column. Filter results in a pivot table To quickly extract the required data, you can apply filters to any column in a pivot table. For example, to check sales data for a specific salesperson, drag and drop the sales_person column. Then click the filter icon and choose any name. Voila! Now you can clearly see what Layla Terrell has sold, for how much, and when. Note how easily data is visualized in the pivot table: everything is clear and straightforward. Now, let’s quickly identify Layla’s sales that amount to less than $2000. We don’t need to waste time visually searching for these sums. We’re going to use the conditional styles available in the Pivot Table Designer and highlight the desired values in red. To call the context menu, right-click the grid and select Conditional Styles. Select total_sale from Field, set Less for the Condition field, type 2000 in Value 1, and adjust the fore color. To save the settings, click OK. Perfect! That’s what we need. 
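For readers who prefer raw T-SQL, the same kind of summary the Designer builds interactively can be sketched with the PIVOT operator. The tables come from the query shown earlier in this article; the year list is an assumption (PIVOT requires the output columns to be spelled out), and CONVERT is used instead of FORMAT so the values stay numeric and can be summed:

```sql
-- A hand-written sketch, not output generated by the Studio:
-- total sales per salesperson, pivoted by (assumed) order years.
SELECT sales_person, [2016], [2017], [2018]
FROM (
    SELECT
        CONCAT(s.first_name, ' ', s.last_name) AS sales_person,
        YEAR(o.order_date) AS order_year,
        CONVERT(DECIMAL(10, 2),
            oi.quantity * oi.list_price * (1 - oi.discount)) AS total_sale
    FROM sales.order_items oi
    INNER JOIN sales.[orders] o ON oi.order_id = o.order_id
    INNER JOIN sales.staffs s ON o.staff_id = s.staff_id
) src
PIVOT (
    SUM(total_sale) FOR order_year IN ([2016], [2017], [2018])
) p;
```

Unlike the Designer, a static PIVOT like this must be edited by hand whenever a new year appears in the data.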
Create a chart based on values from a pivot table Additionally, it’s possible to represent data from a pivot table as a diagram. The Pivot Table Designer provides an extensive range of chart types, so you can create the specific diagram you need. There is also the Chart Wizard, which allows customizing literally everything related to your diagram. Let’s see how it works and how informative the data will look. We’ll create a Pie 3D chart based on Layla’s Grand Total sums. Click Show Chart. Select Pie 3D from Type and select the values in the Grand Total column. That’s it, a few clicks and the chart is ready! Conclusion Pivot tables are an essential tool for anyone working with large datasets. They enable users to analyze and present data in a more insightful and comprehensible format. As you may have noticed, creating a pivot table in dbForge Studio for SQL Server is a straightforward process, requiring minimal time and expertise. The user-friendly design of the Pivot Table Designer makes it accessible even to those new to pivot tables. Give it a try yourself: download [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) and start building your tables without writing complex queries! Tags [#create a pivot table](https://blog.devart.com/tag/create-a-pivot-table) [#dbForge SQL Studio for SQL Server](https://blog.devart.com/tag/dbforge-sql-studio-for-sql-server) [#pivot chart](https://blog.devart.com/tag/pivot-chart) [#pivot tables](https://blog.devart.com/tag/pivot-tables) [Hanna Khyzhnia](https://blog.devart.com/author/anna-lee) When writing articles, Hanna Khyzhnia follows two main rules: any technical information should be presented in a way that even a child could understand it, and the language of the text must be as simple and accessible as possible for users. She aims to help readers dive into details, make decisions, and find answers without unnecessary confusion. 
"} {"url": "https://blog.devart.com/how-to-create-a-query-in-one-shot.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How To Create SQL Query in One Shot By [dbForge Team](https://blog.devart.com/author/dbforge) August 31, 2010 [0](https://blog.devart.com/how-to-create-a-query-in-one-shot.html#respond) 6088 To retrieve information from a database, you need to execute a query. 
Usually, an ordinary [SQL editor](https://www.devart.com/dbforge/sql/studio/sql-editor.html) is used to create queries. To use such an editor, you have to remember the syntax of the SELECT statement and the names of tables and columns. Let’s use a visual instrument developed specifically for designing SQL queries and see that it’s much easier to create SQL queries visually than to type them in an editor. Task: It’s necessary to show the salaries of the employees of departments situated in different cities for the year 2008 in descending order. We will do this on a MySQL database. The process of creating this database was described in the How to: Create MySQL Database in One Shot article. You can Download MySQL Demo Database (or, for SQL Server, Download SQL Server Demo Database). Solution: Let’s create an empty document in [dbForge Query Builder for MySQL](https://www.devart.com/dbforge/mysql/querybuilder/) ( [dbForge Query Builder for SQL Server](https://www.devart.com/dbforge/sql/querybuilder/) ). After this, let’s drag tables from Database Explorer to the diagram; the order of tables during dragging doesn’t matter. As we can see, the application joins these tables automatically. Query Builder: Query Diagram Now let’s select the columns we need to get data from. Click the checkbox near the Loc column of the dept table on the diagram, and after that the SalAmount column of the sal table. You can see the selected columns on the Selection tab. Query Builder: Selection Tab Now let’s select the sum function on this tab in the column with aggregate functions for the SumAmount column. Query Builder: Aggregate Now it is necessary to set grouping by the Loc column, but the application has selected to group data by the Loc column automatically. Let’s make sure of that by going to the Group By tab. Query Builder: Group By Tab Now we should narrow the selection and keep only data for the year 2008 in the result. 
To do this, let’s go to the Where tab and click the button with the green plus on it. The “=” symbol should appear. Let’s click the first operand. Query Builder: Enter Value After this, the Operand Editor form should appear. Let’s select the Date and Time group from the Function list and double-click the year(date) function in the list. After that, let’s choose and double-click the SalDate column in the other list. Query Builder: Operand Editor Let’s close the form and click the second operand. Let’s enter 2008 there. Query Builder: Type Constant It’s time to execute the query we’ve created visually. To do this, let’s press F5. Query Builder: Query Result Now let’s look at the structure of the query we’ve created. To do this, let’s open Document Outline and expand all nodes. Query Builder: Document Outline Now let’s look at the text of the created query. To do this, let’s go to the Text tab. Query Builder: SELECT Query Conclusion: As we can see, using a visual tool for building SQL queries allows us to solve the task without going deep into the subtleties of SELECT statement syntax or the differences between MySQL and SQL Server dialects, to inspect the syntax of the created query, to shorten the data selection process, and to view the structure of the query as a tree. 
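The Text tab in the walkthrough above would show a query along these lines. This is a hypothetical reconstruction: the dept and sal table and column names come from the article, but the intermediate emp table and the join key names are assumptions, since the demo schema’s relationships are not spelled out here:

```sql
-- Sketch of the generated query (emp table and key names are assumed).
SELECT dept.Loc, SUM(sal.SalAmount) AS SumAmount
FROM dept
  INNER JOIN emp ON emp.DeptNo = dept.DeptNo
  INNER JOIN sal ON sal.EmpNo = emp.EmpNo
WHERE YEAR(sal.SalDate) = 2008
GROUP BY dept.Loc
ORDER BY SumAmount DESC;
```

Comparing a sketch like this with the Text tab is a good way to check that the visually built query matches your intent.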
Tags [MySQL](https://blog.devart.com/tag/mysql) [query builder](https://blog.devart.com/tag/query-builder) [SQL Server](https://blog.devart.com/tag/sql-server) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-create-a-view-in-mysql.html", "product_name": "Unknown", "content_type": "Blog", "content": "[MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Create a View in MySQL: Best Practices and Tips By [dbForge Team](https://blog.devart.com/author/dbforge) January 9, 2025 [0](https://blog.devart.com/how-to-create-a-view-in-mysql.html#respond) 41799 Looking for ways to simplify data access and enhance security? 
By creating views in MySQL, you can efficiently manage complex queries and control user access to sensitive data. Instead of working directly with raw tables, views offer a streamlined, consistent interface that helps businesses make the most of their data and make it more accessible with fewer risks. In this article, we’ll explore how to create a view in MySQL, delve into the intricacies of the MySQL CREATE VIEW syntax, learn how to create or replace a view in a MySQL database, and walk through some of the most common use cases to sharpen your database management skills. Table of contents What is a MySQL view? How to create a simple MySQL view with the CREATE VIEW statement How to create a view with JOINs to combine data from multiple tables How to update a MySQL view How to drop a MySQL view How to create a view in dbForge Studio for MySQL What is a MySQL view? A MySQL view is a virtual table that simplifies data access by storing a predefined query. It lets users retrieve data without directly accessing the underlying tables, improving security and abstraction. With MySQL views, you can use simplified queries that encapsulate complex SQL logic and make data retrieval smoother, restrict access to specific data to limit its exposure to certain team members, and boost overall readability. How to create a simple MySQL view with the CREATE VIEW statement The basic view syntax in MySQL is as follows:
CREATE VIEW [db_name.]view_name [(column_list)]
AS
    select-statement;
[db_name.]
is the name of the database where your view will be created; if not specified, the view will be created in the current database view_name is a unique name of the view you are creating [(column_list)] defines the optional list of columns that can be indicated in parentheses after the view name; by default, the list of columns is retrieved from the select list of the SELECT statement select-statement is a specified SELECT statement that can query data from tables or views Here is the simplest example. If we have a table called customers in our current database, and we would like to request a list of customers with the transaction dates of their orders, the script may look as follows:
CREATE VIEW transactions AS
    SELECT
        id_number,
        name,
        transaction_date
    FROM
        customers;
After we execute this statement, the transactions object will be available in Views. Now we can move on and execute a statement that selects all the fields in this view: SELECT * FROM transactions; The output will be a table containing three columns: id_number, name, and transaction_date. How to create a view with JOINs to combine data from multiple tables Our next example is somewhat more complicated since it involves multiple tables (there will be three in our case):
CREATE VIEW order_incomes AS
SELECT
    order_id,
    customer_name,
    SUM(ordered_quantity * product_price) total
FROM
    order_details
INNER JOIN orders USING (order_id)
INNER JOIN customers USING (customer_name)
GROUP BY order_id;
This view gives us information on order income per customer, grouped by order ID. For that purpose, we calculate the total income using the order_details table data and use the INNER JOIN clause to retrieve order IDs from the orders table and customer names from the customers table. Updatable and insertable views In MySQL, views can be classified as updatable or insertable, depending on the operations you want to perform on the underlying data. 
These types of views provide significant flexibility in how you manage data while offering a higher level of abstraction. Updatable views An updatable view allows you to change the underlying data using data manipulation statements (UPDATE, DELETE, and INSERT). The key requirement for a view to be updatable is a one-to-one relationship between the rows in the view and the rows in the underlying table. If a view includes complex SQL constructs like GROUP BY, DISTINCT, or UNION, it may become non-updatable because the relationship between the view and the table is no longer straightforward. If you want to create an updatable view, make sure it doesn’t feature: Aggregate functions like SUM() or COUNT() DISTINCT or GROUP BY clauses Non-dependent subqueries in the select list Additionally, note that in a join view, a single data change statement can modify only one underlying table. Insertable views An insertable view allows you to insert data into the underlying tables. For the view to be insertable, it must satisfy additional conditions: Reference simple columns, not expressions (e.g., col1 + 3) Feature only unique column names Include all columns of the base table that do not have default values Even if a view is updatable, it might not necessarily be insertable. For example, a view that contains an expression or derived column is not insertable because the expression cannot be directly mapped to a table column. Restrictions on views in MySQL While MySQL views offer powerful tools for simplifying queries and enhancing data abstraction, there are several restrictions to be aware of when working with them. These limitations impact the types of operations you can perform and the conditions under which a view can be used effectively. Number of tables in a view: A view can reference up to 61 tables. Exceeding this limit results in an error, so complex queries with multiple joins should be planned carefully. View processing and indexing: Views cannot have their own indexes; they rely on indexes from the underlying tables when using the merge algorithm. Views using the temptable algorithm do not benefit from indexing, which may lead to performance issues with large datasets. Subqueries and modifications: A view using the merge algorithm cannot modify a table that it selects from in a subquery; if processed with the temptable algorithm, this restriction is bypassed. Invalidated views: If an underlying table is altered or dropped, the view becomes invalid, but MySQL does not issue a warning until the view is queried. Use CHECK TABLE to verify view integrity. Updatability limitations: Not all views are updatable. Views containing aggregate functions, joins, or subqueries may not support INSERT, UPDATE, or DELETE. Privileges and backup issues: Users with CREATE VIEW and SELECT privileges may not view the definition unless they also have SHOW VIEW. This affects backups using tools like mysqldump. Aliases and length limitations: Column aliases in a view cannot exceed 64 characters. Long aliases may cause replication or backup issues; use shorter names to avoid errors. How to update a MySQL view If you need to update tables through views, you can use the INSERT, UPDATE, and DELETE statements to perform the corresponding operations with the rows of the underlying table. However, please note that in order to be updatable, your view must not include any of the following: Aggregate functions, e.g. 
MIN, MAX, COUNT, AVG, or SUM Such clauses as GROUP BY, DISTINCT, HAVING, UNION, or UNION ALL Left or outer JOINs Multiple references to any column of the base table Subqueries in the SELECT or WHERE clause referring to the table appearing in the FROM clause References to non-updatable views in the FROM clause References to non-literal values Now let’s create an updatable view called warehouse_details based on the warehouses table.
CREATE VIEW warehouse_details AS
    SELECT warehouse_id, phone, city
    FROM warehouses;
Now we can query data from this view: SELECT * FROM warehouse_details; Let’s say we want to change the phone number of the warehouse with the warehouse_id ’55’ through the warehouse_details view using the following UPDATE statement.
UPDATE warehouse_details
SET
    phone = '(555) 555-1234'
WHERE
    warehouse_id = 55;
Finally, we can check whether the change has been applied using the following query:
SELECT * FROM warehouse_details
WHERE
    warehouse_id = 55;
How to drop a MySQL view If we no longer need a certain view, we can delete it with a simple DROP statement: DROP VIEW warehouse_details; The view WITH CHECK OPTION clause The WITH CHECK OPTION clause in MySQL is a powerful feature that ensures data integrity when using updatable views. This clause restricts the rows that can be inserted or updated through a view, enforcing that only rows that satisfy the view’s SELECT statement can be modified. Let’s break down how the WITH CHECK OPTION clause works and its different configurations. For demonstration purposes, let’s create the following table with employee data (ID, department, and age columns): CREATE TABLE employees (id INT AUTO_INCREMENT PRIMARY KEY, department VARCHAR(50), age INT); Now, let’s create a parent view without the WITH CHECK OPTION, as follows:
CREATE VIEW dept_view
AS
SELECT
    *
FROM employees
WHERE department = 'HR';
When working with it, you can insert into dept_view without any restrictions. 
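To make the risk concrete, here is a small sketch, reusing the employees table and dept_view defined above, of an insert that contradicts the view’s filter but still succeeds because the view has no CHECK OPTION:

```sql
-- dept_view selects WHERE department = 'HR', but without
-- WITH CHECK OPTION it does not police writes:
INSERT INTO dept_view (department, age) VALUES ('IT', 40);  -- accepted

-- The row lands in the base table...
SELECT * FROM employees;   -- includes the new 'IT' row
-- ...yet is invisible through the view that inserted it.
SELECT * FROM dept_view;   -- returns only 'HR' rows
```

Rows that a view inserts but can no longer see are exactly what the WITH CHECK OPTION clause, described next, is designed to prevent.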
Now, let’s imagine that you want HR representatives to insert some data into the table, but you don’t want them to be able to edit other departments’ records. You can use the WITH CHECK OPTION to solve this case. When creating a new view ( strong_dept_view ), use the following WHERE condition:
CREATE VIEW strong_dept_view
AS
SELECT
    *
FROM employees
WHERE department = 'HR'
WITH CHECK OPTION;
As a result, through this view, it will be possible to insert only HR department rows. The WITH CHECK OPTION checks the WHERE condition and lets you insert only rows that match it. LOCAL and CASCADED options When a view in MySQL is defined in terms of another view, so that you have parent and child views, the WITH CHECK OPTION clause can be configured with two different keywords: LOCAL and CASCADED. These keywords determine how MySQL applies the check across hierarchically related views. Let’s explore how they differ. CASCADED CHECK OPTION If you want the view users to insert only rows that match both the WHERE condition from the parent view and the condition from the child view derived from it, you should opt for the CASCADED CHECK OPTION. For instance, let’s imagine that we have a child view derived from the parent view that had the WHERE department = ‘HR’ condition applied:
CREATE VIEW cascaded_under_strong
AS
SELECT
    *
FROM strong_dept_view
WHERE age <= 30
WITH CASCADED CHECK OPTION;
As a result, you’ll be able to insert only rows that satisfy both the age <= 30 (child) and department = ‘HR’ (parent) requirements. 
For example, you’ll be able to insert this row: INSERT INTO cascaded_under_strong (department, age) VALUES ('HR', 28); But these insertions don’t match both conditions, and thus they won’t be added: INSERT INTO cascaded_under_strong (department, age) VALUES ('HR', 35); INSERT INTO cascaded_under_strong (department, age) VALUES ('IT', 22); Generally, WITH CHECK OPTION and WITH CASCADED CHECK OPTION work the same: if you do not specify LOCAL CHECK OPTION for a view, it is treated as CASCADED by default. Also, if your parent view has conditions without a CHECK OPTION, and a CASCADED child view is built on top of it, the CASCADED option from the child view will still be applied to insertions, considering both the parent and child view conditions. E.g., here’s the view:
CREATE VIEW cascaded_check_view
AS
SELECT
    *
FROM dept_view
WHERE age <= 30
WITH CASCADED CHECK OPTION;
This view will allow inserting only rows that match both the department and age conditions: INSERT INTO cascaded_check_view (department, age) VALUES ('HR', 27); However, these insertions will fail because both conditions are applied: INSERT INTO cascaded_check_view (department, age) VALUES ('HR', 35); INSERT INTO cascaded_check_view (department, age) VALUES ('IT', 28); The child view inherits the parent view condition ( department = ‘HR’ ), and the insertions will be checked against it as well as against the conditions of the child view. LOCAL CHECK OPTION WITH LOCAL CHECK OPTION checks only the condition stated in the child view, disregarding the parent view conditions if the parent has no CHECK OPTION. Let’s see how it works with our parent view example that featured WHERE department = ‘HR’. 
As you use the LOCAL CHECK OPTION in the script, and you have no CHECK OPTION on the parent view, only one condition is checked (in our case, it’s going to be WHERE age <= 30 ):
CREATE VIEW local_check_view
AS
SELECT
    *
FROM dept_view
WHERE age <= 30
WITH LOCAL CHECK OPTION;
As a result, you will be able to insert any row that satisfies the age <= 30 condition, regardless of the department, e.g.: INSERT INTO local_check_view (department, age) VALUES ('Finance', 25); INSERT INTO local_check_view (department, age) VALUES ('HR', 30); However, when a row doesn’t meet the age condition, you’ll get an error: INSERT INTO local_check_view (department, age) VALUES ('IT', 40); However, if we use a parent view that has a CHECK OPTION, the child view with the LOCAL CHECK OPTION will consider it. E.g., here’s a child view based on strong_dept_view (which already contains the WITH CHECK OPTION):
CREATE VIEW local_under_strong
AS
SELECT
    *
FROM strong_dept_view
WHERE age <= 30
WITH LOCAL CHECK OPTION;
As you try inserting values that meet both of the conditions, there are no errors: INSERT INTO local_under_strong (department, age) VALUES ('HR', 25); However, whenever you try to insert values that do not satisfy either the age or the department condition, you will get errors, e.g.: INSERT INTO local_under_strong (department, age) VALUES ('HR', 35); INSERT INTO local_under_strong (department, age) VALUES ('IT', 22); As you can see, there is no difference between the LOCAL CHECK OPTION and the CASCADED CHECK OPTION applied to a child view if these views are based on parent views with a CHECK OPTION. To sum it up, the WITH CHECK OPTION clause is an essential tool for ensuring data integrity when working with updatable views in MySQL. Restricting inserts and updates using LOCAL and CASCADED views helps limit inserts to only those rows that meet the conditions of the view’s WHERE clause, preventing unintended changes to your data. 
Using CREATE OR REPLACE VIEW in MySQL As you learn how to create views in MySQL, you might encounter the CREATE OR REPLACE VIEW statement. It allows you to update an existing view by defining a new query. This is useful when you need to modify a view’s structure or data but want to maintain the same view name. Syntax for CREATE OR REPLACE VIEW The basic VIEW syntax in MySQL for CREATE OR REPLACE VIEW is as follows:
CREATE OR REPLACE VIEW view_name
AS
SELECT
    column1
    ,column2
FROM table_name
WHERE condition;
This command replaces an existing view with the same name if it exists or creates a new view if it doesn’t. Learn more about the CREATE OR REPLACE VIEW syntax and use case examples from an article on the [SQL CREATE VIEW Statement](https://blog.devart.com/sql-create-view-statement.html) . To sum it up, to manage data more efficiently, we can use the MySQL CREATE OR REPLACE VIEW statement to update our existing views without dropping them first. Common errors when creating views in MySQL Creating views in MySQL can be a straightforward process, but there are still some common mistakes that can occur and disrupt your workflow. These errors typically stem from syntax issues, permission problems, or misunderstandings about how views interact with underlying tables. Let’s explore some of these common errors and how to troubleshoot them. Syntax errors: Occur due to incorrect SQL structure, misplaced clauses, or missing keywords. Error: ERROR 1064 (42000): You have an error in your SQL syntax. Solution: Ensure correct syntax. Permission issues: Happen when the user lacks CREATE VIEW or SELECT privileges. Error: ERROR 1142 (42000): CREATE VIEW command denied. Solution: Grant the necessary privileges (GRANT CREATE VIEW, SELECT ON database_name.* TO ‘user’@’localhost’;). Non-updatable views: Views with aggregates, joins, or subqueries may not support UPDATE or INSERT. Error: ERROR 1351 (HY000): View ‘database.view_name’ is not updatable. Solution: Simplify the view structure or use triggers to handle updates. Incorrect column aliases: Aliases may conflict with existing names, exceed the 64-character limit, or reference unsupported data types. Error: ERROR 1170 (42000): BLOB/TEXT column ‘column_name’ used in key specification without a key length. Solution: Use unique aliases within length limits and ensure proper data types. Views with invalid references: A view becomes invalid if a referenced table or view is dropped or altered. Error: ERROR 1146 (42S02): Table ‘database_name.table_name’ doesn’t exist. Solution: Ensure all referenced tables/views exist, or recreate/update the view. Best practices for creating views in MySQL Creating views in MySQL can simplify complex queries and provide many advantages to businesses. However, there are still some essential tips that can help you ensure you handle MySQL views the right way. Let’s explore them. Keep views straightforward. Simplify view definitions to enhance performance and readability. Enhance security with views. Use views to control access, exposing only necessary data while restricting direct table access. Keep views up to date. Regularly review and modify views to align with schema updates or business rule changes. Minimize nesting. Avoid layering views within views to prevent unnecessary performance overhead. Optimize for efficiency. Complex views can slow down queries, so, if possible, use indexes and refine the underlying SQL for better performance. How to create a view in dbForge Studio for MySQL Now that we know the basic syntax, we need a tool that will help us manage our databases and views most effectively. We suggest you try [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) , a toolset that covers nearly any operation with MySQL databases you can think of. 
[Download a free trial](https://www.devart.com/dbforge/mysql/studio/download.html) , spend a couple of minutes installing it, and let’s get started. Once you are connected to your MySQL database ( [look here to see how it is done](https://docs.devart.com/studio-for-mysql/getting-started/connecting-to-db.html) ), you can create a view using one of the two following ways. The first way is writing and executing a query in a SQL document. Here dbForge Studio for MySQL [delivers context-sensitive code completion, automatic syntax check, code snippets, quick navigation through large scripts](https://www.devart.com/dbforge/mysql/studio/sql-coding.html) , and customizable formatting profiles. In other words, you get every feature you might need in a single convenient IDE. If you want to master this functionality with easy step-by-step guides, feel free to check the [Writing and Executing SQL Statements](https://docs.devart.com/studio-for-mysql/writing-and-executing-sql-statements/sql-doc-overview.html) section of our documentation. You will find everything there, from the creation of a new SQL document to the automated execution of your queries via the command-line interface. The second way is one of the most notable tools of dbForge Studio — [Query Builder](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html) . It presents your queries visually as diagrams, generates the abovementioned JOINs, and enables the interactive building of the INSERT, UPDATE, and DELETE statements to update your views. Again, describing the workflow is more than this article can handle, but we have a special [Query Builder](https://docs.devart.com/studio-for-mysql/building-queries-with-query-builder/query-builder-overview.html) section in our documentation, where you can get detailed guides to building and managing visual diagrams. Conclusion dbForge Studio for MySQL is an all-encompassing IDE for database development, management, and administration. 
It offers the easiest ways of [building queries](https://www.devart.com/dbforge/mysql/studio/mysql-query-builder.html) , comparing and analyzing data, developing and debugging stored procedures, comparing and syncing database schemas, and much more. [Get a 30-day free trial](https://www.devart.com/dbforge/mysql/studio/download.html) of dbForge Studio for MySQL today and see how irreplaceable it can become for your daily operations. FAQ How do I create a view in MySQL? Use the CREATE VIEW statement to define a virtual table based on a SELECT query. Adjust the following syntax example with your database table values to create your first view: CREATE VIEW view_name\nAS\nSELECT\n column1\n ,column2\nFROM table_name\nWHERE condition; What is the CREATE VIEW command? The CREATE VIEW command defines a virtual table (view) based on a SELECT query, allowing users to simplify queries and restrict direct table access. What is the difference between a view and a table in MySQL? A table stores data physically, while a view is a virtual representation of a query result. Views do not store data but reflect changes in the underlying tables. How can I create a view in MySQL using dbForge Studio for MySQL? In dbForge Studio for MySQL, navigate to Database in the app’s main menu and proceed to New Database Object . In the New Object modal window, choose the database from the Location list. Then, choose View from the Type list and proceed to the visual View editor. How do I modify an existing MySQL view using CREATE OR REPLACE VIEW? Use CREATE OR REPLACE VIEW to update a view’s definition without dropping and recreating it. Here’s the syntax example you can build upon to perform this action: CREATE OR REPLACE VIEW view_name\nAS\nSELECT\n column1\n ,column2\nFROM table_name\nWHERE condition; What will happen to the view if the table used in its selection query has been deleted? The view becomes invalid. MySQL does not warn you in advance, but an error occurs when querying the view.
Can I create views in MySQL that include data from multiple tables? Yes, you can use JOIN in a SELECT statement within the view definition to combine data from multiple tables. Tags [dbForge Studio for MySQL](https://blog.devart.com/tag/dbforge-studio-for-mysql) [MySQL](https://blog.devart.com/tag/mysql) [MySQL Tutorial](https://blog.devart.com/tag/mysql-tutorial) [query builder](https://blog.devart.com/tag/query-builder) [dbForge Team](https://blog.devart.com/author/dbforge) "} {"url": "https://blog.devart.com/how-to-create-custom-sql-server-replication-for-read-only-databases.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How
To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Create Custom SQL Server Replication for Read-only Databases By [dbForge Team](https://blog.devart.com/author/dbforge) March 19, 2020 [0](https://blog.devart.com/how-to-create-custom-sql-server-replication-for-read-only-databases.html#respond) 3979 Quite often, there’s a need to create a read-only replica of a SQL Server database. This might be required, for example, to separate analytical and operational tasks. Analytical queries put a high load on a database, and to reduce it, a replica of the primary database is created to serve read-only analytical queries. Usually, such read-only replicas can be created with built-in DBMS tools: [Log Shipping](https://docs.microsoft.com/en-us/sql/database-engine/log-shipping/configure-log-shipping-sql-server?view=sql-server-ver15) [SQL Server Replication](https://docs.microsoft.com/en-us/sql/relational-databases/replication/sql-server-replication?view=sql-server-ver15) [AlwaysOn Availability Groups](https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/overview-of-always-on-availability-groups-sql-server?view=sql-server-ver15) . But what if you don’t need the entire database but only a few tables from it? In this case, you can implement the replication yourself. And since data retrieval is the main goal, one-way (master-to-slave) database replication is enough. Several methods, including SSIS and .NET, can be used to perform that kind of replication. In this article, we will use the JobEmpl recruiting service database to demonstrate how to create database replication in the master-to-slave direction using T-SQL. Creating SQL Server replication in one direction using T-SQL To start with, let’s describe the main principle and the algorithm of this replication.
During every iteration, we need to compare the data in the selected tables between the Source and the Target databases. This means we need to introduce a unique surrogate key to compare the tables by. To speed up the comparison, we will also create an index on that key. In addition, every replicated table needs a computed column that calculates a CHECKSUM for each row. It is also important to process data in fixed portions, for example, a certain number of rows at a time (per iteration). Thus, we need to perform the following steps: On the source tables, create a REPL_GUID column and a unique REPL_GUID index on it to enforce a one-to-one relationship between source and destination tables. You should also create a computed CheckSumVal column that will calculate the [CHECKSUM](https://docs.microsoft.com/en-us/sql/t-sql/functions/checksum-transact-sql?view=sql-server-ver15) value for every row. Create a new destination (Target) database; in our example, it is JobEmplRead. Synchronize the schemas of the replicated tables across the Source and Target databases and remove all references to nonexistent objects. Disable foreign keys in the Target database. Run the replication and monitor how many rows differ between the Source and Target databases. Let’s now review each step in detail using the JobEmpl database that was created for hiring employees. Fig. 1 The schema of a job seekers database We only need to replicate the Employee and the JobHistory tables. Then, the first step of the algorithm can be performed with the help of the following script.
USE JobEmpl\nGO\n\nSET QUOTED_IDENTIFIER ON;\n \nDECLARE @src NVARCHAR(255) = N'JobEmpl';\nDECLARE @sch NVARCHAR(255) = N'dbo';\nDECLARE @sql NVARCHAR(MAX);\nDECLARE @name NVARCHAR(255);\nDECLARE @listcols NVARCHAR(MAX);\nCREATE TABLE #cols (\n [Name] NVARCHAR(255)\n);\n \nSELECT\n [Name] INTO #tbl\nFROM sys.tables\nWHERE [Name] IN (\nN'Employee',\nN'JobHistory'\n);\n \nDECLARE sql_cursor CURSOR LOCAL FOR SELECT\n [Name]\nFROM #tbl;\n \nOPEN sql_cursor;\n \nFETCH NEXT FROM sql_cursor\nINTO @name;\n \nWHILE (@@fetch_status = 0)\nBEGIN\n DELETE FROM #cols;\n \n SET @sql = N'SET QUOTED_IDENTIFIER ON; select N''COALESCE(CAST([''+col.[name]+N''] AS NVARCHAR(MAX)), N'''''''')'' ' +\n N'from [' + @src + N'].sys.columns as col ' +\n N'inner join [' + @src + N'].sys.tables as tbl on col.[object_id]=tbl.[object_id] ' +\n N'where tbl.[name]=''' + @name + ''' and col.[is_identity]=0';\n \n INSERT INTO #cols ([Name])\n EXEC sys.sp_executesql @sql;\n \n SET @listcols = N'';\n \n SELECT\n @listcols = @listcols + CAST([Name] AS NVARCHAR(MAX)) + N'+ '\n FROM #cols;\n \n SET @listcols = SUBSTRING(@listcols, 1, LEN(@listcols) - 1);\n \n SET @sql=N'SET QUOTED_IDENTIFIER ON; ALTER TABLE ['+@sch+N'].['+@name+N'] ADD [CheckSumVal] AS CHECKSUM('+@listcols+N');'\n \n --PRINT @sql;\n EXEC sys.sp_executesql @sql;\n \n SET @sql=N'SET QUOTED_IDENTIFIER ON; ALTER TABLE [dbo].['+@name+N'] ADD [REPL_GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT ['+@name+N'_DEF_REPL_GUID] DEFAULT (NEWSEQUENTIALID());';\n \n --PRINT @sql;\n EXEC sys.sp_executesql @sql;\n \n SET @sql=N'SET QUOTED_IDENTIFIER ON; CREATE UNIQUE NONCLUSTERED INDEX [indREPL_GUID] ON [dbo].['+@name+N']([REPL_GUID] ASC);';\n \n --PRINT @sql;\n EXEC sys.sp_executesql @sql;\n \n FETCH NEXT FROM sql_cursor\n INTO @name;\nEND\n \nCLOSE sql_cursor;\nDEALLOCATE sql_cursor;\n \nDROP TABLE #cols;\n \nDROP TABLE #tbl; From the script, you can see that it has to be run on the source JobEmpl database and you should specify the source 
database and the schema in the @src and @sch variables accordingly. The @sql variable is required for building dynamic SQL, while @name stores the name of the replicated table. First, we collect the names of the replicated tables into the temporary #tbl table. Next, we go through every table name with a cursor and fetch it into the @name variable. After that, for every table, a list of its non-IDENTITY columns is formed, and the result is inserted into the @listcols variable with the “+” sign. It’s worth mentioning that every column value is first converted with the CAST function to the NVARCHAR(MAX) type, and then wrapped in the COALESCE function (COALESCE(..., N’’)). This is done to form a single string from all the column values of every row. Next, the computed CheckSumVal column, the REPL_GUID column, and its unique indREPL_GUID index are created. In our case, we got the following script. SET QUOTED_IDENTIFIER ON;\nALTER TABLE [dbo].[Employee]\nADD [CheckSumVal] AS CHECKSUM(COALESCE(CAST([FirstName] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([LastName] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([Address] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([CheckSumVal] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([REPL_GUID] AS NVARCHAR(MAX)), N''));\n\nSET QUOTED_IDENTIFIER ON;\nALTER TABLE [dbo].[Employee]\nADD [REPL_GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [Employee_DEF_REPL_GUID] DEFAULT (NEWSEQUENTIALID());\n\nSET QUOTED_IDENTIFIER ON;\nCREATE UNIQUE NONCLUSTERED INDEX [indREPL_GUID] ON [dbo].[Employee]([REPL_GUID] ASC);\n\nSET QUOTED_IDENTIFIER ON;\nALTER TABLE [dbo].[JobHistory]\nADD [CheckSumVal] AS CHECKSUM(COALESCE(CAST([EmployeeID] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([CompanyID] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([PositionID] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([ProjectID] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([StartDate] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([FinishDate] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([Description] AS 
NVARCHAR(MAX)), N'')+ COALESCE(CAST([Achievements] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([ReasonsForLeavingTheProject] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([ReasonsForLeavingTheCompany] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([CheckSumVal] AS NVARCHAR(MAX)), N'')+ COALESCE(CAST([REPL_GUID] AS NVARCHAR(MAX)), N''));\n\nSET QUOTED_IDENTIFIER ON;\nALTER TABLE [dbo].[JobHistory] ADD [REPL_GUID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [JobHistory_DEF_REPL_GUID] DEFAULT (NEWSEQUENTIALID());\n\nSET QUOTED_IDENTIFIER ON;\nCREATE UNIQUE NONCLUSTERED INDEX [indREPL_GUID] ON [dbo].[JobHistory]([REPL_GUID] ASC); You can later delete the created columns and indexes from the databases with the help of the following script. DECLARE @name NVARCHAR(255);\nDECLARE @sql NVARCHAR(MAX);\n\nCREATE TABLE #tbl (\n\t[name] NVARCHAR(255)\n);\n\nINSERT INTO #tbl ([name])\n\tSELECT\n\t\t[name]\n\tFROM sys.tables\n\tWHERE [name] IN (\n\tN'Employee',\n\tN'JobHistory'\n\t);\n\nDECLARE sql_cursor CURSOR LOCAL FOR SELECT\n\t[name]\nFROM #tbl;\n\nOPEN sql_cursor;\n\nFETCH NEXT FROM sql_cursor\nINTO @name;\n\nWHILE (@@fetch_status = 0)\nBEGIN\n\tSET @sql = N'DROP INDEX [indREPL_GUID] ON [dbo].[' + @name + N'];';\n\t\n\t--print @sql\n\tEXEC sys.sp_executesql @sql;\n\t\n\tSET @sql = N'ALTER TABLE [dbo].[' + @name + N'] DROP CONSTRAINT [' + @name + N'_DEF_REPL_GUID], COLUMN [CheckSumVal], COLUMN [REPL_GUID];';\n\t\n\t--print @sql\n\tEXEC sys.sp_executesql @sql;\n\t\n\tFETCH NEXT FROM sql_cursor\n\tINTO @name;\nEND\n\nCLOSE sql_cursor;\nDEALLOCATE sql_cursor;\n\nDROP TABLE #tbl; Here, the same replicated tables are processed, and for each of them the indREPL_GUID index, as well as the REPL_GUID and CheckSumVal columns, are deleted. In our case, the following T-SQL code was created.
DROP INDEX [indREPL_GUID] ON [dbo].[Employee];\nALTER TABLE [dbo].[Employee] DROP CONSTRAINT [Employee_DEF_REPL_GUID], COLUMN [CheckSumVal], COLUMN [REPL_GUID];\n\nDROP INDEX [indREPL_GUID] ON [dbo].[JobHistory];\nALTER TABLE [dbo].[JobHistory] DROP CONSTRAINT [JobHistory_DEF_REPL_GUID], COLUMN [CheckSumVal], COLUMN [REPL_GUID]; Let’s now create a new JobEmplRead database for receiving data, according to the second step of the algorithm mentioned above. Then, we synchronize the schemas of the replicated tables. To perform the synchronization, use the dbForge Schema Compare tool: select JobEmpl as the data source and JobEmplRead as the data target. Fig. 2 Databases selection for schema synchronization Then click Compare . Once the metadata creation process for the comparison is done, select the required tables and start configuring the database synchronization process. Fig. 3 Selecting tables for schema synchronization Next, we select the default value – script generation. Fig. 4 Selecting script generation as a synchronization output Let’s now clear the backup creation option. Fig. 5 Unselecting the backup creation option Next, we uncheck all the dependencies, as we don’t need to create other objects. We will later delete the foreign keys manually in the generated schema synchronization script. Fig. 6 Unselecting all the dependencies Now, click Synchronize and ignore the warnings on the Summary tab. Fig. 7 Warnings Delete the following foreign keys in the generated script: FK_JobHistory_Company_CompanyID FK_JobHistory_Position_PositionID FK_JobHistory_Project_ProjectID We need to do this because we didn’t transfer the Company , Position , and Project tables. As a result, we got a script for moving the replicated table schemas.
SET CONCAT_NULL_YIELDS_NULL, ANSI_NULLS, ANSI_PADDING, QUOTED_IDENTIFIER, ANSI_WARNINGS, ARITHABORT, XACT_ABORT ON\nSET NUMERIC_ROUNDABORT, IMPLICIT_TRANSACTIONS OFF\nGO\n\nUSE [JobEmplRead]\nGO\n\nIF DB_NAME() <> N'JobEmplRead' SET NOEXEC ON\nGO\n\n\n--\n-- Set transaction isolation level\n--\nSET TRANSACTION ISOLATION LEVEL SERIALIZABLE\nGO\n\n--\n-- Start Transaction\n--\nBEGIN TRANSACTION\nGO\n\n--\n-- Create table [dbo].[Employee]\n--\nCREATE TABLE [dbo].[Employee] (\n [EmployeeID] [int] IDENTITY,\n [FirstName] [nvarchar](255) NOT NULL,\n [LastName] [nvarchar](255) NOT NULL,\n [Address] [nvarchar](max) NULL,\n [CheckSumVal] AS (checksum((coalesce(CONVERT([nvarchar](max),[FirstName]),N'')+coalesce(CONVERT([nvarchar](max),[LastName]),N''))+coalesce(CONVERT([nvarchar](max),[Address]),N''))),\n [REPL_GUID] [uniqueidentifier] NOT NULL CONSTRAINT [Employee_DEF_REPL_GUID] DEFAULT (newsequentialid()) ROWGUIDCOL,\n CONSTRAINT [PK_Employee_EmployeeID] PRIMARY KEY CLUSTERED ([EmployeeID])\n)\nON [PRIMARY]\nTEXTIMAGE_ON [PRIMARY]\nGO\nIF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END\nGO\n\n--\n-- Create index [indREPL_GUID] on table [dbo].[Employee]\n--\nCREATE UNIQUE INDEX [indREPL_GUID]\n ON [dbo].[Employee] ([REPL_GUID])\n WITH (FILLFACTOR = 80)\n ON [PRIMARY]\nGO\nIF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END\nGO\n\n--\n-- Create table [dbo].[JobHistory]\n--\nCREATE TABLE [dbo].[JobHistory] (\n [EmployeeID] [int] NOT NULL,\n [CompanyID] [int] NOT NULL,\n [PositionID] [int] NOT NULL,\n [ProjectID] [int] NOT NULL,\n [StartDate] [date] NOT NULL,\n [FinishDate] [date] NULL,\n [Description] [nvarchar](max) NOT NULL,\n [Achievements] [nvarchar](max) NULL,\n [ReasonsForLeavingTheProject] [nvarchar](max) NULL,\n [ReasonsForLeavingTheCompany] [nvarchar](max) NULL,\n [CheckSumVal] AS 
(checksum(((((((((coalesce(CONVERT([nvarchar](max),[EmployeeID]),N'')+coalesce(CONVERT([nvarchar](max),[CompanyID]),N''))+coalesce(CONVERT([nvarchar](max),[PositionID]),N''))+coalesce(CONVERT([nvarchar](max),[ProjectID]),N''))+coalesce(CONVERT([nvarchar](max),[StartDate]),N''))+coalesce(CONVERT([nvarchar](max),[FinishDate]),N''))+coalesce(CONVERT([nvarchar](max),[Description]),N''))+coalesce(CONVERT([nvarchar](max),[Achievements]),N''))+coalesce(CONVERT([nvarchar](max),[ReasonsForLeavingTheProject]),N''))+coalesce(CONVERT([nvarchar](max),[ReasonsForLeavingTheCompany]),N''))),\n [REPL_GUID] [uniqueidentifier] NOT NULL CONSTRAINT [JobHistory_DEF_REPL_GUID] DEFAULT (newsequentialid()) ROWGUIDCOL,\n CONSTRAINT [PK_JobHistory] PRIMARY KEY CLUSTERED ([EmployeeID], [CompanyID], [PositionID], [ProjectID])\n)\nON [PRIMARY]\nTEXTIMAGE_ON [PRIMARY]\nGO\nIF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END\nGO\n\n--\n-- Create index [indREPL_GUID] on table [dbo].[JobHistory]\n--\nCREATE UNIQUE INDEX [indREPL_GUID]\n ON [dbo].[JobHistory] ([REPL_GUID])\n WITH (FILLFACTOR = 80)\n ON [PRIMARY]\nGO\nIF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END\nGO\n\n--\n-- Create foreign key [FK_JobHistory_Employee_EmployeeID] on table [dbo].[JobHistory]\n--\nALTER TABLE [dbo].[JobHistory] WITH NOCHECK\n ADD CONSTRAINT [FK_JobHistory_Employee_EmployeeID] FOREIGN KEY ([EmployeeID]) REFERENCES [dbo].[Employee] ([EmployeeID])\nGO\nIF @@ERROR<>0 OR @@TRANCOUNT=0 BEGIN IF @@TRANCOUNT>0 ROLLBACK SET NOEXEC ON END\nGO\n\n--\n-- Commit Transaction\n--\nIF @@TRANCOUNT>0 COMMIT TRANSACTION\nGO\n\n--\n-- Set NOEXEC to off\n--\nSET NOEXEC OFF\nGO Run this script in the JobEmplRead database. Thus, we’ve completed step 3 of our algorithm: synchronized tables schema across the JobEmpl and JobEmplRead databases and deleted all references to nonexistent objects. Let’s use the following script for monitoring. 
DECLARE @src NVARCHAR(255) = N'JobEmpl';\nDECLARE @trg NVARCHAR(255) = N'JobEmplRead';\nDECLARE @sch NVARCHAR(255) = N'dbo';\nDECLARE @sql NVARCHAR(MAX);\nDECLARE @name NVARCHAR(255);\n\nCREATE TABLE #res (\n\t[TblName] NVARCHAR(255)\n ,[Count] INT\n);\n\nSELECT\n\t[name] INTO #tbl\nFROM sys.tables\nWHERE [name] IN (\nN'Employee',\nN'JobHistory'\n);\n\nDECLARE sql_cursor CURSOR LOCAL FOR SELECT\n\t[name]\nFROM #tbl;\n\nOPEN sql_cursor;\n\nFETCH NEXT FROM sql_cursor\nINTO @name;\n\nWHILE (@@fetch_status = 0)\nBEGIN\n\tSET @sql = N'SELECT ''' + @name + N''' AS [TblName], COUNT(*) as [Count] ' +\n\tN'FROM [' + @src + N'].[' + @sch + N'].[' + @name + N'] AS src WITH(READUNCOMMITTED) FULL OUTER JOIN ' +\n\tN'[' + @trg + N'].[' + @sch + N'].[' + @name + N'] AS trg WITH(READUNCOMMITTED) ON src.[REPL_GUID]=trg.[REPL_GUID] ' +\n\tN'WHERE (src.[REPL_GUID] IS NULL) OR (trg.[REPL_GUID] IS NULL) OR (src.[CheckSumVal]<>trg.[CheckSumVal])';\n\t\n\t--print @sql;\n\t\n\tINSERT INTO #res ([TblName], [Count])\n\tEXEC sys.sp_executesql @sql;\n\t\n\tFETCH NEXT FROM sql_cursor\n\tINTO @name;\nEND\n\nCLOSE sql_cursor;\nDEALLOCATE sql_cursor;\n\nDROP TABLE #tbl;\n\nSELECT\n\t*\nFROM #res\nORDER BY [TblName] ASC;\n\nDROP TABLE #res; This script builds FULL OUTER JOIN queries and returns the list of tables together with the number of differing rows, including rows that are missing on either side. In our case, we get the following result. Fig. 8 The number of differing rows in the replicated tables The following script was generated for the comparison.
SELECT 'Employee' AS [TblName], COUNT(*) as [Count] \nFROM [JobEmpl].[dbo].[Employee] AS src WITH(READUNCOMMITTED) \nFULL OUTER JOIN [JobEmplRead].[dbo].[Employee] AS trg WITH(READUNCOMMITTED) ON src.[REPL_GUID]=trg.[REPL_GUID] \nWHERE (src.[REPL_GUID] IS NULL) OR (trg.[REPL_GUID] IS NULL) OR (src.[CheckSumVal]<>trg.[CheckSumVal])\n\nSELECT 'JobHistory' AS [TblName], COUNT(*) as [Count] \nFROM [JobEmpl].[dbo].[JobHistory] AS src WITH(READUNCOMMITTED) \nFULL OUTER JOIN [JobEmplRead].[dbo].[JobHistory] AS trg WITH(READUNCOMMITTED) ON src.[REPL_GUID]=trg.[REPL_GUID] \nWHERE (src.[REPL_GUID] IS NULL) OR (trg.[REPL_GUID] IS NULL) OR (src.[CheckSumVal]<>trg.[CheckSumVal]) It’s worth noting that to reduce blockings, the transaction isolation level is Dirty Read. Let’s unite the 4th and the 5th steps of our algorithm into the following single script. USE [JobEmplRead]\nGO\n\nSET QUOTED_IDENTIFIER ON;\n\nDECLARE @count INT = 100000;\nDECLARE @src NVARCHAR(255) = N'JobEmpl';\nDECLARE @trg NVARCHAR(255) = N'JobEmplRead';\nDECLARE @sch NVARCHAR(255) = N'dbo';\nDECLARE @sql NVARCHAR(MAX);\nDECLARE @name NVARCHAR(255);\nDECLARE @upd_listcols NVARCHAR(MAX);\nDECLARE @ins_listcols NVARCHAR(MAX);\nDECLARE @listcols NVARCHAR(MAX);\nCREATE TABLE #cols (\n\t[Name] NVARCHAR(255)\n);\nCREATE TABLE #fk_list (\n\t[TblName] NVARCHAR(255)\n ,[Name] NVARCHAR(255)\n);\n\nDECLARE @ParmDefinition NVARCHAR(500);\n\nSELECT\n\t[Name] INTO #tbl\nFROM sys.tables\nWHERE [Name] IN (\nN'Employee',\nN'JobHistory'\n);\n\nINSERT INTO #fk_list ([TblName], [Name])\n\tSELECT\n\t\tt.[Name]\n\t ,fk.[Name]\n\tFROM sys.foreign_keys AS fk\n\tINNER JOIN sys.tables AS tbl\n\t\tON fk.[parent_object_id] = tbl.[object_id]\n\tINNER JOIN #tbl AS t\n\t\tON t.[Name] = tbl.[Name];\n\n--select * from #fk_list;\n\nIF (EXISTS (SELECT TOP (1)\n\t\t\t1\n\t\tFROM #fk_list)\n\t)\nBEGIN\n\tSELECT\n\t\tN'SET QUOTED_IDENTIFIER ON; ALTER TABLE [' + [TblName] + N'] NOCHECK CONSTRAINT [' + [Name] + N']; ' AS [Script] INTO 
#script_fk_off\n\tFROM #fk_list;\n\n\t--select *\n\t--from #script_fk_off;\n\n\tDECLARE sql_cursor0 CURSOR LOCAL FOR SELECT\n\t\t[Script]\n\tFROM #script_fk_off;\n\n\tOPEN sql_cursor0;\n\n\tFETCH NEXT FROM sql_cursor0\n\tINTO @sql;\n\n\tWHILE (@@fetch_status = 0)\n\tBEGIN\n\t--print @sql;\n\n\tEXEC sys.sp_executesql @sql;\n\n\tFETCH NEXT FROM sql_cursor0\n\tINTO @sql;\n\tEND\n\n\tCLOSE sql_cursor0;\n\tDEALLOCATE sql_cursor0;\n\n\tDROP TABLE #script_fk_off;\nEND\n\nDECLARE sql_cursor CURSOR LOCAL FOR SELECT\n\t[Name]\nFROM #tbl;\n\nOPEN sql_cursor;\n\nFETCH NEXT FROM sql_cursor\nINTO @name;\n\nWHILE (@@fetch_status = 0)\nBEGIN\n\tDELETE FROM #cols;\n\t\n\tSET @sql = N'SET QUOTED_IDENTIFIER ON; select N''[''+col.[name]+N'']'' ' +\n\tN'from [' + @src + N'].sys.columns as col ' +\n\tN'inner join [' + @src + N'].sys.tables as tbl on col.[object_id]=tbl.[object_id] ' +\n\tN'where tbl.[name]=''' + @name + ''' and col.[is_identity]=0 and (col.[name] not in (''CheckSumVal'', ''REPL_GUID''))';--+''' and [is_identity]=0';\n\t\n\tINSERT INTO #cols ([Name])\n\tEXEC sys.sp_executesql @sql;\n\t\n\tSET @upd_listcols = N'';\n\t\n\tSELECT\n\t\t@upd_listcols = @upd_listcols + N'trg.' + CAST([Name] AS NVARCHAR(MAX)) + N' = src.' 
+ CAST([Name] AS NVARCHAR(MAX)) + N', '\n\tFROM #cols;\n\t\n\tSET @upd_listcols = SUBSTRING(@upd_listcols, 1, LEN(@upd_listcols) - 1);\n\t\n\tDELETE FROM #cols;\n\t\n\tSET @sql = N'SET QUOTED_IDENTIFIER ON; select N''[''+col.[name]+N'']'' ' +\n\tN'from [' + @src + N'].sys.columns as col ' +\n\tN'inner join [' + @src + N'].sys.tables as tbl on col.[object_id]=tbl.[object_id] ' +\n\tN'where tbl.[name]=''' + @name + ''' and (col.[name] <> ''CheckSumVal'')';--+''' and [is_identity]=0';\n\t\n\tINSERT INTO #cols ([Name])\n\tEXEC sys.sp_executesql @sql;\n\t\n\tSET @listcols = N'';\n\t\n\tSELECT\n\t\t@listcols = @listcols + CAST([Name] AS NVARCHAR(MAX)) + N', '\n\tFROM #cols;\n\t\n\tSET @listcols = SUBSTRING(@listcols, 1, LEN(@listcols) - 1);\n\t\n\tDELETE FROM #cols;\n\t\n\tSET @sql = N'SET QUOTED_IDENTIFIER ON; select N''src.[''+col.[name]+N'']'' ' +\n\tN'from [' + @src + N'].sys.columns as col ' +\n\tN'inner join [' + @src + N'].sys.tables as tbl on col.[object_id]=tbl.[object_id] ' +\n\tN'where tbl.[name]=''' + @name + ''' and (col.[name] <> ''CheckSumVal'')';--+''' and [is_identity]=0';\n\t\n\tINSERT INTO #cols ([Name])\n\tEXEC sys.sp_executesql @sql;\n\t\n\tSET @ins_listcols = N'';\n\t\n\tSELECT\n\t\t@ins_listcols = @ins_listcols + CAST([Name] AS NVARCHAR(MAX)) + N', '\n\tFROM #cols;\n\t\n\tSET @ins_listcols = SUBSTRING(@ins_listcols, 1, LEN(@ins_listcols) - 1);\n\t\n\tSET @ParmDefinition = N'@count int';\n\t\n\tSET @sql = N'SET QUOTED_IDENTIFIER ON; DECLARE @is_identity BIT = 0;\n\t \n\t declare @tbl_id int;\n\t \n\t SELECT TOP (1)\n\t @tbl_id = [object_id]\n\t FROM [' + @trg + N'].sys.objects\n\t WHERE [name] = ''' + @name + N''';\n\t \n\t SET @is_identity =\n\t CASE\n\t WHEN (EXISTS (SELECT TOP (1)\n\t 1\n\t FROM [' + @trg + N'].sys.columns\n\t WHERE [object_id] = @tbl_id\n\t AND [is_identity] = 1)\n\t ) THEN 1\n\t ELSE 0\n\t END;\n\t \n\t IF (@is_identity = 1) SET IDENTITY_INSERT [' + @trg + N'].[dbo].[' + @name + N'] ON;\n\t \n\t --BEGIN TRAN\n\t \n\t ;MERGE TOP 
(@count)\n\t [' + @trg + N'].[dbo].[' + @name + N'] AS trg\n\t USING [' + @src + N'].[dbo].[' + @name + N'] AS src\n\t ON src.[REPL_GUID] = trg.[REPL_GUID]\n\t WHEN MATCHED AND (src.[CheckSumVal]<>trg.[CheckSumVal])\n\t THEN UPDATE\n\t SET ' + @upd_listcols + N'\n\t WHEN NOT MATCHED BY TARGET\n\t THEN INSERT (' + @listcols + N')\n\t VALUES (' + @ins_listcols + N')\n\t WHEN NOT MATCHED BY SOURCE\n\t THEN DELETE;\n\t \n\t --ROLLBACK TRAN\n\t \n\t IF (@is_identity = 1) SET IDENTITY_INSERT [' + @trg + N'].[dbo].[' + @name + N'] OFF;';\n\t\n\t--PRINT @sql;\n\t\n\t--begin tran\n\tEXEC sys.sp_executesql @sql\n\t\t\t\t\t\t ,@ParmDefinition\n\t\t\t\t\t\t ,@count = @count;\n\t--rollback tran\n\t\n\tFETCH NEXT FROM sql_cursor\n\tINTO @name;\nEND\n\nCLOSE sql_cursor;\nDEALLOCATE sql_cursor;\n\nDROP TABLE #cols;\n\nDROP TABLE #fk_list;\n\nDROP TABLE #tbl; First, all foreign keys for the replicated table are disabled in the JobEmplRead database. Then with the MERGE statements data is copied in portions. In our case, we have 100 000 rows per iteration. This script comprises a single iteration and executes the following T-SQL code. 
SET QUOTED_IDENTIFIER ON; \nALTER TABLE [JobHistory] NOCHECK CONSTRAINT [FK_JobHistory_Employee_EmployeeID];\n\nSET QUOTED_IDENTIFIER ON; DECLARE @is_identity BIT = 0;\n\t \n\t declare @tbl_id int;\n\t \n\t SELECT TOP (1)\n\t @tbl_id = [object_id]\n\t FROM [JobEmplRead].sys.objects\n\t WHERE [name] = 'Employee';\n\t \n\t SET @is_identity =\n\t CASE\n\t WHEN (EXISTS (SELECT TOP (1)\n\t 1\n\t FROM [JobEmplRead].sys.columns\n\t WHERE [object_id] = @tbl_id\n\t AND [is_identity] = 1)\n\t ) THEN 1\n\t ELSE 0\n\t END;\n\t \n\t IF (@is_identity = 1) SET IDENTITY_INSERT [JobEmplRead].[dbo].[Employee] ON;\n\t \n\t --BEGIN TRAN\n\t \n\t ;MERGE TOP (@count)\n\t [JobEmplRead].[dbo].[Employee] AS trg\n\t USING [JobEmpl].[dbo].[Employee] AS src\n\t ON src.[REPL_GUID] = trg.[REPL_GUID]\n\t WHEN MATCHED AND (src.[CheckSumVal]<>trg.[CheckSumVal])\n\t THEN UPDATE\n\t SET trg.[FirstName] = src.[FirstName], trg.[LastName] = src.[LastName], trg.[Address] = src.[Address]\n\t WHEN NOT MATCHED BY TARGET\n\t THEN INSERT ([EmployeeID], [FirstName], [LastName], [Address], [REPL_GUID])\n\t VALUES (src.[EmployeeID], src.[FirstName], src.[LastName], src.[Address], src.[REPL_GUID])\n\t WHEN NOT MATCHED BY SOURCE\n\t THEN DELETE;\n\t \n\t --ROLLBACK TRAN\n\t \n\t IF (@is_identity = 1) SET IDENTITY_INSERT [JobEmplRead].[dbo].[Employee] OFF; This script should be run automatically at a predefined interval. For example, it can be run every minute or even more frequently, depending on the analytics needs. The number of differing rows should decrease after several iterations. Fig. 9 The change in the number of differing rows Remember that to re-enable the foreign keys on the disabled tables, you should run the following script.
USE [JobEmplRead]\nGO\n\nDECLARE @sql NVARCHAR(MAX);\n\nCREATE TABLE #fk_list (\n\t[TblName] NVARCHAR(255)\n ,[Name] NVARCHAR(255)\n);\n\nSELECT\n\t[Name] INTO #tbl\nFROM sys.tables\nWHERE [Name] IN (\nN'Employee',\nN'JobHistory'\n);\n\nINSERT INTO #fk_list ([TblName], [Name])\n\tSELECT\n\t\tt.[Name]\n\t ,fk.[Name]\n\tFROM sys.foreign_keys AS fk\n\tINNER JOIN sys.tables AS tbl\n\t\tON fk.[parent_object_id] = tbl.[object_id]\n\tINNER JOIN #tbl AS t\n\t\tON t.[Name] = tbl.[Name];\n\n--select * from #fk_list;\n\nIF (EXISTS (SELECT TOP (1)\n\t\t\t1\n\t\tFROM #fk_list)\n\t)\nBEGIN\n\tSELECT\n\t\tN'ALTER TABLE [' + [TblName] + N'] CHECK CONSTRAINT [' + [Name] + N']; ' AS [Script] INTO #script_fk_on\n\tFROM #fk_list;\n\n\t--select *\n\t--from #script_fk_on;\n\n\tDECLARE sql_cursor0 CURSOR LOCAL FOR SELECT\n\t\t[Script]\n\tFROM #script_fk_on;\n\n\tOPEN sql_cursor0;\n\n\tFETCH NEXT FROM sql_cursor0\n\tINTO @sql;\n\n\tWHILE (@@fetch_status = 0)\n\tBEGIN\n\t--print @sql;\n\n\tEXEC sys.sp_executesql @sql;\n\n\tFETCH NEXT FROM sql_cursor0\n\tINTO @sql;\n\tEND\n\n\tCLOSE sql_cursor0;\n\tDEALLOCATE sql_cursor0;\n\n\tDROP TABLE #script_fk_on;\nEND\n\nDROP TABLE #fk_list;\n\nDROP TABLE #tbl; In our case, the following script will be generated and executed. ALTER TABLE [JobHistory] CHECK CONSTRAINT [FK_JobHistory_Employee_EmployeeID]; Keep in mind that you can’t enable foreign keys on the replicated tables while the replication is running until all data is copied. Conclusion We have reviewed one of the ways to implement the process of replicating tables in one direction from the source to the destination. This approach and scripts can be applied to any database. But of course, those scripts have to be modified depending on the specifics of the replicated tables. For example, modification might be required if the tables have calculated fields. [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) was the main tool to help me build those scripts. 
The tool also allows code formatting as well as renaming objects and all their references.

References [DbForge Schema Compare](https://www.devart.com/en/dbforge/sql/schemacompare/) [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) [SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/)

Tags [replication](https://blog.devart.com/tag/replication) [Schema Compare](https://blog.devart.com/tag/schema-compare) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [dbForge Team](https://blog.devart.com/author/dbforge)"} {"url": "https://blog.devart.com/how-to-create-customizable-crud-operations-with-sql-complete.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Products](https://blog.devart.com/category/products) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Create Customizable CRUD Operations with SQL Complete By [dbForge Team](https://blog.devart.com/author/dbforge) March 4, 2021

Working with SQL databases and tables usually implies performing daily data-related tasks using the CRUD operations to reuse, manipulate, and access data. For example, you can create a new table, modify it, populate it with data, retrieve and store data for future use, or delete it if needed. To expedite routine tasks, SQL Complete provides code templates for [CRUD operations in SQL](https://www.devart.com/dbforge/sql/sqlcomplete/crud-operations-in-sql.html) that can be customized to your needs. This article covers how to create custom CRUD procedure templates in SQL Server with the help of [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/).

Introduction

CRUD is an acronym that stands for CREATE, READ, UPDATE, and DELETE. In SQL Server, CRUD is represented by four operations performed on the selected data against a specific SQL database: CREATE refers to inserting columns and values into the table. READ refers to retrieving data from the table. UPDATE refers to modifying data in the table. DELETE refers to deleting data and records from the table.
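The four operations map directly onto SQL statements. A minimal illustration against a hypothetical Customers table (the table and column names are ours, not from the article):

```sql
-- CREATE: insert a new row
INSERT INTO Customers (CustomerID, Name) VALUES (1, 'Alice');

-- READ: retrieve rows
SELECT CustomerID, Name FROM Customers WHERE CustomerID = 1;

-- UPDATE: modify existing rows
UPDATE Customers SET Name = 'Alice Smith' WHERE CustomerID = 1;

-- DELETE: remove rows
DELETE FROM Customers WHERE CustomerID = 1;
```

The CRUD templates discussed below wrap statements like these into stored procedures, with placeholders for the schema, table, columns, and filter condition.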
Generating the CRUD Procedures in SQL Complete To begin with, we will review how SQL Complete generates the CRUD procedures. On the main menu, navigate to SQL Complete and click Options on the shortcut menu. In the dbForge SQL Complete: Options window that opens, switch to the CRUD tab, under which the CRUD procedure templates (Select, Insert, Update, and Delete) are located. On the General tab, you can use the following options: Select whether to include each of the CRUD procedures in the script. Arrange the order of columns either by ordinal number or alphabetically. Click Reset Page Defaults to discard the changes you have applied and restore the default settings. This option is also available on the Select, Insert, Update, and Delete tabs. Each code template for the Select, Insert, Update, and Delete procedures is customizable and contains placeholders for variables that can be replaced with actual values. By default, the name of the CRUD procedure ends with the name of the operation you are using. From now on, SQL Complete supports the --region and --endregion options that are automatically generated for the CRUD code templates. The tool allows setting a name for a region and expanding or collapsing regions in the CRUD procedures. Let's explore each of the CRUD operations in depth. CREATE CRUD – Insert SQL Server Procedure The Insert procedure can be used to add new rows and their values to the table. If you want to copy data into a target table from other SQL tables, use an INSERT INTO SELECT statement. In this case, the data types of both tables should match; otherwise, an error is returned. On the CRUD menu, switch to the Insert tab to view the code template for the procedure. Instead of the variable placeholders, you should specify the following information: $schema$ is the name of the schema to which the table belongs. $table$ is the name of the table for which the CRUD procedure is created. $columns$ is a list of columns you want to insert.
$values$ is a list of values to be inserted into the columns. $where$ is a search condition by which the rows to be returned are filtered. If you want to return the result set of the inserted row, select the Return Inserted row check box. READ CRUD – Select SQL Server Procedure The Select statement can be used to retrieve data or a set of records from the table based on the primary key within the input parameter. On the CRUD menu, switch to the Select tab to view how SQL Complete generates the procedure. In the Select statement, you can replace the following placeholders: $columns$ : Define the columns from which you want to retrieve data. $schema$ : Specify the schema to which the source table belongs. $table$ : Set the table from which you want to get data. $where$ : (Optional) Set a condition by which the result set will be filtered. If you want to get all records when the input parameters equal NULL, select the Return all data if input parameters are null check box. UPDATE CRUD – Update SQL Server Procedure The Update statement can be used to modify the data in the table. The code template is located under the CRUD > Update tab. In the Update statement, you need to specify the schema and the table for which you want to execute the CRUD operation. The statement includes the SET and WHERE clauses, where you need to indicate the columns to be modified and define the filters for the specific rows, respectively. If you want to return the updated row, select the Return updated row check box. DELETE CRUD – Delete SQL Server Procedure The Delete statement removes the row or rows specified in the WHERE clause of the statement. On the CRUD > Delete tab, view the code template for the Delete statement. The Delete statement contains the FROM and WHERE clauses: In the FROM clause, replace the $schema$ and $table$ placeholders with the name of the source schema and table from which the rows should be deleted.
In the WHERE clause, replace $where$ with a condition by which the rows to be deleted will be filtered. Generating a CRUD Procedure After you have created a custom CRUD template, you can generate a CRUD script for the table. To generate a table script as CRUD, in Object Explorer , right-click the table you need and select SQL Complete > Script Table as CRUD on the shortcut menu. A new SQL document opens, displaying the script matching the CRUD template you have created. Conclusion With SQL Complete, you can easily modify the code template for the CRUD procedures. The tool automatically adds the name of the operation you are working on and wraps the procedure code into named regions. To evaluate all the useful features SQL Complete provides, [download](https://www.devart.com/dbforge/sql/sqlcomplete/download.html) a free 30-day trial version of the tool. Watch Tutorial For more information about the key features of dbForge SQL Complete, take a look at our video tutorial: Tags [crud](https://blog.devart.com/tag/crud) [sql complete](https://blog.devart.com/tag/sql-complete) [SQL Server](https://blog.devart.com/tag/sql-server) [SQL Server Tutorial](https://blog.devart.com/tag/sql-server-tutorial) [t-sql](https://blog.devart.com/tag/t-sql) [dbForge Team](https://blog.devart.com/author/dbforge)
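To make the placeholder substitution concrete, here is roughly what a generated Select procedure might look like for a hypothetical dbo.Customers table. This is an illustrative sketch only; the table, columns, and exact layout come from us, and the real output follows whatever template you have configured:

```sql
CREATE PROCEDURE dbo.Customers_Select
    @CustomerID INT = NULL
AS
BEGIN
    SET NOCOUNT ON;

    -- $columns$, $schema$, $table$, and $where$ have been replaced
    -- with concrete values for this hypothetical table. The NULL check
    -- mirrors the "Return all data if input parameters are null" option.
    SELECT CustomerID, Name, Email
    FROM dbo.Customers
    WHERE (@CustomerID IS NULL OR CustomerID = @CustomerID);
END;
```

The procedure name ends with the operation name (Select), matching the default naming convention described above.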
"} {"url": "https://blog.devart.com/how-to-create-database-in-oracle.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [Oracle Tools](https://blog.devart.com/category/products/oracle-tools) How to Create New Oracle Database in 4 Different Ways By [dbForge Team](https://blog.devart.com/author/dbforge) October 28, 2021 There are several approaches to creating a new database in Oracle that include GUI tools, Oracle syntax, and [database migration](https://www.devart.com/dbforge/oracle/studio/oracle-database-migration-tool.html) . This article will describe the steps to create a database in Oracle: from the first preparation activities to using CREATE DATABASE . Here, you will find four ways to create a database in Oracle step by step. Two of them will be based on the command line: using the CREATE DATABASE command, as well as generating the CREATE DATABASE script from an existing DB.
The other two will describe how to use such GUI utilities as DBCA and Oracle SQL Developer. This tutorial is relevant for Oracle versions 12c, 11g, 10g, and higher.

Before creating an Oracle database

Before creating a [database in Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/), you will have to set an appropriate scene: install the standard PL/SQL packages and build views on the data dictionary tables. Without further ado, let us prepare the server for the Oracle database creation. Usually, OUI sets ORACLE_HOME and ORACLE_SID automatically in the Windows registry. However, if you installed Oracle without creating a database, you will have to configure ORACLE_SID manually.

Set the Instance Identifier (SID)

One of the most important environment variables is ORACLE_SID, which will be the name of the new Oracle database you are about to create. You can set it using the following command:

set ORACLE_SID=testdb

Create the initialization parameter file

The next step is creating the init.ora file. It will serve as the initialization file for the new database. Depending on your Oracle version, there might or might not be a sample init.ora file in ORACLE_HOME/dbs. If there is, you can use it as a template and edit the values according to your needs. The initialization file for the new database should follow the format INIT{ORACLE_SID}.ora; in our case, the file name will be inittestdb.ora. If there is no default init.ora template in ORACLE_HOME/dbs, use the following sample:

##############################################################################
# Example INIT.ORA file
#
# This file is provided by Oracle Corporation to help you start by providing
# a starting point to customize your RDBMS installation for your site.
#
# NOTE: The values that are used in this file are only intended to be used
# as a starting point. You may want to adjust/tune those values to your
# specific hardware and needs. You may also consider using Database
# Configuration Assistant tool (DBCA) to create INIT file and to size your
# initial set of tablespaces based on the user input.
###############################################################################

# Change '' to point to the oracle base (the one you specify at
# install time)

db_name='ORCL'
memory_target=1G
processes = 150
db_block_size=8192
db_domain=''
db_recovery_file_dest='/flash_recovery_area'
db_recovery_file_dest_size=2G
diagnostic_dest=''
dispatchers='(PROTOCOL=TCP) (SERVICE=ORCLXDB)'
open_cursors=300
remote_login_passwordfile='EXCLUSIVE'
undo_tablespace='UNDOTBS1'
# You may want to ensure that control files are created on separate physical
# devices
control_files = (ora_control1, ora_control2)
compatible ='12.0.0'

Note: The provided content may vary depending on the Oracle version. Whatever the name of undo_tablespace is, make sure to use exactly the same one while executing the CREATE DATABASE command. Another important point is to edit the directory locations according to your system. Do not forget to change the db_name value in the example above to your ORACLE_SID name ( testdb in our case). For convenience, store your initialization parameter file in the default location, using the default file name.

Create an instance

The next step is creating an instance. If it does not already exist, you will need to execute the ORADIM command from Command Prompt:

oradim -NEW -SID sid -STARTMODE MANUAL -PFILE file

Replace the sid placeholder with the SID parameter we set earlier. As to the file parameter, that would be the full path to the text initialization parameter file.

Connect to an instance

Once the instance has been created, you need to connect to it. In this article, we will be using SQL*Plus to do that.
If you do not have it installed on your personal computer, it is available for download on the [official Oracle website](https://www.oracle.com/database/technologies/instant-client/winx64-64-downloads.html). Having launched SQL*Plus, connect to your Oracle Database instance with the SYSDBA administrative privilege:

$ sqlplus /nolog
SQL> CONNECT / AS SYSDBA

SQL*Plus will return Connected to an idle instance. as the output.

Create the server parameter file (spfile)

Unlike the init.ora file, you cannot edit the server parameter file manually, as it is binary. Instead, the spfile can be generated from init.ora. In SQL*Plus, run the following command:

CREATE SPFILE FROM PFILE;

Run the idle instance

Before creating the database, we must start an instance for the testdb database using the STARTUP NOMOUNT command. As you may have guessed by now, this command will not connect to the database. Instead, it will simply start an empty ORACLE_SID instance named testdb.

SQL> STARTUP NOMOUNT;
ORACLE instance started.

Total System Global Area 1258291200 bytes
Fixed Size                  1261564 bytes
Variable Size             520093700 bytes
Database Buffers          721420288 bytes
Redo Buffers               15515648 bytes

How to create a database in Oracle, using the CREATE DATABASE command

Since we have already prepared a cozy place on the server, it's high time we let a new database inside. Using the CREATE DATABASE statement is one of the most popular ways to create a database among developers who work with Oracle.
The CREATE DATABASE statement will look something like this:

CREATE DATABASE testdb
   USER SYS IDENTIFIED BY sys_password
   USER SYSTEM IDENTIFIED BY system_password
   LOGFILE GROUP 1 ('/u01/logs/my/redo01a.log','/u02/logs/my/redo01b.log') SIZE 100M,
           GROUP 2 ('/u01/logs/my/redo02a.log','/u02/logs/my/redo02b.log') SIZE 100M,
           GROUP 3 ('/u01/logs/my/redo03a.log','/u02/logs/my/redo03b.log') SIZE 100M
   MAXLOGHISTORY 1
   MAXLOGFILES 16
   MAXLOGMEMBERS 3
   MAXDATAFILES 1024
   CHARACTER SET AL32UTF8
   NATIONAL CHARACTER SET AL16UTF16
   EXTENT MANAGEMENT LOCAL
   DATAFILE '/u01/app/oracle/oradata/mynewdb/system01.dbf'
     SIZE 700M REUSE AUTOEXTEND ON NEXT 10240K MAXSIZE UNLIMITED
   SYSAUX DATAFILE '/u01/app/oracle/oradata/mynewdb/sysaux01.dbf'
     SIZE 550M REUSE AUTOEXTEND ON NEXT 10240K MAXSIZE UNLIMITED
   DEFAULT TABLESPACE users
     DATAFILE '/u01/app/oracle/oradata/mynewdb/users01.dbf'
     SIZE 500M REUSE AUTOEXTEND ON MAXSIZE UNLIMITED
   DEFAULT TEMPORARY TABLESPACE tempts1
     TEMPFILE '/u01/app/oracle/oradata/mynewdb/temp01.dbf'
     SIZE 20M REUSE AUTOEXTEND ON NEXT 640K MAXSIZE UNLIMITED
   UNDO TABLESPACE undotbs1
     DATAFILE '/u01/app/oracle/oradata/mynewdb/undotbs01.dbf'
     SIZE 200M REUSE AUTOEXTEND ON NEXT 5120K MAXSIZE UNLIMITED
   USER_DATA TABLESPACE usertbs
     DATAFILE '/u01/app/oracle/oradata/mynewdb/usertbs01.dbf'
     SIZE 200M REUSE AUTOEXTEND ON MAXSIZE UNLIMITED;

This example creates a database named testdb using the Oracle command line. Such parameters as the global database name and CONTROL_FILES are taken from the previously created initialization parameter file. In the LOGFILE clause, we specified three redo log file groups, each with two members. The MAXLOGFILES, MAXLOGMEMBERS, and MAXLOGHISTORY parameters define limits for the redo log. MAXDATAFILES affects the initial sizing of the control file and limits the number of data files that can be open in the DB.
As to AL32UTF8 and AL16UTF16, these are the character sets that will be used to store data in the database we are about to create. Another important part of the CREATE DATABASE command in Oracle is the SYSTEM tablespace. It consists of the operating system file /u01/app/oracle/oradata/mynewdb/system01.dbf and is created as specified by the DATAFILE clause. The SYSTEM tablespace is created as locally managed. SYSAUX consists of the operating system file /u01/app/oracle/oradata/mynewdb/sysaux01.dbf (as specified in the SYSAUX DATAFILE clause). DEFAULT TABLESPACE creates and names a default permanent tablespace. DEFAULT TEMPORARY TABLESPACE creates and names a default temporary tablespace. If you specified the UNDO_MANAGEMENT parameter as AUTO in the initialization parameter file, UNDO TABLESPACE creates and names an undo tablespace that stores the undo data. USER_DATA TABLESPACE creates and names the tablespace for storing user data and database options, e.g. Oracle XML DB. Since the ARCHIVELOG clause is not specified in this CREATE DATABASE statement, online redo logs will not be archived. This part can be customized during database creation, and you can later use an ALTER DATABASE statement to switch to ARCHIVELOG mode. Having executed the script above, you will create the testdb database. Create an Oracle database using DBCA (Database Configuration Assistant) Now, let us describe how to create a container database in Oracle using DBCA. DBCA (Database Configuration Assistant) is a graphical tool that might be more convenient for those who are not yet familiar with the command-line syntax. It can be used to create and delete databases, add options to existing ones, manage templates, etc. Let us move on to the practical part of database creation in Oracle using DBCA. 1. First of all, open the Database Configuration Assistant on your computer. You will see a list of the operations you can perform. Select Create a database and click Next . 2.
In Creation mode , opt for Advanced configuration and proceed to the next step. 3. The wizard prompts you to choose the database deployment type. Select the General Purpose or Transaction Processing option. Click Next . 4. Among the different storage options, choose the first one: Use template for database storage attributes . Click Next . 5. Just to play it safe, configure the fast recovery option for your database. Specify the recovery files storage type, fast recovery area, and sizing. Hit Next . 6. At the Network Configuration step, you will need to create a new listener. Enter the name and the port number for it and proceed to the next step. 7. This is an optional step, and we are going to skip it in this tutorial. For more information, feel free to refer to the [official Oracle documentation](https://docs.oracle.com/en/database/oracle/oracle-database/12.2/rilin/configuring-oracle-database-vault-using-dbca.html) . 8. At the Configuration Options step, you will see five tabs: Memory , Sizing , Character sets , Connection modes , and Sample schemas . Opt for Automatic Shared Memory Management , specify the SGA and PGA sizes, and move on to the next tab. 9. On the Sizing tab, specify the maximum number of processes that can be connected to our database at the same time. 10. On the Character sets tab, choose AL32UTF8 . For the National character set, opt for AL16UTF16 . Once done, go to the next tab. 11. Select Dedicated mode on the Connection modes tab. Proceed to the next tab. 12. On the last tab of this step, select the Add sample schemas to the database checkbox and click Next . 13. Once the configuration options are set, you are going to configure the management options. Select the Configure Enterprise Manager (EM) database express checkbox. Also, specify the corresponding port. Once done, click Next . 14. Set the same administrative password for all accounts and enter the Oracle home user password. Proceed to the next step. 15.
Now, the wizard allows you to choose what exactly is to be done during and after the database creation. You can specify the scripts to run after the DB is created, save the new DB as a template, or generate a DB creation script so you don't have to go through the whole process of Oracle database creation using DBCA again. Tick the desired checkboxes and click Next . 16. Double-check the generated database creation summary and hit Finish . 17. Allow the wizard to finish the database creation process. You will be able to see what exactly is being done at the moment and what is up next. 18. As soon as the process is over, you will be notified accordingly. Keep in mind that all the database accounts except SYS and SYSTEM are initially locked. To unlock them, click Password Management. 19. Set the passwords for the database accounts and click OK . This is it! Your Oracle database is fresh from the oven and ready to serve its purpose. Generate CREATE DATABASE script from an existing Oracle database Creating an Oracle database might seem quite time-consuming and meticulous. To save you from going through this process over and over again, we decided to provide you with a way to generate a CREATE DATABASE script from an existing Oracle database. As mentioned earlier, you can save the corresponding script in DBCA during the DB creation. However, how do you generate the CREATE DATABASE script from a database that already exists? 1. In this article, we will be using Database Configuration Assistant to do that. To begin with, open DBCA on your computer. 2. Select Manage Templates and click Next . 3. Enter the name of the future template and choose where to store it. Proceed to the next step. 4. Now, click Create template from an existing database . From the dropdown menu, select the desired DB. Then, enter your user credentials. Hit Next once done. 5. At this step, decide whether you would like to keep the file locations as is or to convert them to use the OFA structure. 6.
Review the template creation summary and click Finish . 7. Allow some time for the template to be created. 8. Now, you have your database saved as a template for future use. Click Close . How to create a new database in Oracle SQL Developer Oracle SQL Developer is a free GUI that allows you to browse database objects, execute SQL statements and scripts, edit and debug PL/SQL statements, manipulate and export data, view and create reports, and the list goes on. It also integrates interfaces into several related technologies, including Oracle Data Miner, Oracle OLAP, Oracle TimesTen In-Memory Database, and SQL Developer Data Modeler (read-only). Let us demonstrate how to create a database in Oracle SQL Developer. 1. [Download](https://www.oracle.com/tools/downloads/sqldev-downloads.html) Oracle SQL Developer. 2. Extract the files from the downloaded .zip archive and run SQL Developer. 3. On opening the application, you will see the Start Page . Look for a green plus sign under Connections on the left. Click an arrow next to it and select New Database Connection . 4. The New/Select Database Connection window opens. First, enter the name of the future database. After that, fill in the corresponding boxes with the user credentials. If you are logging in as a sys user, make sure to choose SYSDBA or SYSOPER role from the dropdown. Enter the hostname, port, and SID. Click Test . If everything is okay, you will see the Status: Success message in the bottom left corner of the window. In case of any errors, there will be hints on how to fix them. Once you are ready, press Connect . 5. Now, you have created an Oracle database using SQL Developer. The new testdb database is located in the Oracle Connections list. Conclusion There are quite a lot of ways to create and manipulate Oracle databases out there. Moreover, Oracle requires knowledge, experience, and time to prepare and configure properly. 
We have just gone through some of the most popular and convenient ways to start a new Oracle DB: using the CREATE DATABASE statement in the command line, Database Configuration Assistant, and SQL Developer. Some of these tools require more experience than others, but we believe it is good to learn at least the basics about each one. This helps find the best solution for your business. Useful links If you are dealing with Oracle databases, these pages might be of use for you: [How to check invalid objects in Oracle](https://blog.devart.com/find-invalid-objects-in-your-databases.html) [How to rename a table in Oracle](https://blog.devart.com/rename-table-in-oracle.html) [Oracle SQL IDE](https://www.devart.com/dbforge/oracle/studio/) Tags [create database oracle](https://blog.devart.com/tag/create-database-oracle) [CREATE DATABASE statement](https://blog.devart.com/tag/create-database-statement) [dbca](https://blog.devart.com/tag/dbca) [Oracle](https://blog.devart.com/tag/oracle) [oracle database template](https://blog.devart.com/tag/oracle-database-template) [sql developer](https://blog.devart.com/tag/sql-developer) [dbForge Team](https://blog.devart.com/author/dbforge)
"} {"url": "https://blog.devart.com/how-to-create-many-to-many-relationships-between-tables.html", "product_name": "Unknown", "content_type": "Blog", "content": "[How To](https://blog.devart.com/category/how-to) [MySQL Tools](https://blog.devart.com/category/products/mysql-tools) How to Create a Many-to-Many Relationship By [dbForge Team](https://blog.devart.com/author/dbforge) May 6, 2020 Establishing a many-to-many relationship between the tables in a database is usually done to ensure efficient data processing and data integrity, as well as for database normalization and data analysis tasks. Since relational databases don't allow implementing a direct many-to-many relationship between two tables, handling that kind of relationship can be an intimidating task. In this article, we'll consider how to quickly and easily establish a many-to-many relationship between tables in a MySQL database. What is a many-to-many relationship Let us first understand what many-to-many relationships are all about. A many-to-many relationship happens when the records in one table are associated with the records in another one. A many-to-many relationship example in a database can be a table containing a list of books and a table with a list of authors.
Each book may have one or more than one author, and each author may have written one or more than one book. In this case, a row in one table has many related rows in a second table, and at the same time, the rows in the second table have related rows in the first one. Fig. 1 Many-to-many relationship example A many-to-many relationship between tables is accommodated in databases by means of junction tables. A junction table contains the primary key columns of the two tables you want to relate. Junction table When you need to establish a many-to-many relationship between two or more tables, the simplest way is to use a Junction Table. A Junction table in a database, also referred to as a Bridge table or Associative Table, bridges the tables together by referencing the primary keys of each data table. Fig. 2 A Junction Table example See also: [How to add, show, and drop MySQL foreign keys](https://blog.devart.com/mysql-foreign-key.html) How to create a many-to-many relationship in dbForge Studio for MySQL With dbForge Studio for MySQL, you can create a [many-to-many relationship](https://docs.devart.com/studio-for-mysql/designing-databases-with-database-designer/many-to-many-relationships-between-tables.html) between tables in a database quickly and easily. Below is a comprehensive algorithm that will help you cope with this task. 1. To start establishing a many-to-many relationship in MySQL, first, create a new [database diagram](https://www.devart.com/dbforge/mysql/studio/database-designer.html) or open an existing one. Fig. 3 Creating a database diagram 2. Add the tables you want to create a many-to-many relationship between. Fig. 4 Adding the tables to create a many-to-many relationship between 3. Create a third table: right-click the database diagram body and select New Table from the context menu that appears. This table will serve as a junction table. Fig. 5 Creating a junction table to establish a many-to-many relationship 4.
In the Table Editor dialog box, enter a name for the table. For example, the junction table between the Authors table and the Books table can be named Books_Authors . Fig. 6 Entering a name for the junction table 5. Copy the primary key columns from each of the other two tables to the junction table. You can add other columns to this table, just as to any other table. Fig. 7 Building the junction table 6. In the junction table, set the primary key to include all the primary key columns from the other two tables. Fig. 8 Creating new relations 7. Define a one-to-many relationship between each of the two primary tables and the junction table. Fig. 9 Creating a many-to-many relationship between the tables Design and edit database schemas visually with the help of [Database Designer for MySQL](https://www.devart.com/dbforge/mysql/studio/database-designer.html). Note: The creation of a junction table in a database diagram does not insert data from the related tables into the junction table. You can copy rows from one table to another, or within a table, using an INSERT INTO ... SELECT query:

INSERT INTO tbl_temp2 (fld_id)
  SELECT tbl_temp1.fld_order_id
  FROM tbl_temp1 WHERE tbl_temp1.fld_order_id > 100;

As we delve into the MySQL database in this blog post, we've also included an essential tutorial on [how to show tables](https://www.devart.com/dbforge/mysql/studio/show-tables-list-in-mysql.html), broadening your understanding and control over your database environment. Conclusion There are three different types of data relationships in a database: one-to-one, one-to-many, and many-to-many. And while you can handle the first two quite simply, coping with a many-to-many relationship can be a daunting task. In this article, we provided a step-by-step tutorial showing how to easily establish a many-to-many relationship model with the help of our superior tool, dbForge Studio for MySQL.
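The junction-table design walked through above can also be expressed in plain SQL. A minimal sketch using the article's Books/Authors example (the column names are illustrative, not taken from the article):

```sql
CREATE TABLE Books (
  BookID INT PRIMARY KEY,
  Title VARCHAR(255) NOT NULL
);

CREATE TABLE Authors (
  AuthorID INT PRIMARY KEY,
  Name VARCHAR(255) NOT NULL
);

-- The junction table: its composite primary key combines the primary
-- keys of both related tables, and each column is also a foreign key
-- back to its source table (steps 5-7 above).
CREATE TABLE Books_Authors (
  BookID INT NOT NULL,
  AuthorID INT NOT NULL,
  PRIMARY KEY (BookID, AuthorID),
  FOREIGN KEY (BookID) REFERENCES Books (BookID),
  FOREIGN KEY (AuthorID) REFERENCES Authors (AuthorID)
);
```

Each (BookID, AuthorID) pair can appear only once, while any single book or author can appear in many pairs, which is exactly the many-to-many relationship.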
Download a free 30-day trial of dbForge Studio for MySQL right now and try the database diagram functionality along with many other mighty features of the best MySQL GUI tool you can find! Also, you can watch this video tutorial.

How to: Create MySQL Database in One Shot By [dbForge Team](https://blog.devart.com/author/dbforge) July 19, 2010

This article gives step-by-step instructions for visual database creation. There are many tools that allow database developers to avoid the monotonous and error-prone manual writing of table creation scripts. But to have a complete picture while creating a database, it is necessary not only to speed up the creation of database objects but also to visualize the relations between them. [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) offers a perfect tool for this purpose – Database Designer. Let’s create a sample database visually.

Task: Create a database that would contain information about the salary of employees in different company departments.

Database Sketch

Solution: Let’s start with [database normalization](https://en.wikipedia.org/wiki/Database_normalization). As a result, we will get three tables:

Database Sketch after Normalization

Now it’s time to use the visual tool for designing databases – [MySQL Database Designer](https://www.devart.com/dbforge/mysql/studio/database-designer.html). But first, let’s set up a connection to the MySQL server:

Creating a new connection

and create an empty salary_db database:

Creating a new database

Now let’s create an empty database diagram file:

Creating a new database diagram file

Press the New Table button on the toolbar:

Creating a new table

Let’s specify the table name and database in the visual table editor that appears. After this, let’s create a column:

Creating a new column

and fill in its parameters:

Entering column properties

After this, let’s create all the remaining columns in the same way, and once we’ve finished, let’s click OK to save our first table.
Let’s create the remaining EMP and DEPT tables. Tables can be created in any order – it has no effect on the result.

Now let’s ensure that our database has referential integrity. To do this, let’s press the New Relation button on the toolbar. After performing this action, the mouse pointer changes. Let’s click the DeptNo column of the EMP table and drag the relation to the DeptNo column of the DEPT table. In the Foreign Key Properties dialog that appears, press OK.

Creating a new relation

As a result, there is now a relation between the tables. Let’s add a relation between the SAL and EMP tables. To do this, let’s drag a relation from the EmpNo column of the SAL table to the column with the same name in the EMP table. Note that it is not necessary to press the New Relation button once more.

Now let’s arrange the tables of the created database for better readability. To do this, call the popup menu of the database diagram and select the Layout Diagram option. And here is the result:

Database Diagram

Now let’s generate and save the script of the created database for further usage. To do this, select all tables and click Generate Schema Script in the popup menu.

Generate Schema Script

And here is the result:

Database Script

Conclusion

In the Database Designer tool, you see the database you are creating in the form of a database diagram. This approach allows you to create databases of any complexity visually and save the script of the created database for future use.
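The generated schema script for a diagram like this might look roughly as follows (the column names and types are assumed for illustration; the actual script reflects whatever columns you defined, and only DeptNo/EmpNo and the table names come from the walkthrough):

```sql
CREATE TABLE DEPT (
  DeptNo INT NOT NULL PRIMARY KEY,
  DeptName VARCHAR(50) NOT NULL
);

CREATE TABLE EMP (
  EmpNo INT NOT NULL PRIMARY KEY,
  EmpName VARCHAR(50) NOT NULL,
  DeptNo INT NOT NULL,
  -- The relation dragged from EMP.DeptNo to DEPT.DeptNo
  CONSTRAINT FK_EMP_DEPT FOREIGN KEY (DeptNo) REFERENCES DEPT (DeptNo)
);

CREATE TABLE SAL (
  EmpNo INT NOT NULL,
  SalDate DATE NOT NULL,
  Amount DECIMAL(10, 2) NOT NULL,
  PRIMARY KEY (EmpNo, SalDate),
  -- The relation dragged from SAL.EmpNo to EMP.EmpNo
  CONSTRAINT FK_SAL_EMP FOREIGN KEY (EmpNo) REFERENCES EMP (EmpNo)
);
```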
Watch Tutorial

We also welcome you to take a look at our video tutorial for step-by-step instructions on creating a MySQL database with the help of dbForge Studio for MySQL.

2 COMMENTS

Trevor, July 25, 2010, at 9:59 pm:

This seems crazy to me. As a database developer, I could type out the SQL code above in a few minutes.
Using this application, though, I would be clicking and selecting options for at least half an hour. Even if I did type some errors, it would take seconds to fix them, compared to the minutes wasted going through dialogs, menus, and drop-downs. I wouldn’t even choose to use the tools in SQL Server’s SSMS or MS Access, which are still much faster to use than all these dialogs.

.jp, July 26, 2010, at 2:36 pm:

Yes, you may be right, on the one hand, that for a small database it’s more convenient to type scripts manually instead of performing a lot of clicks in visual editors. But on the other hand, when you have to think over the design of a future database or redesign an existing one, from our point of view it’s more convenient to see the created (or modified) database structure as a database diagram with the database relations displayed on it, and to perform the changes in the database structure on this diagram.

Create a New Oracle User and Grant Privileges: Syntax and Examples By [dbForge Team](https://blog.devart.com/author/dbforge) February 17, 2022

In this article, we will talk about how to create a user in [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/). You will learn how to add new database users and figure out which supplemental aspects this job involves: from the initial user creation to dropping it. Moreover, you will find some useful tips on working with the IDENTIFIED and TABLESPACE clauses, as well as learn how to GRANT roles and permissions in Oracle.
Contents

- How to Create a New User in Oracle
- Oracle CREATE USER Syntax Examples
  - How to Create Default Users with Default Settings
  - Create User Identified by Clauses
    - Create User Identified by Password Clause
    - Externally and Globally Clauses
  - CREATE USER with Tablespace Clause
    - Default Tablespace
    - Temporary Tablespace
    - Quota
  - Create User Attributes
    - Profile
    - Password Expire
    - Account Lock/Account Unlock
- Grant Role to User
- Granting Permission in Oracle
  - GRANT Command Syntax
  - Oracle User Privileges
  - How to Create and Grant All Privileges to Oracle User
  - How to Grant Table Privilege to User in Oracle
- Create Oracle Users and Grant Permissions Easily with dbForge Studio for Oracle
- How to Delete (Drop) User in Oracle
- Conclusion

How to Create a New User in Oracle

Before we start, you need to check whether you have the necessary system privilege to create users. If not, make sure to get it assigned to your account. After that, you can proceed to the practical tasks. The examples in this article relate to Oracle 19c, but the methods are the same for all Oracle versions in use (including Oracle 10g, 11g, 12c, etc.).

Oracle CREATE USER Syntax Examples

For starters, we will be looking into the Oracle CREATE USER syntax. First, we will discuss how to create a user with default settings. After that, we will move on to the different variations of the IDENTIFIED clause, the tablespace clause, and other peculiarities of the CREATE USER syntax in Oracle.

How to Create Default Users with Default Settings

It is always best to start with the basics. Thus, let us focus on the CREATE USER command by itself. As is, it will create a user with default attributes. Further in this article, we will look at how to configure users more finely and how this boosts the safety of the database in general.

Create User Identified by Clauses

The IDENTIFIED clause lets you indicate how the Oracle database authenticates a user. Let us take a closer look at different examples of the IDENTIFIED syntax in Oracle.
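As a minimal illustration of the default-settings case described above (the username and password below are assumed for illustration):

```sql
-- Creates a user with the default profile and default tablespaces;
-- the new user has no privileges yet, not even CREATE SESSION.
CREATE USER demo_user IDENTIFIED BY demo_password;
```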
Create User Identified by Password Clause

In the most straightforward case, we are creating a new local user identified by a password. The user will be required to enter the password to log into the system:

CREATE USER username IDENTIFIED BY password;

The username can be anything. However, the password must consist of single-byte characters from the database character set. Even if the character set also has multibyte characters, the password requirement does not change – use only single-byte characters.

CREATE USER visitor
IDENTIFIED BY psw4visits;

Externally and Globally Clauses

Besides identifying by password, you may use one of two other means of user authentication: configuring an external user or a global user. To do it, you need to include the EXTERNALLY or GLOBALLY clause in the Oracle CREATE USER command.

EXTERNALLY allows for creating an external user. In this case, the user is authenticated by an external system, such as the operating system. For instance, an Oracle database user is a Windows user. Thus, they can access the database after getting authenticated by Windows, without entering another password. Working under an external user is a standard option for regular database users. But such users only have standard roles (CONNECT and RESOURCE), without administrator or database operator privileges.

To create an external user, we execute the below statement:

CREATE USER external_user1
IDENTIFIED EXTERNALLY
DEFAULT TABLESPACE tbs_new_10
QUOTA 10M ON tbs_new_10
PROFILE external_user_profile1;

This way, we have made a new external user for our database. The name is external_user1. No additional password is needed. We assigned this user the default tablespace tbs_new_10 with a quota of 10 MB. Other limitations are defined by the external_user_profile1 profile applied to this user. As we mentioned earlier, different external systems can maintain and manage external users in the Oracle database. Using the capabilities of the operating system is the most common option.
Thus, if we want to create an external database user accessible by the system account in the operating system, we only need to modify our statement slightly. We’ll add the ops$ prefix to the username:

CREATE USER ops$external_user1
IDENTIFIED EXTERNALLY
DEFAULT TABLESPACE tbs_new_10
QUOTA 10M ON tbs_new_10
PROFILE external_user_profile1;

GLOBALLY allows for creating global users. It means that their logins and passwords are stored on the central Oracle security server instead of a specific database. Besides, roles assigned to global users on that central server apply to those users in any database, so it won’t be necessary to configure the user’s role in each separate database. Note that you need to enable the single sign-on option for global users.

To create a global database user, we use the following statement:

CREATE USER global_user1
IDENTIFIED GLOBALLY AS 'CN=manager, OU=division, O=oracle, C=US'
DEFAULT TABLESPACE USERS
QUOTA 10M ON USERS;

Now we have a new global database user under the name of global_user1. We assigned this user the USERS default tablespace with a quota of 10M.

CREATE USER with Tablespace Clause

Now, let us review the basic Oracle create new user script:

CREATE USER username
IDENTIFIED BY password
DEFAULT TABLESPACE tablespace
TEMPORARY TABLESPACE tbs_temp_01
QUOTA {size | UNLIMITED} ON tablespace;

As you see, the script includes several clauses that we should take into consideration:

Default Tablespace

This clause specifies the default tablespace for objects created by the user. Without it, such objects are stored in the default tablespace of the database. If no default tablespace is specified for this particular database, the objects go into the system tablespace.

Restriction: don’t specify a locally managed temporary tablespace (such as an undo tablespace or a dictionary-managed temporary tablespace) as the user’s default tablespace.
Temporary Tablespace

This clause specifies the tablespace or tablespace group meant to contain the temporary segments of the user. Without it, the user’s temporary segments are stored in the default temporary tablespace of the database or in the system tablespace. When you specify a tablespace group (by including the tablespace_group_name value in the script), the user’s temporary segments can be saved in any tablespace of that group.

Note: Make sure to specify a temporary tablespace with a standard block size. It cannot be the undo tablespace or a tablespace with automatic segment-space management.

Quota

This clause specifies how much space the user can allocate in the tablespace. Multiple QUOTA clauses can be present in one Oracle CREATE USER command if you need to specify several tablespaces. The clause can include the UNLIMITED keyword to allow the user to allocate space in the tablespace without bounds.

Restriction: the QUOTA clause does not apply to temporary tablespaces.

Create User Attributes

There are additional, optional Oracle CREATE USER attributes you can include in the syntax. Have a look at the following example:

CREATE USER username
IDENTIFIED BY password
[DEFAULT TABLESPACE tablespace]
[QUOTA {size | UNLIMITED} ON tablespace]
[PROFILE profile]
[PASSWORD EXPIRE]
[ACCOUNT {LOCK | UNLOCK}];

Let us review these optional clauses.

Profile

This optional clause lets you limit the database resources available to this specific user, with the limitations defined in the particular profile. Without this clause, a new user automatically comes under the default profile.

Password Expire

The clause is optional, but many database administrators set it for more effective security. If included, this clause forces the user to change the password, usually when the user tries to log into the database for the first time.

Account Lock/Account Unlock

You may use one of these clauses.
With LOCK applied, Oracle creates the user account, but that account won’t have access to the database. If you apply the UNLOCK clause or don’t specify either of these two clauses, the account will be usable at once. The unlocked status is the default.

The CREATE USER statement with these additional parameters would be as follows:

CREATE USER visitor
IDENTIFIED BY migzw23ter
DEFAULT TABLESPACE tbs_new_10
QUOTA 50M ON tbs_new_10
TEMPORARY TABLESPACE tbs_temp_10
QUOTA 5M ON system
PROFILE qualified_user
PASSWORD EXPIRE
ACCOUNT UNLOCK;

Here, the statement creates a new Oracle database user named visitor, with the password migzw23ter. This user is assigned the default tablespace tbs_new_10 with a quota of 50 MB. This user is also allowed to use the temporary tablespace tbs_temp_10.

Grant Role to User

The first step is the creation of a user. The next one is to set the user’s rights. A newly created user is not allowed to do anything, not even connect to the database. Working with Oracle databases inevitably includes the task of creating database users. There are the system user accounts that Oracle creates itself – hr, OE, sys, etc. These accounts have predefined configurations with rights and limitations. However, daily work will always require other users. One of the DBA’s duties is to create additional database users. The job includes configuring the user accounts, setting privileges, and managing users according to the business goals.

Granting Permission in Oracle

By using the GRANT command, you can provide users with certain privileges and configure their roles according to your needs. In Oracle, you can grant your permissions to others so that they can manipulate and manage the data in your database. GRANT is a very powerful statement with many possible options, but the core functionality is to manage the privileges of both users and roles throughout the database.
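The role-based workflow mentioned in the Grant Role to User section can be sketched as follows (the role name app_reader and the specific grants are assumed for illustration; only the user visitor comes from the article's examples):

```sql
-- Bundle privileges into a role, then grant the role to a user.
CREATE ROLE app_reader;
GRANT CREATE SESSION TO app_reader;
GRANT SELECT ANY TABLE TO app_reader;
GRANT app_reader TO visitor;
```

Granting through a role means that when the privileges of app_reader change later, every user holding the role picks up the change without any per-user GRANT statements.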
GRANT Command Syntax

The basic syntax of the query to grant certain privileges to a user is the following:

GRANT privilege TO user;

Oracle User Privileges

The GRANT command can give users privileges to create, alter, drop, and manage database objects. For instance, the privileges to create tablespaces and to delete the rows of any table in a database are system privileges. Oracle has more than 100 system privileges that can be found in the SYSTEM_PRIVILEGE_MAP table.

| Object | System privileges |
| --- | --- |
| CLUSTER | CREATE/CREATE ANY/ALTER ANY/DROP ANY CLUSTER |
| DATABASE | ALTER DATABASE, ALTER SYSTEM, AUDIT SYSTEM |
| INDEX | CREATE ANY/ALTER ANY/DROP ANY INDEX |
| PROFILE | CREATE/ALTER/DROP PROFILE |
| ROLE | CREATE/ALTER ANY/DROP ANY/GRANT ANY (allows REVOKE) |
| ROLLBACK SEGMENT | CREATE/ALTER/DROP ROLLBACK SEGMENT |
| USER | CREATE/ALTER/BECOME/DROP USER |
| VIEW | CREATE/CREATE ANY/DROP ANY VIEW |
| SYNONYM | CREATE/CREATE ANY/CREATE PUBLIC/DROP ANY/DROP PUBLIC SYNONYM |
| SESSION | CREATE/ALTER/RESTRICTED SESSION, ALTER RESOURCE COST |
| TABLE | CREATE/CREATE ANY/ALTER ANY/DROP ANY/SELECT ANY/INSERT ANY/UPDATE ANY/DELETE ANY/LOCK ANY TABLE |
| TABLESPACE | CREATE/ALTER/DROP/MANAGE TABLESPACE |

Usually, the administrator of a database grants privileges to the users. However, there are cases when the administrator needs to transfer the ability to grant Oracle user privileges further. This is where the admin option comes in. If a DBA needs to let another person grant a system privilege, it has to be done with the admin option:

GRANT create session TO user;
GRANT create session TO user WITH ADMIN OPTION;
REVOKE create session FROM user;

Besides the Oracle system privileges, there are object privileges, which are granted upon database objects: tables, views, procedures, and so on.

How to Create and Grant All Privileges to Oracle User

First, we need to grant our user the system privilege to log into the database. We use the following statement for that:

GRANT CREATE SESSION TO visitor;

There are many permissions the database administrator can provide to the user.
But it is essential to stick to the primary concept of security, which is to give users the minimum of privileges necessary to do the job efficiently. That’s why it is not recommended to grant all privileges to a user. You can apply other privileges one by one, each in a separate statement. Or, it is possible to combine these permissions into one statement, as shown below:

GRANT CREATE VIEW, CREATE PROCEDURE, CREATE SEQUENCE, CREATE TRIGGER TO visitor;

If this particular user is allowed to change tables, procedures, triggers, etc., the syntax to set the necessary privilege for each case is below. Again, be very careful when allowing the user to change any elements, as this permission is global.

GRANT ALTER ANY TABLE TO visitor;
GRANT ALTER ANY PROCEDURE TO visitor;
GRANT ALTER ANY TRIGGER TO visitor;

To allow the user to delete elements, we use the below statements:

GRANT DELETE ANY TABLE TO visitor;
GRANT DROP ANY PROCEDURE TO visitor;
GRANT DROP ANY TRIGGER TO visitor;
GRANT DROP ANY VIEW TO visitor;

How to Grant Table Privilege to User in Oracle

Before you set privileges for a particular user, you should consider which tasks that person must perform in the database. The most common scenarios include creating tables, views, procedures, and triggers. Some cases require the ability to change or delete those elements. Depending on the situation, the administrator defines which system privileges to provide. Let us take a closer look at how to grant the CREATE TABLE privilege to a user in Oracle. If we are willing to allow our user – visitor – to create tables in the database, we will use the following query:

GRANT CREATE TABLE TO visitor;

Create Oracle Users and Grant Permissions Easily with dbForge Studio for Oracle

If you are working with Oracle databases on a daily basis and looking for a convenient all-in-one powerful IDE, your search ends here.
[dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) helps developers speed up PL/SQL coding and provides versatile data editing tools for managing in-database and external data. In this article, we will be creating an Oracle user and granting DBA privileges using this multi-purpose script:

DECLARE
  schema_name VARCHAR2(255) := 'username1'; -- Insert your username instead of 'username1'
  row_count NUMBER;
BEGIN
  FOR r IN (SELECT sid, serial# FROM v$session WHERE username = schema_name)
  LOOP
    EXECUTE IMMEDIATE 'ALTER SYSTEM DISCONNECT SESSION ''' || r.sid || ',' || r.serial# || '''' || ' IMMEDIATE';
    EXECUTE IMMEDIATE 'ALTER SYSTEM KILL SESSION ''' || r.sid || ',' || r.serial# || '''';
  END LOOP;

  SELECT count(*) INTO row_count FROM dba_users WHERE username = schema_name;
  IF row_count > 0 THEN
    EXECUTE IMMEDIATE 'DROP USER ' || schema_name || ' CASCADE';
  END IF;
  EXECUTE IMMEDIATE 'CREATE USER ' || schema_name || ' IDENTIFIED BY ' || schema_name;
  EXECUTE IMMEDIATE 'GRANT dba TO ' || schema_name;
  EXECUTE IMMEDIATE 'ALTER SESSION SET CURRENT_SCHEMA = ' || schema_name;
END;
/

What we are doing in the provided script is:

- creating a new user
- granting DBA privileges to the newly created user
- setting the newly created user as the default one for the current session

In Oracle, users and schemas are essentially the same thing. You can consider that a user is an account to connect to a database, and a schema is the set of objects that belong to that account. The newly created schema is empty and, therefore, will not be displayed in Database Explorer. Let us create a departments table to make the new user appear in the list. To do that, feel free to use the following script:

CREATE TABLE departments (
  department_id NUMBER CONSTRAINT PK_DepID PRIMARY KEY,
  department_name varchar2(255) NOT NULL,
  location_id NUMBER NOT NULL);

Keep in mind that you do not have to open a different SQL document for this operation.
dbForge Studio for Oracle allows you to execute the code only partially. Simply select the CREATE TABLE statement, right-click it, and choose Execute Selection. If you have the AutoCommit feature on, the previous step will be the last one. However, if it is off, you will need to commit the changes. Then, on refreshing Database Explorer, you will see username1 in the list.

There are instances when old sessions remain running on the server. They might interfere with the execution of new commands. In the code above, all the old sessions are automatically located and removed from the server. Should you need to restart your current session, you can disconnect or kill it using one of these queries:

ALTER SYSTEM DISCONNECT SESSION 'sid,serial#' IMMEDIATE;
ALTER SYSTEM KILL SESSION 'sid,serial#';

How to Delete (Drop) User in Oracle

In case you need to remove a user for any reason, use the DROP USER command with the following syntax:

DROP USER username;

In our test case, we are removing the user visitor created earlier:

DROP USER visitor;

However, there are several restrictions that you need to pay attention to before dropping the user:

- You can’t remove a user without deleting all the related objects. Thus, you must drop all tables, views, procedures, etc. that this user created before proceeding to the DROP command.
- You can’t remove a user that is connected to the database. First, you have to clear up all sessions that the user had. After that, you can drop the user itself.

There is a special command that allows for dropping the user together with all its database objects in one shot:

DROP USER username CASCADE;

Conclusion

Summing up, now you can use the Oracle SQL CREATE USER command to add new users, configure, and manage them. The examples above show how to perform these tasks manually. However, the capabilities of modern software solutions for Oracle databases can simplify this job and make it faster and more accurate.
Feel free to give dbForge Studio a try with a [free 30-day trial version](https://www.devart.com/dbforge/oracle/studio/download.html).

Useful Links

- [Setting Up a New User Account in Oracle](https://docs.devart.com/studio-for-oracle/managing-users-and-privileges/setting-up-new-user-account.html)
- [Oracle Database Administration Tools](https://www.devart.com/dbforge/oracle/studio/oracle-database-administration.html)
- [Create a Database in Oracle](https://blog.devart.com/how-to-create-database-in-oracle.html)
- [Oracle Rename Table](https://blog.devart.com/rename-table-in-oracle.html)
- [Oracle Alter Table](https://blog.devart.com/oracle-alter-table-statement.html)

Oracle CREATE TABLE Command in PL/SQL with 10 Examples By [dbForge Team](https://blog.devart.com/author/dbforge) February 23, 2022

In this article, we are going to talk about the CREATE TABLE command. To be more precise, we will focus on how to create a table in Oracle with a primary and foreign key, as well as NOT NULL and date columns, take a close look at how to create a new table on the basis of an existing one, and more. Also, we will review the perks and benefits of using [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) when working with tables.

Contents

- Oracle CREATE TABLE statement syntax
- CREATE TABLE example
- CREATE TABLE with PRIMARY KEY constraint
- CREATE TABLE with FOREIGN KEY
- CREATE TABLE with NOT NULL column
- CREATE TABLE with a date column
- Oracle alternative to SQL CREATE TABLE IF NOT EXISTS
- Create a new table from another table using CREATE TABLE AS SELECT
- CREATE TABLE from SELECT in PL/SQL
- How to create and insert data into a temporary table
- CREATE TABLE in a wink using dbForge for Oracle
- Conclusion

[Creating tables in Oracle](https://docs.devart.com/studio-for-oracle/working-with-schema-objects/creating-editing-tables.html) is one of the ways to arrange data, so it’s critical to learn as much as possible about the CREATE TABLE command.
There are several different ways to create a table in Oracle, and we’ll cover them further in this guide. Please note that you must have the CREATE TABLE system privilege to create a new table. If you wish to create a table in another user’s schema, you must have the CREATE ANY TABLE system privilege. If you are the owner of the table, you must have the UNLIMITED TABLESPACE system privilege or a quota for the tablespace that contains the table. Additionally, to create an object table, or a relational table with an object type column, you must have the EXECUTE object privilege on all the types referenced by the table, or the EXECUTE ANY TYPE system privilege.

The examples in this article relate to Oracle 19c, but the methods are the same for all Oracle versions in use (including Oracle 10g, 11g, 12c, etc.).

Oracle CREATE TABLE statement syntax

Let us begin with the basics. To create a new table in an [Oracle database](https://www.devart.com/dbforge/oracle/all-about-oracle-database/), the CREATE TABLE statement can be used. The CREATE TABLE syntax in Oracle is as follows:

CREATE TABLE schema_name.table_name (
  column_1 data_type column_constraint,
  column_2 data_type column_constraint,
  ...
  table_constraint
);

Let us take a closer look at the provided syntax:

- schema_name.table_name: names of the table and the schema the new table belongs to.
- column_1, column_2: placeholders for the column names.
- data_type: NUMBER, VARCHAR, etc.
- column_constraint: NOT NULL, primary key, check, etc.
- table_constraint: table constraints (primary key, foreign key, check).

You will find an Oracle CREATE TABLE syntax example further in this article.

Note: In addition to learning how to create a table in Oracle, you might also wish to deepen your knowledge about the [Oracle ALTER TABLE statement](https://blog.devart.com/oracle-alter-table-statement.html).
CREATE TABLE example

It is always better to learn from practice, so let's look at an Oracle CREATE TABLE example. The following statement creates a three-column table named employees:

```sql
CREATE TABLE employees
( employee_id   number(10)   NOT NULL,
  employee_name varchar2(50) NOT NULL,
  city          varchar2(50)
);
```

Column 1 is named employee_id; it has the number datatype (maximum 10 digits in length) and cannot contain NULL values. Column 2 is named employee_name; it has the varchar2 datatype (maximum 50 characters in length) and cannot contain NULL values either. Column 3 is named city and has the varchar2 datatype. Unlike the previous two, this column can contain NULL values.

CREATE TABLE with PRIMARY KEY constraint

Now, let's see how to create a table in Oracle with a primary key. To define a primary key, we can take the previously created employees table, edit the CREATE TABLE statement, and define employee_id as the primary key:

```sql
CREATE TABLE employees
( employee_id   number(10)   NOT NULL,
  employee_name varchar2(50) NOT NULL,
  city          varchar2(50),
  CONSTRAINT employees_pk PRIMARY KEY (employee_id)
);
```

The PRIMARY KEY clause designates a column (or a set of columns) that uniquely identifies each row in the table. A table can include only one primary key, and every column of the primary key must contain non-NULL values. In the example above, we define the employee_id column as the primary key column.

CREATE TABLE with FOREIGN KEY

Let's take a look at the syntax of the Oracle CREATE TABLE statement with a FOREIGN KEY.
A foreign key can be defined either at the column level or at the table level. The generic table-level form is:

```sql
CREATE TABLE table_name
(
  col1  datatype [ NULL | NOT NULL ],
  col2  datatype [ NULL | NOT NULL ],
  ...
  col_n datatype [ NULL | NOT NULL ],
  CONSTRAINT fk_name FOREIGN KEY (col1, col2)
    REFERENCES parent_table (col1, col2)
);
```

Here is a worked example: a parent table dept and a child table emp whose dept_id column references it:

```sql
CREATE TABLE dept
( dept_id   number(10)   NOT NULL,
  dept_name varchar2(50) NOT NULL,
  CONSTRAINT dept_pk PRIMARY KEY (dept_id)
);

CREATE TABLE emp
( emp_no   number(10)   NOT NULL,
  emp_name varchar2(50) NOT NULL,
  dept_id  number(10),
  salary   number(6),
  CONSTRAINT emp_pk PRIMARY KEY (emp_no),
  CONSTRAINT dept_fk FOREIGN KEY (dept_id)
    REFERENCES dept (dept_id)
);
```

In this example: Column 1 is named emp_no; it is a number column and cannot contain NULL values. Column 2 is named emp_name; it is defined as varchar2(50) and cannot contain NULL values. Column 3 is named dept_id and has the number datatype. Column 4 is named salary and is also a number column. The table-level primary key constraint emp_pk is defined on the key (emp_no), and the table-level foreign key constraint dept_fk references the dept_id column of the dept table.

CREATE TABLE with NOT NULL column

If you would like to specify that a column cannot be empty and must always contain a value, define it as NOT NULL. Find the syntax for the CREATE TABLE command with a NOT NULL column below:

```sql
CREATE TABLE employees_bdays
(
  emp_name VARCHAR2(30),
  bday     DATE,
  emp_id   VARCHAR2(15) NOT NULL
);
```

As you can see, emp_id requires a value in every row of data, as it is NOT NULL.
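To see the NOT NULL constraint in action, you can try the following illustrative inserts against the table above (the sample values are made up for the demonstration); the second statement is expected to fail:

```sql
-- Succeeds: emp_id is supplied
INSERT INTO employees_bdays (emp_name, bday, emp_id)
VALUES ('Alice', DATE '1990-03-15', 'E100');

-- Fails with ORA-01400 (cannot insert NULL),
-- because emp_id is a NOT NULL column and is omitted here
INSERT INTO employees_bdays (emp_name, bday)
VALUES ('Bob', DATE '1991-07-01');
```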
CREATE TABLE with a date column

If you need to create a table with a date column in Oracle, the following syntax might come in handy:

```sql
CREATE TABLE employees_bdays (
  bday DATE
);
```

To insert a date into the column, use the TO_DATE function, which accepts a character string containing the date, as well as another character string instructing it on how to interpret the date received:

```sql
to_date('01 December 2021', 'DD Month YYYY')
to_date('01/12/2021', 'DD/MM/YYYY')
```

Oracle alternative to SQL CREATE TABLE IF NOT EXISTS

Users accustomed to the very helpful SQL CREATE TABLE IF NOT EXISTS command might be disappointed: there is no such statement in Oracle. If you still need to determine whether a table already exists before creating it, there are workarounds. For instance, receiving the error ORA-00955 (name is already used by an existing object) while trying to create a table indicates that such a table is already there.

Create a new table from another table using CREATE TABLE AS SELECT

Sometimes, you need to [copy table data](https://www.devart.com/dbforge/oracle/studio/oracle-copy-table.html) and insert it into another table. It is quite easy to create a table like another table in Oracle, and it's very helpful: you save time and effort when you want a table with the same structure, and it also simplifies testing. To create a table from another table in Oracle, use the following script:

```sql
CREATE TABLE table_name AS (
  SELECT select_query
);
```

This is also referred to as CREATE TABLE AS SELECT (CTAS). Enter the table_name for your new table and use the SELECT query to define which data to copy; SELECT * FROM old_table copies all the data to the new table. If you need to restrict the rows to be copied, a WHERE clause will be helpful.
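One common workaround for the missing IF NOT EXISTS, shown here as an illustrative sketch rather than the article's own code, is to issue the CREATE TABLE through dynamic SQL in a PL/SQL block and swallow the ORA-00955 error:

```sql
BEGIN
  EXECUTE IMMEDIATE 'CREATE TABLE employees_bdays (
                       emp_name VARCHAR2(30),
                       bday     DATE)';
EXCEPTION
  WHEN OTHERS THEN
    -- -955 corresponds to ORA-00955: name is already used
    -- by an existing object; any other error is re-raised
    IF SQLCODE != -955 THEN
      RAISE;
    END IF;
END;
/
```

The block succeeds whether or not the table already exists, which is the behavior CREATE TABLE IF NOT EXISTS provides in other databases.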
CREATE TABLE from SELECT in PL/SQL

In Oracle, you can also create one table from another by appending a SELECT statement to the end of the CREATE TABLE statement. In this case, all the records from the old table are copied to the new one:

```sql
CREATE TABLE new_table
  AS (SELECT * FROM old_table);
```

As you can see, table creation from a SELECT is very convenient. Note that you can also create a temporary table from a SELECT in the same way.

How to create and insert data into a temporary table

Temporary tables are used in Oracle to store data that belongs to one session or one transaction; the table definition itself is a regular DDL object, and only its data is temporary. So how do you create a temporary table in Oracle? Use the CREATE GLOBAL TEMPORARY TABLE statement. The ON COMMIT clause defines whether the data in the table is transaction-specific (the default) or session-specific. The syntax for transaction-specific data is as follows:

```sql
CREATE GLOBAL TEMPORARY TABLE admin_work_area
  (startdate DATE,
   enddate   DATE,
   operation CHAR(20))
  ON COMMIT DELETE ROWS;
```

CREATE TABLE in a wink using dbForge Studio for Oracle

Time has always been one of the most valuable resources, so an appropriate IDE can help you save it. One of the best solutions is [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/), a universal tool for developing, administering, and managing Oracle databases. This efficient GUI client suits DBAs, database developers, software engineers, and analysts alike; it supports data synchronization between several Oracle servers and helps automate database development processes.

On opening the IDE for the first time, you will see the Database Connection Properties window. To open it manually, choose Database and click New Connection.
After that, fill in the corresponding fields and hit Test Connection. If everything is configured correctly, you will see a Successfully connected message similar to this: Having connected to your Oracle server, choose the user you wish to create a new table for. Our recent [blog post on how to create a new user in Oracle](https://blog.devart.com/how-to-create-oracle-user.html) might come in handy at this step. The first step towards creating a new table is right-clicking the required schema. Point to New Object and click the first option: Table.

1. In the Name text box of the Table Editor, enter a table name. In the example below, we are creating a table titled employees.
2. In the grid below, type in the names of the future columns, choose their data types, and specify whether they should be NOT NULL.
3. The column properties are displayed on the right-hand side of the Table Editor.
4. All the actions you perform in the Table Editor are reflected in the SQL query at the bottom of the window.

Once the properties of the table are configured, click Apply Changes, and that's it!

Conclusion

Creating tables is one of the most common tasks when working with Oracle databases, as it helps organize data. In this guide, we have offered a detailed walkthrough of how to create a table in Oracle in 10 different ways. You can write the Oracle SQL CREATE TABLE statement and execute the query manually, or use an appropriate IDE to automate the process. Here, [dbForge Studio for Oracle](https://www.devart.com/dbforge/oracle/studio/) is an excellent choice. The tool can also greatly increase your productivity because it has everything required for [Oracle PL/SQL performance tuning](https://www.devart.com/dbforge/oracle/studio/performance-tuning.html).
Useful links
- [Create a Database in Oracle](https://blog.devart.com/how-to-create-database-in-oracle.html)
- [Rename a table in Oracle](https://blog.devart.com/rename-table-in-oracle.html)
- [PL/SQL Formatter](https://www.devart.com/dbforge/oracle/studio/)
How to Debug a Stored Procedure with dbForge Studio for SQL Server
By [dbForge Team](https://blog.devart.com/author/dbforge), September 6, 2022

This article provides a detailed overview of the [SQL Debugger](https://www.devart.com/dbforge/sql/studio/tsql-debugger.html) functionality built into dbForge Studio for SQL Server, a powerful IDE for efficient database development, administration, testing, and deployment. Debugging is an essential part of SQL Server database development: it allows developers to detect and fix bugs in their code. With a dedicated debugger, it is easier to understand what the bug is, where it occurred, and what caused it. In this article, we will cover the following:

- How does the dbForge Studio debugger work?
- Requirements to start debugging
- How to debug a stored procedure
  - Step 1: Add variables to the Watches pane
  - Step 2: Set breakpoints
  - Step 3: Debug dependent objects
  - Step 4: Analyze the result
- Debug a stored procedure directly from the code in a SQL document
- Conclusion

How does the dbForge Studio debugger work?

[SQL Debugger](https://www.devart.com/dbforge/sql/studio/tsql-debugger.html) built into dbForge Studio for SQL Server is a reliable tool that helps you catch issues and software faults before they cause harm by analyzing the runtime behavior of database objects. The debugger allows you to find logic errors in T-SQL scripts, stored procedures, triggers, and functions by stepping through the code line by line.
The tool also supports browsing the call stack, setting breakpoints, evaluating and modifying variables, and analyzing changes. With SQL Debugger, you can be sure that your code works as expected and won't get stuck with unexpected bugs.

Requirements to start debugging

Before you start SQL Debugger, make sure that the following requirements are met:
- The server-side and client-side components of T-SQL Debugger are installed.
- Debugging firewall exceptions are enabled on both the client and the server.
- You are connected to a Database Engine SQL document using Windows Authentication or a SQL Server Authentication login that is a member of the sysadmin fixed server role.
- The SQL document is connected to an instance of the Database Engine running SQL Server 2005 Service Pack 2 (SP2) or higher.
- The client and the server are on the same subnet, because the firewall does not allow RPC to remote computers outside it.
- The database is not in single-user mode.

Note: The debugger does not work with Azure SQL databases.

Now, we are going to demonstrate how easy it is to debug a stored procedure with the SQL Debugger available in dbForge Studio for SQL Server, as well as how to debug a stored procedure directly from a SQL document.

Prerequisites

For demo purposes, we will use the spShowOddNumbers stored procedure, which returns all odd numbers from a given range.
```sql
IF OBJECT_ID(N'dbo.OddNumbers', N'U') IS NOT NULL
    DROP TABLE [dbo].[OddNumbers];
CREATE TABLE dbo.OddNumbers (Odd_Numbers INT PRIMARY KEY);
GO
CREATE TRIGGER TrInsNumber ON [dbo].[OddNumbers] AFTER INSERT AS PRINT 'Row inserted';
GO
CREATE OR ALTER PROCEDURE dbo.spInsOddNumbers @VarIns INT
AS
BEGIN
    INSERT INTO OddNumbers VALUES (@VarIns)
END
GO
CREATE OR ALTER PROCEDURE dbo.spShowOddNumbers
    @LowerRange INT,
    @UpperRange INT
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @TempVar INT
    SET @TempVar = @LowerRange
    WHILE (@TempVar < @UpperRange)
    BEGIN
        IF (@TempVar % 2 != 0)
        BEGIN
            PRINT @TempVar
            EXECUTE spInsOddNumbers @VarIns = @TempVar
        END
        SET @TempVar = @TempVar + 1
    END
    PRINT 'PRINTED ODD NUMBERS BETWEEN ' + RTRIM(@LowerRange) + ' and ' + RTRIM(@UpperRange)
    SELECT * FROM OddNumbers;
    TRUNCATE TABLE OddNumbers;
    SET NOCOUNT OFF
END
GO
```

How to debug a stored procedure

Open dbForge Studio for SQL Server. To start debugging, in Database Explorer, right-click the stored procedure you want to debug and select Step Into. The Edit Parameters dialog opens, where you can quickly add values for the input variables; it lets you execute or debug a procedure without declaring the variable values in the script. Assign the values 1 and 4 to the @LowerRange and @UpperRange variables respectively, and then click OK.

Note: @LowerRange and @UpperRange define the range of values.

After that, press F11 to debug the procedure step by step. During debugging, you can monitor the variable values; to do that, add these variables to the Watches pane.

Step 1: Add variables to the Watches pane

In the SQL Debugger document, right-click the variables whose values you want to track, and then select Add Watch. As you can see, we set watches on the following variables: @LowerRange, @UpperRange, and @TempVar.
The variables, along with their values and data types, are now displayed in the Watches pane.

Step 2: Set breakpoints

During debugging, you can set breakpoints to pause debugger execution on any line of executable code. All breakpoints and their properties are displayed in the Breakpoints pane. Let's see how to manage them. First, set a breakpoint on the following line of code:

```sql
SET @TempVar = @TempVar + 1
```

You can insert a breakpoint in one of the following ways:
- Right-click the line you want to break at and click Insert Breakpoint.
- On the Debug menu, click Toggle Breakpoint.
- In the gray bar to the left, click the line you want to break at.

As you can see, the breakpoint is now displayed in the gray bar to the left of the statement and in the Breakpoints pane, where you can also view its status (enabled or disabled).

Note: If you no longer need a breakpoint, you can disable or remove it using the toolbar in the Breakpoints pane. Alternatively, right-click the breakpoint you want to delete and select Delete Breakpoint.

Step 3: Debug dependent objects

After inserting breakpoints, we can continue debugging by clicking Step Into on the Debug menu or pressing F11. Meanwhile, you can watch the variable values change in the Watches pane; you can also hover over a variable to view its current value in a hint. Note that if the procedure you are debugging triggers the execution of dependent database objects, debugging of these objects starts automatically. In our example, the spShowOddNumbers procedure calls the spInsOddNumbers procedure, which inserts values into the OddNumbers table; that, in turn, fires the TrInsNumber trigger, which is debugged too. Thus, during the debugging of the spShowOddNumbers procedure, all dependent objects come under debugging, as shown in the screenshot.
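Outside the debugger, the same call chain can be exercised by executing the outer procedure directly; the values 1 and 4 below simply match the walkthrough above:

```sql
-- Runs spShowOddNumbers, which calls spInsOddNumbers for each odd
-- number, which in turn fires the TrInsNumber trigger on every insert
EXECUTE dbo.spShowOddNumbers @LowerRange = 1, @UpperRange = 4;
```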
In the SQL Debugger, you can also view information about the database objects being debugged in the Call Stack pane. It displays the procedure calls that are currently on the stack; the yellow arrow points to the line of code where the debugger is currently located.

Step 4: Analyze the result

When debugging is complete, the Output pane opens automatically, displaying the execution result of the debugging run. This allows you to examine the result and understand what can be done to avoid errors in your code. With SQL Debugger, you can also debug a procedure directly from a SQL document. Now, let's see how to do that.

Debug a stored procedure from a SQL document

You can also debug a stored procedure directly from the code in a SQL document with the help of the built-in script generation feature. To do that, in Database Explorer, right-click the procedure you want to debug and select Generate Script As > EXECUTE > To New SQL Window. As you can see, a script to execute the procedure is generated in a new SQL document. In the document, assign the values 1 and 4 to the @LowerRange and @UpperRange variables respectively, and then press F11. The debugger will step through the statements one by one and step into the called procedure to debug it. After debugging is over, the Output pane opens automatically, where you can view and analyze the result of the debugging process.

Conclusion

As you can see, debugging stored procedures, functions, and triggers with the SQL Debugger built into dbForge Studio for SQL Server is simple, quick, and efficient. The tool makes it easy to test your code for errors, unexpected behavior, or adverse effects. Want to see how SQL Debugger works? [Download](https://www.devart.com/dbforge/sql/studio/download.html) a free 30-day trial version of dbForge Studio for SQL Server to evaluate the capabilities of the tool.
While working with the tool, you will certainly come to enjoy it, and after the trial expires, you won't want to part with it.

Find and Delete Incomplete Open Transactions in SQL Server – Part 1
By [dbForge Team](https://blog.devart.com/author/dbforge), March 17, 2020

Quite frequently, open transactions in MS SQL Server are left incomplete, and their initiators forget about them. This is a common situation in routine database development. The most striking example: someone uses SQL Server Management Studio to run a script that starts an explicit transaction with the BEGIN TRANSACTION statement, but the batch is canceled in the middle of the transaction without a COMMIT or ROLLBACK statement, and the transaction is left open. The locks acquired during that transaction then continue to be held, while the person who launched it either forgets about it or puts it aside for some time. As a result, a large number of locks are held and users are blocked. This article discusses deleting such lost transactions using the [SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/) tool. The term incomplete transaction refers to an active (running) transaction that has had no active (running) queries for a sufficiently long period of time T.

A general algorithm for finding and deleting incomplete SQL Server transactions

Follow the steps below to find open transactions that are not complete and delete them:
1. Create two tables: one to keep and analyze information about current open transactions, and a second one to archive the transactions selected from the first table for further analysis.
2. Collect information about transactions and their sessions that have no queries (transactions launched and left unfinished within a certain period of time T).
3. Update the table containing the list of currently active transactions from step 1 (if an incomplete transaction acquires an active request, it is no longer considered lost and has to be deleted from the table).
4. Determine the sessions to be killed (a session that has at least one incomplete transaction listed in the table from step 1 and no queries running).
5. Archive the data you are going to delete (information about the transactions, sessions, and connections that will be killed).
6. Kill the sessions.
7. Delete the processed entries, as well as entries that cannot be deleted and have been sitting in the table from step 1 for a long time.

Below is a worked example of how to implement step 1 of this algorithm: we'll demonstrate how to create a table to list and archive incomplete transactions.

Create a table to list and check lost transactions

Applying code formatting

With the help of [dbForge SQL Complete](https://www.devart.com/dbforge/sql/sqlcomplete/), we can quickly and easily create a table to store information about current lost transactions:

1. The tool speeds up routine coding with multiple embedded prompts for T-SQL code; it takes just a few clicks to make a table. Fig. 1. An autocomplete drop-down list
2. The letters in a string are converted into uppercase. Fig. 2.
Lowercase commands converted to uppercase

This way, we continue to polish the table creation script:

```sql
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [srv].[SessionTran](
    [SessionID] INT NOT NULL,
    [TransactionID] BIGINT NOT NULL,
    [CountTranNotRequest] TINYINT NOT NULL,
    [CountSessionNotRequest] TINYINT NOT NULL,
    [TransactionBeginTime] DATETIME NOT NULL,
    [InsertUTCDate] DATETIME NOT NULL,
    [UpdateUTCDate] DATETIME NOT NULL,
    CONSTRAINT [PK_SessionTran] PRIMARY KEY CLUSTERED
    (
        [SessionID] ASC,
        [TransactionID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) ON [PRIMARY]
) ON [PRIMARY]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_Count] DEFAULT ((0)) FOR [CountTranNotRequest]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_CountSessionNotRequest] DEFAULT ((0)) FOR [CountSessionNotRequest]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_InsertUTCDate] DEFAULT (getutcdate()) FOR [InsertUTCDate]
GO

ALTER TABLE [srv].[SessionTran] ADD CONSTRAINT [DF_SessionTran_UpdateUTCDate] DEFAULT (getutcdate()) FOR [UpdateUTCDate]
GO
```

In this script:
1. SessionID identifies a session.
2. TransactionID identifies a lost transaction.
3. CountTranNotRequest is the number of times the transaction was recorded as lost.
4. CountSessionNotRequest is the number of times the session was recorded as having no active queries while containing a lost transaction.
5. TransactionBeginTime is the start date and time of the lost transaction.
6. InsertUTCDate is the date and time (UTC) when the record was created.
7. UpdateUTCDate is the date and time (UTC) when the record was last updated.
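This article implements step 1 of the algorithm; as a rough illustration of what step 2 could look like, sessions holding an open user transaction but running no request can be collected with the standard dynamic management views. This query is our sketch, not the article's own code: the DMV join and the "no active request" test are assumptions about the intended logic.

```sql
-- Sketch of step 2: record transactions whose sessions have no
-- running request into srv.SessionTran (other columns take defaults)
INSERT INTO srv.SessionTran (SessionID, TransactionID, TransactionBeginTime)
SELECT st.session_id, st.transaction_id, at.transaction_begin_time
FROM sys.dm_tran_session_transactions AS st
INNER JOIN sys.dm_tran_active_transactions AS at
    ON at.transaction_id = st.transaction_id
LEFT JOIN sys.dm_exec_requests AS r
    ON r.session_id = st.session_id
WHERE st.is_user_transaction = 1
  AND r.session_id IS NULL            -- no active request for the session
  AND NOT EXISTS (SELECT 1
                  FROM srv.SessionTran AS t
                  WHERE t.SessionID = st.session_id
                    AND t.TransactionID = st.transaction_id);
```

Re-running this periodically, together with an UPDATE that increments CountTranNotRequest for rows seen again, would feed the counters the table was designed for.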
How to format the document with SQL Complete

Applying formatting to the current document

It doesn't matter which editor you used to create a SQL document; you can still format it with the SQL Complete Format Document option. Fig. 3. The Format Document command on the SQL Complete menu

Before formatting, the script looked as follows:

```sql
create table [srv].[SessionTran](
[SessionID] int not null, [TransactionID] bigint not null, [CountTranNotRequest] tinyint not null, [CountSessionNotRequest] tinyint not null,
[TransactionBeginTime] datetime not null, [InsertUTCDate] datetime not null, [UpdateUTCDate] datetime not null,
 constraint [PK_SessionTran] primary key clustered ([SessionID] asc, [TransactionID] asc)
 with (pad_index = off, statistics_norecompute = off, ignore_dup_key = off, allow_row_locks = on, allow_page_locks = on, fillfactor = 95) ON [PRIMARY]
) on [PRIMARY]
GO
```

Below is the script we got after applying formatting:

```sql
CREATE TABLE [srv].[SessionTran] (
    [SessionID] INT NOT NULL
   ,[TransactionID] BIGINT NOT NULL
   ,[CountTranNotRequest] TINYINT NOT NULL
   ,[CountSessionNotRequest] TINYINT NOT NULL
   ,[TransactionBeginTime] DATETIME NOT NULL
   ,[InsertUTCDate] DATETIME NOT NULL
   ,[UpdateUTCDate] DATETIME NOT NULL
   ,CONSTRAINT [PK_SessionTran] PRIMARY KEY CLUSTERED ([SessionID] ASC, [TransactionID] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) ON [PRIMARY]
) ON [PRIMARY]
GO
```

The differences can be seen at a glance:
- Indentation was applied.
- The T-SQL keywords were capitalized.
- Line breaks were added before every column name.

This significantly enhances code readability and comprehensibility, leading to more accurate perception and decision-making.

Applying formatting to external documents

The SQL Complete tool also allows applying formatting to only a selected fragment of your script.
Please note that the Format Selection option is enabled only after you select the fragment to be formatted with the mouse or the cursor. Fig. 4. The Format Selection command on the SQL Complete menu. The figure below shows the script fragment before formatting. Fig. 5. The code fragment before applying formatting. The following figure shows how the code fragment looks after formatting. Fig. 6. The formatted code fragment. The [SQL Formatter](https://www.devart.com/dbforge/sql/sqlcomplete/sql-code-formatter.html) functionality should also be mentioned: with its help, you can seamlessly format scripts written previously. Fig. 7. The SQL Formatter command. After you select the SQL Formatter command from the SQL Complete menu, a window appears offering to choose the formatting mode: you can format either selected files or all files within a certain folder. In our worked example, we select Files. Click Next to proceed. Fig. 8. SQL Formatter Wizard. Now you need to select the files to format. Fig. 9. Adding files to format. Having compiled a list of files to be formatted, click Format. Fig. 10. Starting the formatting process. Thus, the formatting is launched. Fig. 11. Formatting the files. On completion of the formatting process, a window appears displaying the number of files that have been modified, and these files open in SQL Server Management Studio. Fig. 12. Completion window in SQL Formatter. In the same manner, we can refactor the code of all the files in the Scripts directory. To format all files in a specified folder, select the Directories option in the SQL Formatter Wizard. Fig. 13. The Directories option in the SQL Formatter Wizard. Having selected the Directories mode, click Next to proceed. In the next window, click Add to select the required folder.
It is worth mentioning that in the SQL Formatter Wizard you can specify the file extensions to be formatted, choose to keep modified files open after formatting, and include sub-folders. Fig. 14. The Folder list window of the SQL Formatter Wizard. In this case, when the formatting process finishes, the modified files also open in SSMS. Fig. 15. Completion window in SQL Formatter. Please note that, by default, those files are not saved but open in a modified form instead. If you want to change that, clear the Keep modified files open after formatting checkbox. After unselecting it, the modified files won't open in Management Studio, and the changes will be applied and saved straight away.

Advanced formatting options

If you need deeper customization options, navigate to the Options window of SQL Complete. That can easily be done from the main menu. Fig. 16. The Options command on the SQL Complete menu. Next, in the sidebar, navigate to Formatting. Fig. 17. Formatting Options.

The Formatting tab has two sub-tabs. On the General sub-tab, you can:
- make basic formatting adjustments;
- customize notification settings;
- select an editable dictionary file to tune up capitalization and specify available prefixes for exceptions.

On the Profiles sub-tab, you will find a list of formatting profiles. Here you can edit an existing profile, create a new one, activate the required profile, or open the folder containing formatting profiles. Fig. 18. The Profiles sub-tab. If you want to edit a profile or check its rules, highlight the required profile and click Edit Profile; alternatively, double-click the desired profile. Fig. 19. Editing a profile. The Code Formatting functionality of SQL Complete transforms illegible SQL code into a layout readable for most users, significantly facilitating code development and maintenance.
To find out more about code formatting, follow this [link](https://www.devart.com/dbforge/sql/sqlcomplete/sql-code-formatter.html). How to enable or disable dbForge SQL Complete You may encounter situations where it is required to disable the tool, for example when SSMS gets way too slow. This can be done in a few clicks: just click Disable Code Completion on the main menu of the tool. Fig. 20. Disabling SQL Complete Correspondingly, click Enable Code Completion if you want to enable the tool again. Fig. 21. Enabling Code Completion Getting back to the point: we have created the table srv.SessionTran to record killed sessions for lost transactions. How to create a table to archive lost transactions according to delete actions Now, in a similar way, we will create a table to archive open transactions selected from the first table according to the delete actions.

```sql
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [srv].[KillSession](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[session_id] [smallint] NOT NULL,
	[transaction_id] [bigint] NOT NULL,
	[login_time] [datetime] NOT NULL,
	[host_name] [nvarchar](128) NULL,
	[program_name] [nvarchar](128) NULL,
	[host_process_id] [int] NULL,
	[client_version] [int] NULL,
	[client_interface_name] [nvarchar](32) NULL,
	[security_id] [varbinary](85) NOT NULL,
	[login_name] [nvarchar](128) NOT NULL,
	[nt_domain] [nvarchar](128) NULL,
	[nt_user_name] [nvarchar](128) NULL,
	[status] [nvarchar](30) NOT NULL,
	[context_info] [varbinary](128) NULL,
	[cpu_time] [int] NOT NULL,
	[memory_usage] [int] NOT NULL,
	[total_scheduled_time] [int] NOT NULL,
	[total_elapsed_time] [int] NOT NULL,
	[endpoint_id] [int] NOT NULL,
	[last_request_start_time] [datetime] NOT NULL,
	[last_request_end_time] [datetime] NULL,
	[reads] [bigint] NOT NULL,
	[writes] [bigint] NOT NULL,
	[logical_reads] [bigint] NOT NULL,
	[is_user_process] [bit] NOT NULL,
	[text_size] [int] NOT NULL,
	[language] [nvarchar](128) NULL,
	[date_format] [nvarchar](3) NULL,
	[date_first] [smallint] NOT NULL,
	[quoted_identifier] [bit] NOT NULL,
	[arithabort] [bit] NOT NULL,
	[ansi_null_dflt_on] [bit] NOT NULL,
	[ansi_defaults] [bit] NOT NULL,
	[ansi_warnings] [bit] NOT NULL,
	[ansi_padding] [bit] NOT NULL,
	[ansi_nulls] [bit] NOT NULL,
	[concat_null_yields_null] [bit] NOT NULL,
	[transaction_isolation_level] [smallint] NOT NULL,
	[lock_timeout] [int] NOT NULL,
	[deadlock_priority] [int] NOT NULL,
	[row_count] [bigint] NOT NULL,
	[prev_error] [int] NOT NULL,
	[original_security_id] [varbinary](85) NOT NULL,
	[original_login_name] [nvarchar](128) NOT NULL,
	[last_successful_logon] [datetime] NULL,
	[last_unsuccessful_logon] [datetime] NULL,
	[unsuccessful_logons] [bigint] NULL,
	[group_id] [int] NOT NULL,
	[database_id] [smallint] NOT NULL,
	[authenticating_database_id] [int] NULL,
	[open_transaction_count] [int] NOT NULL,
	[most_recent_session_id] [int] NULL,
	[connect_time] [datetime] NULL,
	[net_transport] [nvarchar](40) NULL,
	[protocol_type] [nvarchar](40) NULL,
	[protocol_version] [int] NULL,
	[encrypt_option] [nvarchar](40) NULL,
	[auth_scheme] [nvarchar](40) NULL,
	[node_affinity] [smallint] NULL,
	[num_reads] [int] NULL,
	[num_writes] [int] NULL,
	[last_read] [datetime] NULL,
	[last_write] [datetime] NULL,
	[net_packet_size] [int] NULL,
	[client_net_address] [nvarchar](48) NULL,
	[client_tcp_port] [int] NULL,
	[local_net_address] [nvarchar](48) NULL,
	[local_tcp_port] [int] NULL,
	[connection_id] [uniqueidentifier] NULL,
	[parent_connection_id] [uniqueidentifier] NULL,
	[most_recent_sql_handle] [varbinary](64) NULL,
	[LastTSQL] [nvarchar](max) NULL,
	[transaction_begin_time] [datetime] NOT NULL,
	[CountTranNotRequest] [tinyint] NOT NULL,
	[CountSessionNotRequest] [tinyint] NOT NULL,
	[InsertUTCDate] [datetime] NOT NULL,
	CONSTRAINT [PK_KillSession] PRIMARY KEY CLUSTERED
	(
		[ID] ASC
	) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

ALTER TABLE [srv].[KillSession] ADD CONSTRAINT [DF_KillSession_InsertUTCDate] DEFAULT (getutcdate()) FOR [InsertUTCDate]
GO
```

In this script, sys.dm_exec_sessions and sys.dm_exec_connections refer to the system views, and InsertUTCDate identifies the UTC date and time when the record was created. Conclusion In this article, we presented the general algorithm for deleting lost transactions in SQL Server and explored how to implement step 1 of that algorithm with the help of SQL Complete. Along the way, the obvious strengths of the tool were demonstrated: IntelliSense-style code completion, highly customizable and sharable code formatting, efficient code refactoring, and a bunch of other useful features designed to take care of your code, letting you focus exclusively on how it actually works. The second part of the series can be found [here](https://blog.devart.com/find-and-delete-sql-server-incomplete-open-transactions.html).
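As an aside, the columns of srv.KillSession mirror SQL Server's dynamic management views. The following is a purely hypothetical sketch (not the script from this series; the actual selection logic is covered in Part 2) of the kind of query that finds candidate lost transactions, i.e. user sessions that hold an open transaction but are not running any request:

```sql
-- Hypothetical sketch only: list user sessions that hold an open
-- transaction but currently have no active request.
SELECT s.session_id,
       s.login_name,
       s.host_name,
       c.client_net_address,
       t.transaction_id,
       t.transaction_begin_time
FROM sys.dm_tran_session_transactions AS st
	INNER JOIN sys.dm_tran_active_transactions AS t
		ON t.transaction_id = st.transaction_id
	INNER JOIN sys.dm_exec_sessions AS s
		ON s.session_id = st.session_id
	LEFT JOIN sys.dm_exec_connections AS c
		ON c.session_id = s.session_id
	LEFT JOIN sys.dm_exec_requests AS r
		ON r.session_id = s.session_id
WHERE s.is_user_process = 1
	AND r.session_id IS NULL; -- no request is currently executing
```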
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) How to Develop Database Applications for iOS and Android in C++Builder with UniDAC By [DAC Team](https://blog.devart.com/author/dac) July 3, 2014 C++ Builder supports iOS and Android application development since version XE6. This article discusses the basics of developing database applications for iOS and Android in C++ Builder using UniDAC.
[UniDAC](https://www.devart.com/unidac/) is a suite of universal data access components for connecting to [SQL Server](https://www.devart.com/sdac/) , [SQLite](https://www.devart.com/litedac/) , [Oracle](https://www.devart.com/odac/) , [MySQL](https://www.devart.com/mydac/) , [PostgreSQL](https://www.devart.com/pgdac/) , SAP ASE, xBase (DBF), Amazon Redshift, and [InterBase](https://www.devart.com/ibdac/) from iOS and Android (you can use UniDAC to develop Windows applications that connect to almost any popular database, but this is beyond the scope of this article). Connecting to a Database You can access a database from iOS or Android in almost the same way you  access it from Windows, but you should be aware of some aspects of  connecting and deploying files to a mobile device when working with a local database. This article contains connection instructions and sample code for each database supported by UniDAC. Connecting in Design-time Let’s create an Android application that connects to MySQL. Select File > New > Multi-Device Application – C++ Builder. Select Blank Application, then place the TUniConnection and TMySQLUniProvider components onto the form. Set the ProviderName property of TUniConnection to MySQL and assign values to the  Username, Password, Server, and Port properties. You can test database connectivity by setting the Connected property to True. If the values specified are correct, you will be able to view the list of available databases in the Database dropdown. Compiling the Project Select Project > Add to Project… and add the MySQL provider library, which is located in “C:\\Program Files (x86)\\Devart\\UniDAC for RAD Studio 10.3\\Lib\\Android64”. For C++ Builder 10.3 Rio, the filename is libmyprovider260.a. The table below contains database servers and their corresponding provider libraries for mobile application development in C++ Builder 10.3 Rio using UniDAC. 
| Database System | Standard Edition | Professional Edition |
| --- | --- | --- |
| ASE | libaseprovider260.a | libtdsprovider260.a |
| SQL Server | libmsprovider260.a | libtdsprovider260.a |
| SQLite | libliteprovider260.a, sqlite3.o | |
| MySQL | libmyprovider260.a | |
| Oracle | liboraprovider260.a | |
| PostgreSQL | libpgprovider260.a | |
| InterBase ToGo | libibprovider260.a | |
| Amazon Redshift | librsprovider260.a | libpgprovider260.a |
| xBase | libdbfprovider260.a, libvquery260.a, sqlite3.o | |

If you don't add this file to the project, you will get an error message during compilation: "[ldandroid Error] C:\Users\Public\Documents\Embarcadero\Studio\14.0\PlatformSDKs\android-ndk-r9c\toolchains\arm-linux-androideabi-4.6\prebuilt\windows\bin\arm-linux-androideabi-ld.exe: .\Android\Debug\Unit1.o: in function _ZTX6TForm1:.\Android\Debug\Unit1.o.ll(.data.rel.ro._ZTX6TForm1+0x6): error: undefined reference to 'vtable for Mysqluniprovider::TMySQLUniProvider'" Compile the project. Connecting in Run-Time Put the needed providers onto the form and add their library files (similar to what you did in design-time).
Note that despite having the same name, the provider libraries for Android and iOS are different and located in their respective folders:

- "C:\Program Files (x86)\Devart\UniDAC for RAD Studio 10.3\Lib\Android64"
- "C:\Program Files (x86)\Devart\UniDAC for RAD Studio 10.3\Lib\iOSDevice64"

You need to either place the TUniConnection component onto the form or add the following lines to the header file:

```cpp
#include "DBAccess.hpp"
#include "Uni.hpp"
```

and the following lines to the cpp file:

```cpp
#pragma link "DBAccess"
#pragma link "Uni"
```

If you are planning to use a local database on a mobile device, add the corresponding #include line to the header file to get access to the IOUtils namespace.

ASE

ASE has no client for Android or iOS, therefore a connection to an ASE server can only be established directly via TCP/IP by setting the Direct property to True:

```cpp
SpecificOptions->Values["Direct"] = "True";
```

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "ASE";
	Connection->Server = "server";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Database = "database_name";
	Connection->SpecificOptions->Values["Direct"] = "True";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

SQL Server

SQL Server has no MS SQL Native Client for Android or iOS, therefore a connection to SQL Server can only be established directly via TCP/IP by setting the Provider property to prDirect:

```cpp
SpecificOptions->Values["Provider"] = "prDirect";
```

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "SQL Server";
	Connection->Server = "server";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Database = "database_name";
	Connection->SpecificOptions->Values["Provider"] = "prDirect";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

SQLite

If you don't deploy a database with your application, set the ForceCreateDatabase property to True to create a database file automatically when the user first launches your application:

```cpp
SpecificOptions->Values["ForceCreateDatabase"] = "True";
```

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "SQLite";
	Connection->SpecificOptions->Values["ForceCreateDatabase"] = "True";
	Connection->Database = System::Sysutils::IncludeTrailingPathDelimiter(
		System::Ioutils::TPath::GetDocumentsPath()) + "db.sqlite3";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

Oracle

Oracle has no client for Android or iOS, therefore a connection to an Oracle server can only be established directly via TCP/IP by setting the Direct property to True:

```cpp
SpecificOptions->Values["Direct"] = "True";
```

To establish a connection to Oracle from Android or iOS, assign your host, port, and service name or system identifier to the Server property. To connect using the Service Name, the format is as follows:

```cpp
Server = "Host:Port:sn/ServiceName";
Server = "Host:Port:sn=ServiceName"; // deprecated format
```

To connect using the SID, the format is as follows:

```cpp
Server = "Host:Port:SID";
Server = "Host:Port:sid=SID"; // deprecated format
```

If the port number is followed by a colon and neither the service name prefix (sn=) nor the SID prefix (sid=) is specified, the connection will be established using the SID by default. In the majority of Oracle servers, the service name is the same as the SID. Consult the Oracle documentation for more information.
Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "Oracle";
	Connection->SpecificOptions->Values["Direct"] = "True";
	Connection->Server = "server:1521:orcl";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

MySQL

MySQL has no client for Android or iOS, therefore a connection to a MySQL server can only be established directly via TCP/IP by setting the Direct property to True:

```cpp
SpecificOptions->Values["Direct"] = "True";
```

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "MySQL";
	Connection->SpecificOptions->Values["Direct"] = "True";
	Connection->Server = "server";
	Connection->Port = 3306;
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

PostgreSQL

UniDAC supports only a direct connection to PostgreSQL, therefore there is no property that instructs the client on how to connect to the server; you only need to set the server address, port, and the user credentials.

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "PostgreSQL";
	Connection->Server = "server";
	Connection->Port = 5432;
	Connection->Database = "database_name";
	Connection->SpecificOptions->Values["Schema"] = "schema_name";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

InterBase

You can connect to a local or remote InterBase ToGo database from iOS and Android devices.
To connect to a local database, set the path to the database on the device:

```cpp
Database = System::Sysutils::IncludeTrailingPathDelimiter(
	System::Ioutils::TPath::GetDocumentsPath()) + "db.gdb";
```

If you need to establish a connection to a remote server, specify the server address and database name:

```cpp
UniConnection->Server = "server";
UniConnection->Database = "C:\\db.gdb";
```

Note that wrapping the path with System::Sysutils::IncludeTrailingPathDelimiter(System::Ioutils::TPath::GetDocumentsPath()) is required to connect to a local database.

Local Database Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "InterBase";
	Connection->Database = System::Sysutils::IncludeTrailingPathDelimiter(
		System::Ioutils::TPath::GetDocumentsPath()) + "db.gdb";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

Remote Database Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "InterBase";
	Connection->Server = "server";
	Connection->Database = "C:\\db.gdb";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

Amazon Redshift

UniDAC supports only a direct connection to Redshift, therefore there is no property that instructs the client on how to connect to the server; you only need to set the server address, port, and the user credentials.
Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "Redshift";
	Connection->Server = "server";
	Connection->Username = "user_name";
	Connection->Password = "password";
	Connection->Database = "database_name";
	Connection->Port = 5439;
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

xBase

xBase databases don't use the client-server model, therefore a connection to an xBase database can only be established directly, by setting the Direct property to True:

```cpp
SpecificOptions->Values["Direct"] = "True";
```

To connect to an xBase database, you need to set the path to the database and its format.

Sample

```cpp
TUniConnection *Connection;

Connection = new TUniConnection(Form1);

try {
	Connection->ProviderName = "DBF";
	Connection->Database = "folder_name";
	Connection->SpecificOptions->Values["DBFFormat"] = "dfVisualFoxPro";
	Connection->SpecificOptions->Values["Direct"] = "True";
	Connection->Connect();
	ShowMessage("Connected successfully");
}
__finally {
	Connection->Free();
}
```

Deploying the Application The deployment of an application to a mobile device is described in the article [How to Develop Android Database Applications in RAD Studio](https://blog.devart.com/android-database-application-development-in-rad-studio-xe5.html). The Difference Between the Android and iOS Application Deployment The deployment path is different on Android and iOS. If you want to deploy your application to both platforms, make sure that the deployment paths are specified correctly for both Android and iOS. NOTE: Remember to replace the default value (".") of Remote Path with one of the values below.
| C++ Builder Function | Deployment Path | Destination on Device |
| --- | --- | --- |
| TPath::GetDocumentsPath | .\assets\internal | /data/data/com.embarcadero.MyProjects/files |
| TPath::GetSharedDocumentsPath | .\assets | /mnt/sdcard/Android/data/com.embarcadero.MyProjects/files |

Despite having the same name, the providers for Android and iOS are different and located in their respective folders. Debug The debugging process is described in the article [Remote Debugging of Android Applications in RAD Studio via Wi-Fi](https://blog.devart.com/remote-debug-of-android-application-in-rad-studio-xe5-via-wifi.html). Wrapping up You can build powerful applications with C++ Builder and UniDAC that work with ASE, SQL Server, SQLite, MySQL, Oracle, PostgreSQL, InterBase, Amazon Redshift, and xBase databases from Android and iOS devices.
[Delphi DAC](https://blog.devart.com/category/products/delphi-dac) [How To](https://blog.devart.com/category/how-to) How to Develop iOS Applications in Delphi XE4 Using Devart Data Access Components By [DAC Team](https://blog.devart.com/author/dac) May 30, 2013 Half a year ago we published an article describing the process of [iOS application development in Delphi XE2](https://blog.devart.com/how-to-develop-ios-database-applications-in-delphi-xe2-using-devart-data-access-components.html). We received quite a lot of positive feedback on that article, but the main thing is that the article helped many of our users create their first applications for iPhone, iPad, and iPod.
A lot has happened since then: the new RAD Studio XE4 saw the light of day, and the process of iOS application development changed greatly in it. Fortunately, all the changes were aimed at making development simpler and easier to understand. iOS development limitations are not for us! The matter is that iOS has a quite serious limitation on application deployment to iOS devices: you cannot deploy any libraries (*.dylib) along with your application. If you have developed database applications before, you know that to work with any database, you need either an installed client or a library that allows connecting to that database. This can be a very serious brick wall when attempting to create a business application that must work with databases on iOS. But Devart data access components allow working with the most popular databases without installing any client software and require no libraries. Therefore, the library (*.dylib) deployment restrictions don't apply to applications developed with Devart data access components, since your application simply won't require these libraries. In addition, we as developers did our best to enhance our products' functionality, and now we can proudly say that Oracle and InterBase ToGo have been added to the list of databases that can be accessed from iOS.
Here is the list of databases your iOS applications will be able to work with:

- SQLite
- Oracle
- MySQL
- PostgreSQL
- InterBase ToGo

If you want your application to work with all of these databases, you can use [UniDAC](https://www.devart.com/unidac/); if you want to work with a single DB, any of the following products will suit you:

- [LiteDAC](https://www.devart.com/litedac/)
- [ODAC](https://www.devart.com/odac/)
- [MyDAC](https://www.devart.com/mydac/)
- [PgDAC](https://www.devart.com/pgdac/)
- [IBDAC](https://www.devart.com/ibdac/)

Direct data access or DataSnap When working with databases, we should choose how we want to work with data: directly or using DataSnap. A description of working with a database using DataSnap can be found in the [Using DAC products in multi-tier DB application development](https://blog.devart.com/using-dac-products-in-multi-tier-db-application-development.html) article, so here we will look at direct access. Today's mobile platforms do not yet possess the hardware power of modern computers, so the main requirement for a mobile application remains low consumption of system resources. Proceeding from this, let's consider creating an application that works with a database directly and shows maximum performance with the resources available. Connecting to a database from an iOS application At the design and development stage, work with UniDAC looks exactly the same as when developing a normal desktop application. We can place TUniConnection, TUniQuery, and other components on the form at design time. Otherwise, we can create all the necessary components at run time. The creation of the components themselves is pretty trivial, so let's consider examples of how to initialize a UniDAC connection on iOS. SQLite Since SQLite is a local DB, we don't need Host and Port. We should just specify Database, the path to the database file.
But since iOS has certain limitations on working with the file system, the following function may be used to retrieve the path to the database (to a folder where the application can create and modify files):

```delphi
DocumentPath := TPath.GetDocumentsPath;
```

Besides this, if you are not sure that an SQLite DB exists on the iOS device (e.g., on the first start of your application) and you want to create it in that case, the following parameter must be set:

```delphi
UniConnection.SpecificOptions.Values['ForceCreateDatabase'] := 'True';
```

And most importantly, we should specify the Provider we are going to use. In other words, the Provider defines the DB we want to work with:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'SQLite';

    UniConnection.SpecificOptions.Values['ForceCreateDatabase'] := 'True';
    UniConnection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.sqlite3';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

Oracle Connection to Oracle on iOS is possible only in Direct mode, since Direct mode doesn't require Oracle client installation. The connection mode is specified as follows:

```delphi
UniConnection.SpecificOptions.Values['Direct'] := 'True';
```

In addition, the server name must be formed correctly: since we have no client, we have no tnsnames.ora file with the server list either. Therefore, to establish a connection on iOS, we need to know the server Host and Port, as well as its SID or Service Name.
To connect via the SID, the server should be set in the following way:

```delphi
UniConnection.Server := 'Host:Port:sid=SID';
```

or in a simplified way:

```delphi
UniConnection.Server := 'Host:Port:SID';
```

To connect via the Service Name, set it as follows:

```delphi
UniConnection.Server := 'Host:Port:sn=ServiceName';
```

In other words, the 'sid=' prefix of the third parameter indicates that the connection is established via the SID, and the 'sn=' prefix indicates that it is established via the Service Name. If no prefix is specified, it is assumed by default that we want to connect via the SID. The majority of Oracle servers have the same SID and Service Name, so you most likely won't have to go into such nuances. And don't forget to specify the provider:

```delphi
UniConnection.ProviderName := 'Oracle';
```

Example:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'Oracle';

    UniConnection.SpecificOptions.Values['Direct'] := 'True';

    UniConnection.Server := 'server:1521:orcl';
    UniConnection.Username := 'user_name';
    UniConnection.Password := 'password';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

MySQL Work with a MySQL server is also possible only in Direct mode, and setting this parameter doesn't differ from Oracle:

```delphi
UniConnection.SpecificOptions.Values['Direct'] := 'True';
```

To establish a connection, you need to know the Host and Port.
Host can be represented both by the server name:

```delphi
UniConnection.Server := 'server';
```

and by its IP address:

```delphi
UniConnection.Server := '192.168.0.1';
```

Since there can be several databases on the MySQL server, we have to specify the exact one we want to work with:

```delphi
UniConnection.Database := 'database_name';
```

And, surely, we don't forget to specify the Provider:

```delphi
UniConnection.ProviderName := 'MySQL';
```

Example:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'MySQL';

    UniConnection.SpecificOptions.Values['Direct'] := 'True';

    UniConnection.Server := 'server';
    UniConnection.Port := 3306;
    UniConnection.Database := 'database_name';
    UniConnection.Username := 'user_name';
    UniConnection.Password := 'password';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

PostgreSQL UniDAC works with PostgreSQL exclusively in Direct mode, therefore the mode setting can be omitted. For the rest, establishing a connection to PostgreSQL is almost the same as to MySQL. There can also be several databases on one PostgreSQL server, but besides that, each DB can have several schemas.
If we want to work with a schema other than public, we should specify its name:

```delphi
UniConnection.SpecificOptions.Values['Schema'] := 'schema_name';
```

Example:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'PostgreSQL';

    UniConnection.Server := 'server';
    UniConnection.Port := 5432;
    UniConnection.Database := 'database_name';
    UniConnection.SpecificOptions.Values['Schema'] := 'schema_name';
    UniConnection.Username := 'user_name';
    UniConnection.Password := 'password';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

InterBase You can work with both local and remote InterBase databases from iOS. Let's consider both approaches. Firstly, as always, specify the provider:

```delphi
UniConnection.ProviderName := 'InterBase';
```

For a local DB, specify the path to the local file:

```delphi
UniConnection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.gdb';
```

For a remote database, specify the server name and the path to the file on the server:

```delphi
UniConnection.Server := 'server';
UniConnection.Database := 'D:\db.gdb';
```

A sample of connecting to a local DB:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'InterBase';

    UniConnection.Database := IncludeTrailingPathDelimiter(TPath.GetDocumentsPath) + 'db.gdb';
    UniConnection.Username := 'user_name';
    UniConnection.Password := 'password';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

A sample of connecting to a remote DB:

```delphi
var
  UniConnection: TUniConnection;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'InterBase';

    UniConnection.Server := 'server';
    UniConnection.Database := 'D:\db.gdb';
    UniConnection.Username := 'user_name';
    UniConnection.Password := 'password';

    try
      UniConnection.Connect;
    except
      on E: Exception do
        ShowMessage(E.Message);
    end;
  finally
    UniConnection.Free;
  end;
end;
```

Database deployment to iOS For our application to be able to work with local SQLite and InterBase ToGo databases, we should make sure these databases are deployed to the iOS device. There is nothing difficult about this. First, open the Project > Deployment menu. After this, add our SQLite and InterBase databases to the list of files that must be deployed to the iOS device together with your application. Don't forget to change the Remote Path default value "." to the correct one, "StartUpDocuments", for the added files. Application debugging on the iOS Simulator Let's say you have created your first application (or taken a ready demo project, which can be found in the folder where UniDAC was installed). The next step is to start the application on the iOS Simulator. For this, you need: Mac OS Lion (OS X 10.7) or Mountain Lion (OS X 10.8), since iOS application development is simply impossible without it; Xcode 4.3 for iOS 5.1 or Xcode 4.5 for iOS 6, which is simple: in Mac OS, go to [https://developer.apple.com/downloads/](https://developer.apple.com/downloads/) and download the needed Xcode; PAServer, which is nothing complicated either: in the folder where you installed RAD Studio XE4, there is a PAServer folder containing the package RADPAServerXE4.pkg, the PAServer installer for Mac OS; copy it to your Mac OS and run it. These are all the requirements for starting your application on the iOS Simulator. Now we can move directly to starting the application. The first step is to start PAServer on Mac OS. Set the password that you will need to specify in RAD Studio XE4 for connecting to the started PAServer (the password can be empty). Then we see a window offering to enter the admin password: Enter the admin password.
The Mac OS and PAServer preparation is now complete. Now run RAD Studio XE4, open the application, and select the iOS Simulator platform. When the platform is selected, we can try running the application. Most likely, no remote profile has been created yet, so you will be prompted to create a new one. There is nothing difficult about creating a remote profile; the following must be specified:

- the profile name;
- the host name or IP address of the computer running Mac OS on which PAServer is started;
- the password that you specified when starting PAServer (if you specified one).

Finally, having performed these simple steps, we run our application on the iOS Simulator.

Starting the application on a real iOS device

After your application runs on the iOS Simulator, the last step remains: running it on your iPhone, iPad, or iPod. If you have performed all the previous steps and run your application on the iOS Simulator, your Mac OS is already set up and ready for deployment of the application to an iOS device. It remains only to select the iOS Device platform and, of course, to run the application. When attempting to run the application on a real iOS device, you will most likely see a prompt to select an SDK. To run the application on a real iOS device, an SDK version must be selected in addition to the remote profile. To select an SDK, a new remote profile must be created (since a remote profile for the iOS Simulator differs from a remote profile for a real device).
Therefore, we create a new remote profile (its creation is no different from creating a remote profile for the simulator) and select the SDK compatible with the iOS version on your device. Now you can see your application at work not on an abstract simulator, but on a real device.

Conclusion

We have walked through creating the simplest database applications for iOS in Delphi XE4, which can connect to SQLite, Oracle, MySQL, PostgreSQL, and InterBase ToGo using UniDAC. We hope this article will help you make the first step towards creating your own applications for iOS.

Tags [delphi](https://blog.devart.com/tag/delphi) [ios development](https://blog.devart.com/tag/ios-development) [rad studio](https://blog.devart.com/tag/rad-studio)

7 COMMENTS

Gayle, June 28, 2013 at 1:12 pm
I've been surfing online more than 4 hours today, yet I never found any interesting article like yours. It is pretty worth enough for me. Personally, if all webmasters and bloggers made good content as you did, the internet will be a lot more useful than ever before. My website: [Mobile application developers](http://fueled.com/mobile-app-development-london/)

AndreyP, October 24, 2013 at 10:28 am
Thank you for your kind words, Gayle. We are glad the article was useful for you. The [How to Develop Android Database Applications in RAD Studio XE5](http://blog.devart.com/android-database-application-development-in-rad-studio-xe5.html) article may be interesting for you as well.

aRiS, September 24, 2013 at 3:43 am
How to Develop Android Applications in Delphi XE5 Using Devart Data Access Components…??? Thanx

AndreyP, October 24, 2013 at 10:17 am
Hello, aRiS! We have covered this topic in the ["How to Develop Android Database Applications in RAD Studio XE5"](http://blog.devart.com/android-database-application-development-in-rad-studio-xe5.html) article.

Jadiel Bruno, March 25, 2014 at 8:24 pm
Is UniDAC expected to work with AppMethod, Embarcadero's new platform?

DAC Team, May 15, 2014 at 11:53 am
The edition with source code must work with it.
James grills, March 13, 2018 at 2:15 pm
Thanks for the post. It was really helpful.

Comments are closed.

How to Develop iOS Applications in Delphi XE2 Using Devart Data Access Components
By [DAC Team](https://blog.devart.com/author/dac), September 14, 2012

Delphi XE2 allows you to develop iOS applications as well as applications for 32-bit and 64-bit Windows and Mac OS. In this article, we will try to explain how to develop iPhone apps in Delphi XE2. However, everything below also applies to application development for iPad and iPod, since they support the iOS mobile platform as well. iOS application development differs somewhat from common desktop application development and consists of two main stages:

1. Application development in Delphi XE2
2. Application compilation and building in Xcode on Mac OS

This is because Delphi does not have a native compiler for the iOS platform, and the FPC compiler must be used for compilation. Also, note that an iOS application can only work with certain databases (where no client library is required, or where a native client for iOS exists). This restriction is due to the fact that, according to Apple policy, iOS applications must not use external dynamic libraries unless they are built into the iOS SDK or statically linked to the application itself (you can find more detailed information on the [Apple site](https://developer.apple.com/documentation)).
So, using [Devart Data Access Components](https://www.devart.com/dac.html), it is possible to work with the following databases in iOS:

- SQLite (the library is supplied with iOS)
- MySQL in Direct mode (no client required)
- PostgreSQL in Direct mode (no client required)

We have used [UniDAC](https://www.devart.com/unidac/) for developing our sample application, and all three databases are used in it. You can also develop database applications for iOS using the following separate Devart products:

- [PgDAC](https://www.devart.com/pgdac/) for work with PostgreSQL
- [MyDAC](https://www.devart.com/mydac/) for work with MySQL
- [LiteDAC](https://www.devart.com/litedac/) for work with SQLite

The process of application development using these products is similar to the one described below, except that you have to use the product-specific connection and query components instead of the TUniConnection and TUniQuery listed in our code samples.

Application development in Delphi

At the first stage, a new iOS application can be created using the "File -> New -> Other -> FireMonkey HD iOS Application" menu. Then the application is designed similarly to any other application, i.e. you can place visual components on the form, implement application logic, etc. Our demo form can be seen below. However, there are several restrictions in Delphi XE2 if your application needs to work with a database:

- Data-access components cannot be placed on the form at design time and have to be created, configured, and released manually at runtime.
- There are some peculiarities in generating project-relative paths (for example, when setting a database name for the connection) in iOS.
- There is no way to set a link between data-access components and visual controls (for example, like connecting a TStringGrid and a dataset with the LiveBindings mechanism in a trivial FireMonkey application).
The process of displaying and editing data needs to be implemented manually.

Creating data-access components

So, when developing your iOS application, you will first see that all the data-access components on the Tool Palette are unavailable. In order to use UniDAC in our sample application, we have declared two variables in the private section of the form declaration, and then created the TUniConnection and TUniQuery instances in the OnClick event handler of the "Connect" button:

```pascal
TiForm = class(TForm)
private
  UniConnection: TUniConnection;
  UniQuery: TUniQuery;
end;

...

procedure TiForm.btConnectClick(Sender: TObject);
begin
  if not Assigned(UniConnection) then
    UniConnection := TUniConnection.Create(Self);
  UniConnection.ProviderName := 'SQLite';
  UniConnection.Database := GetDatabasePath + edDataBase.Text;
  UniConnection.Connect;

  if not Assigned(UniQuery) then
    UniQuery := TUniQuery.Create(Self);
  UniQuery.Connection := UniConnection;
  UniQuery.SQL.Text := 'SELECT ID, Common_Name, Graphic, SpeciesName, Category, Notes FROM FISH';
  UniQuery.Open;
end;
```

Setting the database path

As you can see, the code sample above uses a GetDatabasePath function when setting the UniConnection.Database property. The point is that, under iOS, the usual ParamStr(0) function cannot be used to obtain the application path (for example, in order to generate a project-relative path to a database). In our sample application, we implemented the GetDatabasePath function as follows:

```pascal
function GetDatabasePath: string;
{$IFDEF FPC}
var
  Paths: NSArray;
  FileName: NSString;
begin
  Paths := NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, True);
  FileName := Paths.objectAtIndex(0);
  Result := String(FileName.UTF8String) + '/';
{$ELSE}
begin
  Result := ExtractFilePath(ParamStr(0));
{$ENDIF}
end;
```

When the application runs in Windows, the function will return the application path.
But when the application runs in iOS, the function will return the path to the special Documents subfolder located in the application root folder on the iOS device. The Documents subfolder is automatically created for each iOS application when it is installed. All application data, including the database, has to be placed in this subfolder. Unfortunately, there is no way to include the database file in the project so that it is placed into the Documents subfolder automatically when the application is installed. Therefore, you will have to copy the database file to the iOS device manually after installing the application.

Retrieving and displaying data

The next task is to display the data retrieved by the query in visual controls. As described above, there is no way in Delphi XE2 to do this the way a TDataSource component works in VCL, or LiveBindings in FMX. The developer's own methods have to be implemented for each specific case to display and edit data. In our sample application, we did not use any data-aware components like TDBGrid or TStringGrid; we simply set visual control properties to the values of the corresponding query fields when the trackbar at the bottom of the form changes its position:

```pascal
procedure TiForm.TrackBar1Change(Sender: TObject);
begin
  if Assigned(UniQuery) and UniQuery.Active then begin
    UniQuery.Locate('ID', VarArrayOf([Round(TrackBar1.Value + 1)]), [loPartialKey]);
    lbCommon_Name.Text := UniQuery.FieldByName('Common_Name').AsString;
    lbSpeciesName.Text := UniQuery.FieldByName('SpeciesName').AsString;
    lbCategory.Text := UniQuery.FieldByName('Category').AsString;
    meDescription.Text := UniQuery.FieldByName('Notes').AsString;
  end;
end;
```

Data can be displayed in a grid in a similar way.
You can loop through the records in a dataset and enter the value of each field into the corresponding cell of the grid, as follows:

```pascal
var
  UniConnection: TUniConnection;
  UniQuery: TUniQuery;
  i: Integer;
begin
  UniConnection := TUniConnection.Create(nil);
  try
    UniConnection.ProviderName := 'SQLite';
    UniConnection.Database := GetDatabasePath + 'fish.db3';
    UniConnection.Connect;

    UniQuery := TUniQuery.Create(nil);
    try
      UniQuery.Connection := UniConnection;
      UniQuery.SQL.Text := 'Select Category, Common_Name, Notes from FISH';
      UniQuery.Open;

      while not UniQuery.Eof do begin
        for i := 0 to UniQuery.Fields.Count - 1 do
          StringGrid1.Cells[i, UniQuery.RecNo - 1] := UniQuery.Fields[i].AsString;
        UniQuery.Next;
      end;

    finally
      UniQuery.Free;
    end;

  finally
    UniConnection.Free;
  end;
end;
```

Data editing also needs to be performed manually, i.e. you should call the corresponding Insert/Update/Delete methods, fill in the fields of the dataset with the right values, and then call the Post method, etc.

Compilation and deployment

The second stage is compilation and deployment of the application to the iOS platform. This stage consists of three steps:

1. Creating an Xcode project for the Delphi project.
2. Compiling the Xcode project on Mac OS.
3. Deploying the application to an iOS device.

Creating an Xcode project

In order to compile a Delphi project on Mac OS, a special Xcode project has to be created. The project consists of several additional files that are created in the "xcode" subfolder in the root folder of the project. To create an Xcode project for an existing Delphi project, the dpr2Xcode.exe command-line utility is used. The utility is supplied with Delphi XE2 and located in the Bin folder. To make using the utility more convenient, you can customize the IDE Tools menu.
Open the "Tools -> Configure Tools" menu, press the "Add…" button, and fill in the properties as follows:

- Title: Export to Xcode
- Program: dpr2Xcode.exe
- Parameters: $PROJECT

Now you can open your iOS project in the IDE and use the "Tools -> Export to Xcode" menu to create an Xcode project. As described before, after doing this, the "xcode" subfolder will be created in the root folder of the project. Then it is necessary to copy all UniDAC source files into the iOS project root folder, because the UniDAC files will also be required by the FPC compiler to build the project.

Compiling the Xcode project on Mac OS

Now the iOS project is ready to be compiled with Xcode on Mac OS. The Mac OS system used for compiling must conform to the following requirements:

- Operating system: OS X 10.6 Snow Leopard or OS X 10.7 Lion
- Free Pascal version 2.6.0 installed on Mac OS
- Xcode and the iOS SDK installed on Mac OS (we have verified Xcode versions 3.2.5 and 4.2, and iOS SDK versions 4.2.x, 4.3.x, and 5.0)
- The FireMonkey-iOS-XE2.dmg package installed on Mac OS (the package is required by FPC to compile Delphi projects; it is supplied with Delphi XE2 and located in the FireMonkey-iOS folder)
- An iOS mobile device (iPhone, iPod, or iPad) connected to the Mac via a USB port (if you don't have one, you can test the application in the iOS device simulator included in Xcode)

So, if you have an appropriate Mac OS system, copy the project folder to the Mac, open the project in Xcode, select an appropriate output device (a real iOS device or a simulator), and build the project. When a simulator is chosen as the output device, then after a successful build the application will open in the simulator directly on the Mac OS system. When a real connected device is chosen as the output device, the application will be automatically transferred to and installed on the device, and its icon will appear on the device desktop.
Also, since all the source files are transferred to Mac OS, it is possible to debug the application directly in Xcode.

Deploying the application to an iOS device

The application deployment process is not the objective of this article; it is widely described on the Internet. As mentioned above, you do not have to deploy any additional libraries with an application written using [UniDAC](https://www.devart.com/unidac/), [PgDAC](https://www.devart.com/pgdac/), [MyDAC](https://www.devart.com/mydac/), or [LiteDAC](https://www.devart.com/litedac/). Also note that in order to develop applications on a real iPhone, iPod, or iPad, you have to sign up for Apple's paid iOS Developer Program and configure the device for development purposes. You can find out more about the iOS Developer Program [here](https://developer.apple.com/programs/). Without this license, the application can only be tested in the iOS simulator included in Xcode.

Tags [delphi](https://blog.devart.com/tag/delphi) [ios development](https://blog.devart.com/tag/ios-development) [rad studio](https://blog.devart.com/tag/rad-studio)

5 COMMENTS

Raul Islas Matadamas, March 27, 2013 at 3:40 am
Congratulations! This article is perfect for trying out the Devart components.

Gernot, May 19, 2013 at 7:01 am
Could you update the article for XE4? I am unable to make it work. Even though the DB seems to be in Documents, the program does not find it. Is there a way to update the DB via iTunes?

AndreyP, October 24, 2013 at 10:51 am
Please have a look at the [How to Develop iOS Applications in Delphi XE4 Using Devart Data Access Components](http://blog.devart.com/how-to-develop-ios-applications-in-delphi-xe4-using-devart-data-access-components.html) article.

devtools.korzh, October 20, 2013 at 3:19 pm
Congrats!

AndreyP, October 24, 2013 at 10:24 am
Thank you! We hope the article was useful for you.
Comments are closed.

How To: Disable All Foreign Keys in Oracle Scheme
By [dbForge Team](https://blog.devart.com/author/dbforge), September 17, 2010

When you perform data maintenance operations, it is sometimes necessary to disable or enable all foreign keys in the user schema. Here is a script that solves this task:

```sql
SET serveroutput ON;
/
DECLARE
  /*The name of the schema that should be synchronized.*/
  Schema_Name VARCHAR2(4000) := 'type target schema name here';
  /*The operation type:*/
  /*  ON  — enable foreign keys;*/
  /*  OFF — disable foreign keys.*/
  ON_OFF VARCHAR2(4000) := 'ON';

  PROCEDURE CONSTRAINTS_ON_OFF
    (Target_Schema_Name IN VARCHAR2, Action IN VARCHAR2 := '')
  IS
    sql_str VARCHAR2(4000);
    FK_name VARCHAR2(4000);
    var_action VARCHAR2(4000);
    CURSOR cCur1 IS
      /*Creating the list of foreign keys that should be disabled/enabled,*/
      /*building the command at the same time.*/
      SELECT
        'ALTER TABLE '||OWNER||'.'||
        TABLE_NAME||' '||var_action||' CONSTRAINT '||CONSTRAINT_NAME AS sql_string,
        CONSTRAINT_NAME
      FROM
        ALL_CONSTRAINTS
      WHERE
        CONSTRAINT_TYPE='R' AND OWNER=Target_Schema_Name;
  BEGIN
    IF upper(Action)='ON' THEN
      var_action := 'ENABLE';
    ELSE
      var_action := 'DISABLE';
    END IF;
    OPEN cCur1;
    LOOP
      FETCH cCur1 INTO sql_str, FK_name;
      EXIT WHEN cCur1%NOTFOUND;
      /*Disabling/Enabling foreign keys.*/
      EXECUTE IMMEDIATE sql_str;
      DBMS_Output.PUT_LINE('Foreign key '||FK_name||' is '||var_action||'d');
    END LOOP;
  EXCEPTION
    WHEN OTHERS THEN
      BEGIN
        DBMS_Output.PUT_LINE(SQLERRM);
      END;
      CLOSE cCur1;
  END;
BEGIN
  CONSTRAINTS_ON_OFF(Schema_Name, ON_OFF);
  /*specify additional calls if necessary*/
END;
/
COMMIT;
/
```

Executing the script with the ON_OFF parameter set to 'OFF' disables the foreign keys, and setting it to 'ON' enables them. Keep in mind, however, that if the data after synchronization contradicts the data-integrity logic on the server side, the procedure of disabling and enabling foreign keys will fail. You can also use this script in the Express edition of [dbForge Data Compare for Oracle](https://www.devart.com/dbforge/oracle/datacompare/) when transferring master-detail data.

Tags [data compare](https://blog.devart.com/tag/data-compare) [Oracle](https://blog.devart.com/tag/oracle) [oracle tools](https://blog.devart.com/tag/oracle-tools)

How to Download and Install dbForge DevOps Automation PowerShell for SQL Server
By [dbForge Team](https://blog.devart.com/author/dbforge), April 20, 2021

[dbForge DevOps Automation](https://www.devart.com/dbforge/sql/database-devops/) helps automate SQL Server database development, testing, and deployment. It was designed to minimize deployment risks and speed up the release cycle, making the overall workflow safe and consistent. This product is supplied free of charge as part of [dbForge SQL Tools](https://www.devart.com/dbforge/sql/sql-tools/) and includes dbForge DevOps Automation PowerShell for SQL Server, which helps fine-tune and automate routine tasks via [PowerShell cmdlets](https://docs.devart.com/devops-automation-for-sql-server/powershell-cmdlets/export-devartdbproject.html). These tasks include building and deploying databases locally or on remote SQL Server instances, generating NuGet packages, running tSQLt tests and generating test data, documenting databases in multiple formats, deploying NuGet packages and synchronizing them with working databases, as well as publishing them to specific feeds for further use.

Requirements

First off, you need to make sure that you have a proper version of PowerShell installed on your computer. The minimum PowerShell version required to run dbForge DevOps Automation is 3.0.
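The minimum-version requirement above comes down to comparing (major, minor) pairs against (3, 0). As an illustration only, here is a small Python sketch of that comparison (the `meets_minimum` helper and the sample version strings are ours, not part of the product; the dotted format mirrors what `$PSVersionTable.PSVersion` reports):

```python
def meets_minimum(version: str, minimum=(3, 0)) -> bool:
    """Check a dotted version string, e.g. '5.1.19041.1', against a (major, minor) minimum."""
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= minimum

# Windows PowerShell 5.1 satisfies the 3.0 minimum; PowerShell 2.0 does not.
print(meets_minimum("5.1.19041.1"))  # True
print(meets_minimum("2.0"))          # False
```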
Note: You can quickly check your current PowerShell version by opening Windows PowerShell ISE and running the command:

```powershell
$PSVersionTable.PSVersion
```

If the Major version is 3 or higher, you are ready to proceed. Other requirements are as follows:

- Microsoft Windows 7/8/8.1/10 or Windows Server 2008/2012/2016/2019
- .NET Framework 4.7.2 or later (you can [download it here](https://dotnet.microsoft.com/download/dotnet-framework/net472))
- Microsoft SQL Server ([check compatibility](https://docs.devart.com/devops-automation-for-sql-server/getting-started/requirements.html))

Download and installation

Since dbForge DevOps Automation is included in the SQL Tools bundle, you can [download it here](https://www.devart.com/dbforge/sql/database-devops/download.html) as a single installation file. If you haven't downloaded the dbForge DevOps Automation PowerShell module during your installation of SQL Tools, you can download it separately from the [PowerShell Gallery](https://www.powershellgallery.com/packages/Devart.DbForge.DevOpsAutomation.SqlServer/1.0.147). To do that, launch Windows PowerShell ISE and run the following command:

```powershell
Install-Module -Name Devart.DbForge.DevOpsAutomation.SqlServer
```

If you have never run PowerShell scripts before, you will encounter an execution-policy error. This can easily be fixed by changing your PowerShell execution policy. Simply launch Windows PowerShell ISE as administrator and run:

```powershell
Set-ExecutionPolicy -ExecutionPolicy Unrestricted
```

You will be prompted to confirm the new execution policy. Click "Yes" to confirm it and re-run the installation command above. The installation completes almost instantly. Now all dbForge DevOps Automation PowerShell cmdlets are available in your Windows PowerShell ISE, and you can start using them to configure PowerShell scripts for your CI/CD.
Note: When it comes to PowerShell cmdlets, Windows PowerShell ISE is preferable to the standard Windows PowerShell console, since it delivers autocompletion, suggesting cmdlets and parameters as you type.

Tags [dbforge](https://blog.devart.com/tag/dbforge) [dbForge DevOps Automation](https://blog.devart.com/tag/dbforge-devops-automation) [SQL Server](https://blog.devart.com/tag/sql-server)

How to Enable, Configure, and Use MySQL Query Logging
By [Nataly Smith](https://blog.devart.com/author/nataly-smith), September 26, 2024

What is the purpose of query logging in MySQL? Why is it important? How do you enable and configure it? In this article, we will address these questions and more. You will find a detailed guide on how to use mysqldumpslow to analyze slow query logs, along with tips on harnessing convenient GUI tools like [dbForge Studio for MySQL](https://www.devart.com/dbforge/mysql/studio/) to your advantage.

Contents

- Enabling MySQL query log
  - Editing the MySQL configuration file
  - Using MySQL commands
- Configuring the slow query log
  - Purpose of the slow query log
  - Enabling the slow query log using the configuration file
  - Setting slow query log parameters
- Using mysqldumpslow to analyze slow query logs
  - Overview of mysqldumpslow
  - Summarizing slow query logs
- How to enable MySQL query log using dbForge Studio for MySQL by Devart
  - Overview of dbForge Studio for MySQL
  - Step-by-step guide to enabling the query log
- Conclusion

As promised, we will begin this article by answering the first question: what is the purpose of query logging in MySQL? Just like any other system, a database is doing its thing, constantly changing and, unfortunately, sometimes malfunctioning. Wouldn't it be nice to know why? This is exactly where query logging comes in handy. By logging queries, especially slow ones, you can see which operations are consuming the most resources, helping you optimize performance.
It also plays a huge part in troubleshooting, allowing you to [trace problematic queries](https://www.devart.com/dbforge/mysql/studio/execution-history.html) that may be causing errors or slowdowns.

Note: Check out our [Show running processes in MySQL](https://www.devart.com/dbforge/mysql/studio/show-running-queries-in-processlist.html) article to maximize your optimization game.

Enabling MySQL query log

The second question on today's agenda: how do you enable the MySQL query log? Essentially, there are two ways to do it:
- Edit the MySQL configuration file.
- Execute the corresponding MySQL commands.

Below, you will find detailed instructions for both.

Editing the MySQL configuration file

To make changes to the MySQL configuration file:

1. Find the my.ini file. By default, it is located at C:/ProgramData/MySQL/MySQL Server 9.0/my.ini.
2. Right-click the file and open it with a text editor of your choice, for example Notepad.
3. Inside the file, find the [mysqld] section and add the following parameters:

general_log=1
general_log_file="general.log"

4. Save and close the file.
5. Restart the MySQL service to apply the changes.

Now the general query log is enabled, and all queries executed on the MySQL server will be logged to the specified file.

Using MySQL commands

1. Start by logging into your MySQL server.
2. Before making any changes, check whether the general query log is already enabled:

SHOW VARIABLES LIKE 'general_log';
SHOW VARIABLES LIKE 'general_log_file';

3. If it is not, you can enable it by executing the following command:

SET GLOBAL general_log = 'ON';

4. Specify the file the logs should be written to. You can set this to any path where the MySQL server has write access.
SET GLOBAL general_log_file = 'C:/ProgramData/MySQL/MySQL Server 9.0/Data/general.log';

Replace 'C:/ProgramData/MySQL/MySQL Server 9.0/Data/general.log' with the actual path where you want to store the log file.

5. To verify that logging is enabled and the file path is correctly set, run the following:

SHOW VARIABLES LIKE 'general_log';
SHOW VARIABLES LIKE 'general_log_file';

6. Once you no longer need query logging, turn it off to avoid unnecessary disk usage:

SET GLOBAL general_log = 'OFF';

Note: You may need root or administrator access to view or modify the log file if it is stored in a restricted directory.

Configuring the slow query log

With the general query log covered, we can now move on to configuring the slow query log. This will be easy, since the steps are quite similar.

Purpose of the slow query log

As the name suggests, the slow query log in MySQL registers queries that take longer than a specified amount of time to execute. By analyzing these logs, you can spot inefficient queries and optimize them, finding bottlenecks and improving overall database performance.

Enabling the slow query log using the configuration file

To enable the slow query log in MySQL, you need to modify the my.ini configuration file again.

1. Open the my.ini file in a text editor.
2. In the [mysqld] section, add the following parameters:

slow_query_log=1
slow_query_log_file="slow.log"

3. Save the changes and restart the MySQL server.

Setting slow query log parameters

Being "slow" is a relative concept, so we need to define it for MySQL. To fine-tune the slow query log, set the additional long_query_time parameter, which defines the threshold for what is considered a slow query:

long_query_time=2

After completing these steps, all queries that take longer than 2 seconds will be logged to the slow.log file.
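To make the threshold concrete, here is a minimal Python sketch of how the slow log can be filtered against long_query_time. It assumes a simplified slow-log layout (a `# Query_time:` header line followed by the statement itself); real slow logs carry extra header lines such as `# Time:` and `# User@Host:`, which the `#` check below skips.

```python
import re

def slow_queries(log_text, threshold=2.0):
    """Return (query_time, statement) pairs for entries slower than threshold.

    Assumes a simplified slow-log layout: a '# Query_time: ...' header
    line followed by the SQL statement itself.
    """
    results = []
    current_time = None
    for line in log_text.splitlines():
        m = re.match(r"#\s*Query_time:\s*([\d.]+)", line)
        if m:
            # Remember the reported execution time for the statement that follows.
            current_time = float(m.group(1))
        elif line and not line.startswith("#") and current_time is not None:
            if current_time > threshold:
                results.append((current_time, line.strip()))
            current_time = None
    return results

log = """# Query_time: 3.41  Lock_time: 0.00
SELECT * FROM orders WHERE status = 'open';
# Query_time: 0.02  Lock_time: 0.00
SELECT 1;
"""
# With long_query_time=2, only the 3.41-second query qualifies as slow.
print(slow_queries(log))
```

The table and column names in the sample log are illustrative only; the point is that MySQL applies exactly this kind of threshold comparison when deciding what to write to slow.log.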
Get insights into MySQL log types, their use cases, and advantages in our [view MySQL logs](https://blog.devart.com/how-to-view-the-mysql-server-logs.html) tutorial.

Using mysqldumpslow to analyze slow query logs

Merely recording slow-running queries is not enough. To take a real step towards optimizing performance, try the mysqldumpslow tool.

Overview of mysqldumpslow

mysqldumpslow is a utility that summarizes and analyzes the slow query logs generated by MySQL. By default, it groups queries that are similar except for the particular values of numbers and strings. The basic syntax looks like this:

mysqldumpslow [options] [log_file ...]

The supported options are:

| Option | Description |
|---|---|
| -a | Do not abstract all numbers to N and strings to 'S'. |
| -n | Abstract numbers with at least the specified digits. |
| --debug | Write debugging information. |
| -g | Only consider statements that match the pattern. |
| --help | Display a help message and exit. |
| -h | Specify the host name of the server in the log file name. |
| -i | Specify the name of the server instance. |
| -l | Do not subtract lock time from total time. |
| -r | Reverse the sort order. |
| -s | Sort the output. |
| -t | Display only the first num queries. |
| --verbose | Enable verbose mode. |

Summarizing slow query logs

Let us get a general summary of the slow log. To do that, open Windows Command Prompt and execute:

mysqldumpslow "C:/ProgramData/MySQL/MySQL Server 9.0/Data/slow.log"

This prints the entire content of the slow.log file, grouped by query fingerprint. However, if you need, say, only the top 3 slowest queries, run the following command:

mysqldumpslow -t 3 "C:/ProgramData/MySQL/MySQL Server 9.0/Data/slow.log"

You can also keep the literal number and string values (instead of the abstracted N and 'S' placeholders) to inspect the actual slow queries:

mysqldumpslow -a "C:/ProgramData/MySQL/MySQL Server 9.0/Data/slow.log"

Keep in mind that you need a Perl environment installed on your Windows machine for the mysqldumpslow command to operate.
Otherwise, you will get an error message instead of the expected output. To set Perl up:

1. Download and install [Strawberry Perl](https://strawberryperl.com/).
2. Open Command Prompt and execute perl -v to verify that the environment was installed properly.
3. Create a mysqldumpslow.bat file in the C:\Windows\ directory.
4. Copy and paste the following line into the newly created file: perl "C:/Program Files/MySQL/MySQL Server 9.0/bin/mysqldumpslow.pl" %*
5. Save the mysqldumpslow.bat file.
6. Restart Command Prompt, and feel free to use mysqldumpslow however you'd like.

How to enable MySQL query log using dbForge Studio for MySQL by Devart

For those who prefer a more visual and intuitive approach to database management, GUI tools provide a user-friendly alternative to executing MySQL code directly. That is precisely what we are going to talk about next.

Overview of dbForge Studio for MySQL

dbForge Studio for MySQL is a convenient IDE designed to make MySQL database development and administration tasks a piece of cake. One of its [standout features](https://www.devart.com/dbforge/mysql/studio/features.html) is the ability to perform complex operations, such as enabling and configuring MySQL query logs.

Step-by-step guide to enabling the query log

Since dbForge Studio was created to improve your experience through a visual approach to database management and administration, there is more than one way to view and analyze the query log. The first and most basic one is checking the status messages and errors logged to the Output window; errors are also shown in the Error List window. The second one is more advanced: you can use the query optimization tool provided by dbForge Studio, the [Query Profiler](https://www.devart.com/dbforge/mysql/studio/query-profiler.html). It helps you profile and improve MySQL query performance, as well as track the differences in profiling results across several executions of a query.
Detect slow-running queries, examine the workload, and analyze bottlenecks to resolve performance issues in MySQL databases. Profiling can also be used to investigate unexpected query behavior. To activate the Query Profiler:

1. Click the Query Profiling Mode button on the SQL toolbar.
2. Execute the query by clicking the Execute button or pressing F5. In our case, we will be using this query:

SELECT
  c.customer_id,
  c.first_name,
  c.last_name,
  f.title,
  COUNT(r.rental_id) AS total_rentals,
  SUM(p.amount) AS total_payments,
  AVG(p.amount) AS average_payment,
  MAX(p.amount) AS max_payment,
  MIN(p.amount) AS min_payment,
  (SELECT COUNT(*) FROM rental r2 WHERE r2.customer_id = c.customer_id) AS total_rentals_by_customer,
  (SELECT SUM(p2.amount) FROM payment p2 WHERE p2.customer_id = c.customer_id) AS total_payment_by_customer
FROM
  rental r
JOIN payment p ON r.rental_id = p.rental_id
JOIN customer c ON r.customer_id = c.customer_id
JOIN inventory i ON r.inventory_id = i.inventory_id
JOIN film f ON i.film_id = f.film_id
GROUP BY
  c.customer_id, c.first_name, c.last_name, f.title
HAVING
  total_rentals > 5
ORDER BY
  total_rentals DESC,
  total_payments DESC
LIMIT 50;

The Profile tab

The profiling results are displayed on a separate tab of your SQL document, so you can easily navigate between your query and other tabs without switching to additional windows and documents. Query Profiler keeps the query text along with its profiling results so you can optimize MySQL queries effectively: select the required profiling result and click SQL Query. With the query changes history, you can return to any step of the query optimization and review, execute, or save the query.

The Plan tab

Use the [MySQL EXPLAIN plan](https://www.devart.com/dbforge/mysql/studio/explain-plan.html) to achieve better query performance with the least resource consumption.
It provides a comprehensive view of how operations are executed, including the sequence in which tables are joined and accessed, and it shows the time taken to process the rows involved in each operation.

The Session Statistics tab

On the Session Statistics tab of the Query Profiler tree, Query Profiler automatically compares STATUS variables for the query before and after execution. This data is displayed as a grid and applies to the current connection. It allows you to monitor MySQL query performance and decide where to search for bottlenecks.

Query profiling results comparison

Upon analyzing the query profiling results, we see several issues that require our attention. Let's talk about one of them for the sake of science. The following two subqueries are correlated and are executed for each row of the outer query:

(SELECT COUNT(*) FROM rental r2 WHERE r2.customer_id = c.customer_id)
(SELECT SUM(p2.amount) FROM payment p2 WHERE p2.customer_id = c.customer_id)

This can severely impact performance, especially with large datasets. What we can do in this case is rewrite these subqueries as JOINs or pre-aggregated views to reduce the computational overhead. Having made changes to the query, click the Get New Results button in the Profiler document view. New profiling results appear in the tree view as a new node with the time and date of query execution. The key to optimizing MySQL queries is to see the differences in profiling results after your changes: select the profiling results of two query executions while holding the CTRL key, and the differences will be highlighted in the grid.

Note: If you are looking for major performance tuning techniques, we have just the thing for you! You will find [helpful optimization tips and tricks](https://www.devart.com/dbforge/mysql/studio/mysql-performance-tips.html) on our website.
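The rewrite suggested above, replacing correlated subqueries with a join on a pre-aggregated result, can be sketched end to end. The following Python snippet uses the stdlib sqlite3 module as a stand-in engine (not the article's sample database; the `customer`/`payment` schema below is an illustrative miniature) to show that both forms return identical totals while the rewritten form aggregates the payment table only once.

```python
import sqlite3

# Illustrative miniature schema, not the article's sample database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE payment  (payment_id INTEGER PRIMARY KEY,
                       customer_id INTEGER, amount REAL);
INSERT INTO customer VALUES (1, 'Ann'), (2, 'Bob');
INSERT INTO payment VALUES (1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5);
""")

# Correlated form: the subquery is evaluated once per outer row.
correlated = conn.execute("""
SELECT c.customer_id,
       (SELECT SUM(p2.amount) FROM payment p2
         WHERE p2.customer_id = c.customer_id) AS total
FROM customer c
ORDER BY c.customer_id
""").fetchall()

# Rewritten form: aggregate payment once, then join the pre-aggregated result.
rewritten = conn.execute("""
SELECT c.customer_id, t.total
FROM customer c
JOIN (SELECT customer_id, SUM(amount) AS total
      FROM payment GROUP BY customer_id) t
  ON t.customer_id = c.customer_id
ORDER BY c.customer_id
""").fetchall()

print(correlated == rewritten)
```

On a real MySQL server the same transformation applies verbatim; Query Profiler's before/after comparison is where you would confirm the reduced row handling.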
Moreover, you can learn how to troubleshoot MySQL performance with the slow query log on our YouTube channel.

Conclusion

In this article, we covered different ways to enable and configure query logging in MySQL. Depending on your preferences, you can edit the configuration file, use MySQL commands, or opt for an IDE like dbForge Studio for MySQL with its invaluable Query Profiler feature, a query optimization tool that helps you profile and improve MySQL query performance and track the differences in profiling results across executions. Give it a try by [downloading a 30-day free trial](https://www.devart.com/dbforge/mysql/studio/download.html)!

[How To](https://blog.devart.com/category/how-to) [Industry Insights](https://blog.devart.com/category/industry-insights) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How to Export and Import SQL Server Database Data to a SQL Script By [dbForge Team](https://blog.devart.com/author/dbforge) August 17, 2020

Data migration is a common challenge in the field of database development, management, and administration. This guide illustrates the process of exporting database data to a .sql file and subsequently importing it into the target database. This method allows for significant versatility: you can migrate the whole database or only certain objects, depending on your purpose. Whether you're rehosting data on a new server or consolidating databases, these steps will ensure a smooth migration. This article was updated on Nov/08/2023 with information on using SSMS to import and export database data, best practices, potential issues that may arise, and solutions to swiftly resolve them.
Contents
- Why exporting and importing data is crucial
- Methods to export and import data in SQL Server
- Using SQL Server Management Studio (SSMS)
- Export SQL Server data to a SQL script
- Common pitfalls
- Best practices
- Conclusion

Why exporting and importing data is crucial

To begin with, let us emphasize how important it is to be familiar with import and export techniques when working with SQL Server databases. First of all, they enable data backup and recovery when you need it most, safeguarding against potential data loss or corruption. They also play a pivotal role in data migration, ensuring a smooth transition when moving to a new server, system, or database platform. Moreover, they facilitate data sharing and collaboration by allowing easy dissemination of information among stakeholders. Lastly, exporting and importing data is integral to database performance optimization, particularly when dealing with large datasets, as it allows for efficient bulk operations.

Methods to export and import data in SQL Server

Moving forward, we will provide step-by-step tutorials on different export and import methods. This task can be completed in many ways, including the following:
- Using the data export tools embedded in [SSMS](https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15).
- With dbForge Data Compare for SQL Server.
- Via [SSIS](https://docs.microsoft.com/en-us/sql/integration-services/ssis-how-to-create-an-etl-package?view=sql-server-ver15) package implementation.

In this article, we focus on the most popular and convenient ones: using SSMS and SQL scripts.

Using SQL Server Management Studio (SSMS)

To start the SQL Server Import and Export Wizard in SSMS, follow these simple steps:
1. In SQL Server Management Studio, connect to an instance of the SQL Server Database Engine.
2. Expand Databases.
3. Right-click a database.
4. Point to Tasks.
5. Click one of the following options:
- Import Data
- Export Data

The SQL Server Import and Export Wizard can copy data to and from the data sources listed below:

- Enterprise databases. SQL Server or SQL Server Data Tools (SSDT) installs the files needed to connect to SQL Server. For other enterprise databases such as [Oracle](https://www.devart.com/dbforge/oracle/all-about-oracle-database/) or IBM DB2, you need the vendor's client software, drivers, or providers installed. Microsoft provides drivers and providers for Oracle, and the Microsoft OLEDB Provider for DB2 v5.0 for Microsoft SQL Server is available in the Microsoft SQL Server 2016 Feature Pack.
- Text files (flat files). No additional files are required.
- Microsoft Excel and Microsoft Access files. Microsoft Office does not install all the files needed to connect to Excel and Access files; you also need the Microsoft Access Database Engine 2016 Redistributable.
- Azure data sources. SQL Server Data Tools does not install the files needed to connect to Azure Blob Storage; you need the Microsoft SQL Server 2016 Integration Services Feature Pack for Azure.
- Open source databases. Download the additional files for the specific database. For PostgreSQL, see [Connect to a PostgreSQL Data Source](https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/connect-to-a-postgresql-data-source-sql-server-import-and-export-wizard?view=sql-server-ver16). For MySQL, see [Connect to a MySQL Data Source](https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/connect-to-a-mysql-data-source-sql-server-import-and-export-wizard?view=sql-server-ver16).
- Any other data source with an available driver or provider. For sources with an ODBC driver, see [Connect to an ODBC Data Source](https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/connect-to-an-odbc-data-source-sql-server-import-and-export-wizard?view=sql-server-ver16). For sources with a .NET Framework Data Provider or an OLE DB Provider available, download the required files.
- Third-party components for SSIS. Components providing source and destination capabilities for various data sources are sometimes marketed as add-on products for SQL Server Integration Services (SSIS) and may require additional files.

Export SQL Server data to a SQL script

In this article, we are going to have a look at the [Data Pump](https://www.devart.com/dbforge/sql/data-pump/) solution, which is also part of [dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/). This tool facilitates data import and export, offering advanced options, templates, and a number of widely used data formats for both operations. You can find a step-by-step guide to [designing a database for a recruitment service right here](https://blog.devart.com/sql-database-design-basics-with-example.html).

To start, select the required database, right-click the necessary table (if you need to export a specific table) or the database itself (if you need to export several tables), and select Export Data.

Select the export format

Next, on the Export format page, you can choose from different formats of data export.
We choose SQL scripts and press Next.

Select data to export

Now, on the Source page, select the tables to export data from and press Next. In this case, we select three directory tables:
1. Company, a list of companies.
2. Position, a list of positions.
3. Skill, a list of skills.

Note that it is possible to change the connection and the database whenever required.

Select the script generation method

Next, on the Options page, select the script generation method for data export and choose whether to include the database name in the script. Then click Next. The window suggests four types of script generation for data export:
1. INSERT. A script for inserting data will be generated.
2. UPDATE. A script for updating data will be generated: the matching key fields will be found, and the update will be performed.
3. DELETE. A script for deleting data will be generated: all data that matches the exported data by key fields on the target database side will be deleted.
4. MERGE. A script for merging data will be generated. It combines the first two types: insert and update.

Select columns and key fields for export

Now, on the Table columns page, select the required columns and key fields for export (by default, all columns are selected, and the key fields match the primary key definitions of the corresponding tables). Then click Next.

Select data to be exported

On the Exported rows page, select which data to export and press Next. Note that you can select all rows as well as an exact range of rows.

Set error handling options

Additionally, you can configure error handling parameters on the Errors handling page. Users often select the Write a report to a log file option when they need to analyze the report results. To keep it simple, leave the default options and click Export to start exporting data.
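To make the INSERT generation option concrete, here is a small Python sketch of what such a script generator does under the hood. It is a simplified illustration, not Data Pump's actual implementation: it only handles numbers and strings, emitting N-prefixed Unicode literals with doubled single quotes, the way the generated script shown below formats its values.

```python
def insert_script(table, columns, rows):
    """Build a list of INSERT statements from rows of (number | string) values.

    Simplified sketch: strings become N'...' literals with embedded single
    quotes doubled; everything else is rendered with str().
    """
    stmts = []
    for row in rows:
        values = ", ".join(
            # chr(39) is a single quote; double it to escape inside the literal.
            f"N'{v.replace(chr(39), chr(39) * 2)}'" if isinstance(v, str) else str(v)
            for v in row
        )
        stmts.append(f"INSERT {table}({', '.join(columns)}) VALUES ({values})")
    return stmts

script = insert_script("JobEmplDB.dbo.Skill", ["SkillID", "SkillName"],
                       [(689, "C#"), (14, "SQL")])
print("\n".join(script))
```

A real generator additionally wraps the statements in IDENTITY_INSERT and session SET options, as the generated T-SQL in the next section shows.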
Finish the export

When the export is complete, you can either click Finish or open the folder with the generated scripts by pressing the Open result folder button.

View the scripts

As a result, three scripts are generated, one for each directory table. The T-SQL script looks as follows:

SET DATEFORMAT ymd
SET ARITHABORT, ANSI_PADDING, ANSI_WARNINGS, CONCAT_NULL_YIELDS_NULL, QUOTED_IDENTIFIER, ANSI_NULLS, NOCOUNT ON
SET NUMERIC_ROUNDABORT, IMPLICIT_TRANSACTIONS, XACT_ABORT OFF
GO

SET IDENTITY_INSERT JobEmplDB.dbo.Skill ON
GO
INSERT JobEmplDB.dbo.Skill(SkillID, SkillName) VALUES (689, N'C#')
...
INSERT JobEmplDB.dbo.Skill(SkillID, SkillName) VALUES (14, N'SQL')
GO
SET IDENTITY_INSERT JobEmplDB.dbo.Skill OFF
GO

You need to apply the generated scripts to the target database. But what if the data was saved in a different format? For that purpose, there is data import, which you can open by right-clicking the database or the desired table and proceeding in much the same way as with data export.

Common pitfalls

When working with SQL Server data import and export, there are some rather common pitfalls you may encounter sooner or later, leading to data inconsistencies, errors, or performance issues. It is better to know about them in advance:
- Data type mismatch: not matching data types between source and target tables can lead to truncation, loss of precision, or conversion errors during the import/export process.
- Missing or incorrect data: failing to validate data before the import/export operation can result in missing or incorrect information in the target table.
- Improper handling of NULL values: not handling NULL values properly can lead to unexpected behavior.
For instance, if a column is not configured to allow NULLs in the target table, but the source data contains NULLs, it can cause errors.
- Lack of data integrity checks: neglecting to enforce referential integrity or other constraints can lead to inconsistencies in the target database.
- Insufficient permissions: users without the necessary permissions will not be able to perform data import/export operations at all.

Avoiding these common pitfalls requires careful planning, thorough testing, and a good understanding of both the source and target databases.

Best practices

Taking into account all the common pitfalls described in the previous section, the first recommendation suggests itself: avoid them at all costs. However, we will not leave you with just that. Here are more tips to follow for the best experience:
- When importing large amounts of data, improper indexes on the target table can result in slow performance. Use appropriate indexes, since they help speed up data retrieval.
- Keep a close eye on duplicates in the source data, since they can lead to integrity issues in the target table.
- Always double-check the import/export options before proceeding to avoid accidentally overwriting existing data in the target table.
- Write efficient SQL queries to retrieve and insert data; avoid unnecessary joins or computations.
- Implement error handling and logging mechanisms; failing to do so can make it challenging to troubleshoot issues during the import/export process.
- Always perform thorough testing in a non-production environment before importing or exporting data in production.
- Disable triggers on the target table(s) before starting the import process and re-enable them once the operation is complete to optimize performance.

Conclusion

That's all for now, folks.
To summarize, we have looked at data import and export to a SQL file using the highly customizable Data Pump solution from dbForge Studio for SQL Server. By the way, to discover how to import and export data in SQL Server, you can watch [this video](https://youtu.be/qF1VxSbzm4g). Additionally, if you want to learn how to import data to a SQL Server database with the help of dbForge Data Pump during the DevOps process, feel free to watch this video. Also, visit the Documentation Center to find out more about [how to import data from a CSV file](https://docs.devart.com/studio-for-sql-server/exporting-and-importing-data/csv-import.html). For reference, CSV is a compact text format used for storing tabular data; it is very common, as most modern spreadsheet applications (like Excel) can export and import CSV files.

[How To](https://blog.devart.com/category/how-to) [SQL Server Tools](https://blog.devart.com/category/products/sql-server-tools) How To Export SQL Server Data From Table To a CSV File By [dbForge Team](https://blog.devart.com/author/dbforge) February 14, 2024

In this article, we are going to export a table from SQL Server to a .csv file using four different tools. Besides, you will learn how to export SQL query results with and without headers to a .csv file.
Contents

Method 1: Using SQL Server Management Studio
Method 2: Exporting SQL results to a CSV file with and without headers
Method 3: Exporting SQL data with PowerShell
Method 4: Using the BCP tool
Method 5: Using the GUI tool – dbForge Studio for SQL Server
Conclusion

Method 1: Using SQL Server Management Studio

One of the most popular tools for exporting tables to a .csv file is, of course, SQL Server Management Studio.

1. In SQL Server Management Studio, connect to the database you want to export a table from.
2. Right-click the database and navigate to Tasks > Export Data:
3. In the SQL Server Import and Export Wizard window, click Next:
4. Configure the settings in the Choose a Data Source window: select SQL Server Native Client 11.0 from the Data source drop-down menu; by default, the Server name and Database fields already contain the appropriate data; then select the required mode in the Authentication block. After you have adjusted the settings, the window will look as follows:
5. Click Next.
6. Configure the settings in the Choose a Destination window: select Flat File Destination from the Destination drop-down menu and enter the file name in the File Name field. To select the file destination path, click Browse, select the path, and create the .csv file.
7. Click Next.
8. Select the required option in the Specify Table Copy or Query window and click Next.
9. Select the table you want to export from the Source table or view drop-down menu and click Next.
10. In the Save and Run Package window, click Next.
11. Review the information in the Complete the Wizard window and click Finish. After the export process has finished, a report will be displayed. If you want to save the report, click Report and select the desired option.
12. Finally, click Close.

Method 2: Exporting SQL results to a CSV file with and without headers

To export SQL query results to a .csv file, you first have to run a query in SQL Server Management Studio.
Depending on your requirements, the results can be exported with or without headers.

To export without headers:
1. In SQL Server Management Studio, after you have run a query, go to the Results tab.
2. Right-click the result set and click Save Results As:
3. Name the file and save it.

To export with headers:
1. Create an empty .csv file on your PC.
2. In SQL Server Management Studio, after you have run a query, go to the Results tab.
3. Right-click the result set and click Select All: all rows must be highlighted.
4. Right-click the result set again and click Copy with Headers:
5. Paste the copied content into the .csv file that you created before.

Interested in the export and import formats you can choose while using dbForge Studio for SQL Server? Learn more about [database export and import features](https://www.devart.com/dbforge/sql/studio/data-export-import.html).

Method 3: Exporting SQL data with PowerShell

To use PowerShell for exporting a SQL table to a .csv file, you need to install an additional module for SQL Server, the [SqlServer module](https://docs.microsoft.com/en-us/sql/powershell/download-sql-server-ps-module?view=sql-server-ver15).

1. In SQL Server Management Studio, connect to the database you want to export a table from.
2. Open PowerShell ISE as Administrator and export the data by running the following command:

Invoke-Sqlcmd -Query "SELECT * FROM <database_name>.<schema_name>.<table_name>;" -ServerInstance "<server_instance>" | Export-Csv -Path "<file_destination_path>" -NoTypeInformation

Where:
<database_name>: the name of the database that contains the table you want to export data from. Value example: AdventureWorks2019.
<schema_name>: the schema name of the table you want to export data from. Value example: Sales.
<table_name>: the name of the table you want to export data from. Value example: Store.
<server_instance>: the name of the SQL Server instance to connect to.
<file_destination_path>: the location where the specified .csv file will be stored.
Value example: D:\store.csv.

3. Check the exported .csv file at the location that you specified in <file_destination_path>.

Method 4: Using the BCP tool

The BCP (Bulk Copy Program) utility is another tool that can be used for exporting SQL table data to a .csv file. It handles exports to .csv, .xml, and .txt files; however, if you need to export a table to, for instance, an .xls file, you will have to look for another tool.

1. First of all, check whether everything works as expected. To do so, open Command Prompt and type bcp ?. The output should be the following:
2. To export table data to a .csv file, run the command below, adjusting the values:

bcp <database_name>.<schema_name>.<table_name> out <file_destination_path> -S <server_instance> -c -t"," -T

Here is the explanation of each value/argument in the bcp command:

<database_name>: the name of the database that contains the table you want to export data from. Value example: AdventureWorks2019.
<schema_name>: the schema name of the table you want to export data from. Value example: Person.
<table_name>: the name of the table you want to export data from. Value example: Address.
out: copies data from a database table to the specified .csv file.
<file_destination_path>: the location where the specified .csv file will be stored. Value example: C:\test\address.csv.
-S: specifies the SQL Server instance to connect to.
<server_instance>: the name of the SQL Server instance to connect to.
-c: performs the operation using a character data type.
-t: sets the field terminator that will separate the columns in the .csv file.
-T: specifies that the bcp utility connects to the SQL Server instance with a trusted connection (Windows Authentication). Other possible switches are -U to connect using SQL Server Authentication and -P to specify the SQL Server user password.

3. Check the exported .csv file at the location that you specified in <file_destination_path>.
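Under the hood, every method above makes the same two serialization decisions: whether to emit a header row and how to delimit fields. As an illustrative sketch only (the helper name and sample rows are invented for this example, not part of any tool described above), the snippet below uses Python's standard csv module to show the difference, including why a bare comma terminator can be risky:

```python
import csv
import io

def rows_to_csv(rows, headers=None, delimiter=","):
    """Serialize result rows to CSV text (hypothetical helper).

    With headers, the first line is a header row, mirroring SSMS's
    "Copy with Headers"; without it, only data rows are written,
    mirroring "Save Results As".
    """
    buf = io.StringIO()
    # QUOTE_MINIMAL quotes only values that contain the delimiter,
    # a quote character, or a newline.
    writer = csv.writer(buf, delimiter=delimiter, quoting=csv.QUOTE_MINIMAL)
    if headers:
        writer.writerow(headers)
    writer.writerows(rows)
    return buf.getvalue()

# Sample rows standing in for a query result; note the embedded comma.
rows = [(1, "Main St, Suite 5"), (2, "Oak Ave")]
print(rows_to_csv(rows, headers=["AddressID", "AddressLine1"]))
```

A purely terminator-based export such as bcp with -t"," performs no quoting at all, so a value that itself contains the terminator will shift the column layout; picking a terminator that cannot occur in the data (or using a quoting-aware format) avoids this.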
Interested in importing and exporting data in SQL Server? Here are the must-read resources to facilitate your workflows:

[How to export SQL Server data to an Excel file](https://blog.devart.com/export-sql-server-data-to-an-excel-file.html)
[How to export and import SQL Server database data to a SQL script](https://blog.devart.com/how-to-export-and-import-sql-server-database-data-to-a-sql-script.html)

Method 5: Using the GUI tool – dbForge Studio for SQL Server

[dbForge Studio for SQL Server](https://www.devart.com/dbforge/sql/studio/download.html) completes our list of tools for exporting SQL data to a .csv file. Its Export wizard guides you through all the stages of the export process and offers many customizable export options, which makes the process highly flexible.

1. In dbForge Studio for SQL Server, right-click the table you want to export and click Export Data: a separate window, Data Export CSV, will open.
2. On the Export format page, select CSV and click Next:
3. On the Source page, ensure that all the data is correct and click Next.
4. On the Output settings page, check the suggested destination path for the .csv file in the File name field. If you want to set another location, click the three dots next to the field: Here you can also select the following options: Append timestamp to the file name: adds a timestamp to the exported .csv file. If you select this option, the file will have, for example, the following name: C:\test\data_