Exporting SQL Server Data using BCP Utility

Problem

Develop a solution that automates the process of exporting specific data from a Microsoft SQL Server database to a CSV file using the BCP utility. The solution should allow the user to specify the database, table, fields, and destination file path as inputs, and generate the CSV file with the desired data automatically.

Introduction:

In today’s data-driven world, the ability to export data from databases is crucial for various purposes, such as analysis, reporting, and sharing information. One popular tool for exporting data from Microsoft SQL Server is the Bulk Copy Program (BCP) utility. In this blog post, we will explore a step-by-step guide on how to use the BCP utility to export data from a SQL Server database and save it to a CSV file.

Step 1: Understanding the BCP Utility

BCP is a command-line tool provided by Microsoft SQL Server. It allows you to import and export large amounts of data between SQL Server databases and external files. With BCP, you can control the format and structure of the exported data, making it versatile for various data export needs.

Step 2: Setting Up the Command

Let’s break down the following bcp command and understand its components:

bcp "select fldName, fldValue from [DBNAME].[dbo].[tblName]" queryout "Q:\Jeevan\exportdata.csv" -c -t, -S DBServer -T

  • bcp: The command itself that invokes the BCP utility.
  • "select fldName, fldValue from [DBNAME].[dbo].[tblName]": The SQL query that selects the desired fields from the table you specified. Replace [DBNAME], [dbo], and [tblName] with the actual database, schema, and table names, respectively.
  • queryout: Specifies that the BCP utility should export the data from the SQL query.
  • "Q:\Jeevan\exportdata.csv": The file path and name of the output CSV file. Modify it to the desired location and filename.
  • -c: Specifies that the data should be exported in a character format.
  • -t,: Specifies the field terminator character to be a comma (,) in this case.
  • -S DBServer: Specifies the SQL Server instance or server name where the database resides. Replace DBServer with the appropriate server name or connection string.
  • -T: Specifies that the BCP utility should use Windows Authentication for the connection.
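Putting the pieces above together, the command can be generated from the database, table, field list, and output path, which covers the automation asked for in the problem statement. This is a minimal sketch: the function name, the DBServer default, and the example values are my own placeholders, and the commented invocation requires bcp.exe on the PATH and a reachable SQL Server.

```powershell
# Build the bcp argument list from the inputs (a sketch; names are illustrative).
function New-BcpExportArgs {
    param(
        [Parameter(Mandatory)][string]$Database,
        [Parameter(Mandatory)][string]$Table,
        [Parameter(Mandatory)][string[]]$Fields,
        [Parameter(Mandatory)][string]$OutFile,
        [string]$Server = 'DBServer',
        [string]$Schema = 'dbo'
    )
    $query = "select $($Fields -join ', ') from [$Database].[$Schema].[$Table]"
    # -c: character format, -t,: comma field terminator, -T: Windows Authentication
    @($query, 'queryout', $OutFile, '-c', '-t,', '-S', $Server, '-T')
}

$bcpArgs = New-BcpExportArgs -Database 'DBNAME' -Table 'tblName' `
    -Fields 'fldName', 'fldValue' -OutFile 'Q:\Jeevan\exportdata.csv'

# Uncomment to run the export (requires bcp.exe and a reachable server):
# & bcp.exe @bcpArgs
```

Generating the arguments as an array avoids quoting problems when the query or file path contains spaces.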

Step 3: Executing the Command

To execute the BCP command, follow these steps:

  1. Open the Command Prompt or your preferred command-line interface.
  2. Navigate to the directory where the BCP utility is located. By default, it can be found in the SQL Server installation directory under the “Tools\Binn” folder (e.g., C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn). If the PATH environment variable is configured, the bcp command can be executed from anywhere.
  3. Copy and paste the modified BCP command into the command-line interface.
  4. Press Enter to execute the command.

Step 4: Verifying the Exported Data

Once the BCP command is executed successfully, navigate to the file path specified in the command ("Q:\Jeevan\exportdata.csv" in the provided command) to find the exported CSV file. Open the file using a spreadsheet application or a text editor to verify the exported data.

Key features of bcp

  1. High-performance data transfer: BCP is designed for high-speed data transfer between SQL Server and external files. It can handle large volumes of data quickly and efficiently.
  2. Command-line interface: BCP is a command-line tool, which allows for automation and scripting of data import/export operations. It can be easily integrated into batch files or used within scripts for seamless data processing.
  3. Flexible data formats: BCP supports various data formats, including delimited text files, fixed-width text files, native SQL Server binary files, and XML format. This flexibility enables importing and exporting data in a format that best suits your needs.
  4. Customizable field and row terminators: BCP allows you to specify custom field and row terminators when importing or exporting data. This is particularly useful when working with non-standard or special characters in your data.
  5. Data transformations: BCP provides options for transforming data during the import or export process. You can define custom transformations using format files, which allow you to map data columns, skip columns, apply data conversions, and handle special data formats.
  6. Batch size control: BCP allows you to control the batch size when importing or exporting data. You can specify the number of rows per batch, which helps optimize performance and manage memory usage.
  7. Error handling and logging: BCP provides options to handle errors during data transfer and log error messages to a specified file. This enables you to track and troubleshoot any issues that may occur during the import or export process.
  8. Security features: BCP supports authentication and encryption options to ensure secure data transfer. You can specify the login credentials and use SSL encryption for data protection.
  9. Integration with SQL Server features: BCP integrates well with other SQL Server features and components. It can be used in conjunction with SQL Server Integration Services (SSIS), SQL Server Agent, and other tools for comprehensive data management tasks.

These are some of the main features offered by the BCP utility. Depending on your specific use case, you can leverage these features to efficiently import or export data in bulk with Microsoft SQL Server.
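As a concrete illustration of the batch-size and error-handling features above, a bulk import can pass the -b (rows per batch), -e (error file), and -m (maximum errors) options. The argument list below is a sketch with placeholder paths, table, and server name; running the commented line requires bcp.exe and a reachable server.

```powershell
# Sketch: import data.csv in batches of 10,000 rows, writing rejected rows to
# err.txt and aborting after 10 errors (-b, -e, -m are documented bcp options).
# Paths, table, and server name below are placeholders.
$importArgs = @(
    '[DBNAME].[dbo].[tblName]', 'in', 'C:\import\data.csv',
    '-c', '-t,', '-S', 'DBServer', '-T',
    '-b', '10000', '-e', 'C:\import\err.txt', '-m', '10'
)
# & bcp.exe @importArgs
```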

Conclusion:

The BCP utility is a powerful tool for exporting data from SQL Server databases to external files. By following the step-by-step guide provided in this blog post, you can easily export data from your SQL Server database and save it to a CSV file. Remember to modify the command with your specific database, table, file path, and connection details to suit your requirements.

Copy Files/Directories using Robocopy with Powershell

I have worked on some repetitive tasks that involve copying files between servers. I always try to automate them, as automation reduces manual errors, and I am too lazy to do boring tasks manually :).

Using PowerShell’s Copy-Item is simple and straightforward, but Robocopy provides more features. It offers many options, such as retry, jobs, filtering, and mirroring, which I use frequently in my work.

We can get all the supported parameters from the Robocopy documentation or by simply running robocopy /? in the command line.

$source = '<SOURCE_PATH>'
$destination = '<DESTINATION_PATH>'
# /E - Copies subdirectories. This option automatically includes empty directories. Refer the doc for supported parameters
$robocopyOptions = @('/E')

Write-Host "Copying from $source to $destination"

$CmdLine = @($source, $destination) + $robocopyOptions
& 'robocopy.exe' $CmdLine

Write-Host 'Copy Completed'

The preceding script invokes the Robocopy Windows utility and copies the files. This is the simplest usage; the same pattern can be extended for more complex scenarios.
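Extending the script, the options array can carry the retry and mirroring switches mentioned earlier. The switch values here are illustrative; verify them against robocopy /? before use, and note that /MIR deletes destination files that no longer exist in the source.

```powershell
$source = '<SOURCE_PATH>'
$destination = '<DESTINATION_PATH>'
# /MIR mirrors the source tree (implies /E and deletes extras in the destination),
# /R:3 retries each failed copy 3 times, /W:5 waits 5 seconds between retries,
# /XF *.tmp excludes temporary files.
$robocopyOptions = @('/MIR', '/R:3', '/W:5', '/XF', '*.tmp')

$CmdLine = @($source, $destination) + $robocopyOptions

# Uncomment to run (Robocopy exit codes 0-7 mean success, 8 or higher means failure):
# & 'robocopy.exe' $CmdLine
# if ($LASTEXITCODE -ge 8) { throw "Robocopy failed with exit code $LASTEXITCODE" }
```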

Can not Build/Publish Visual Studio Database project Targeting Azure Sql Database v12

Recently, I was working on a Visual Studio database project and trying to publish to Azure SQL, but there was no option for selecting Azure SQL Database V12 in the Target platform database dropdown. Microsoft Azure SQL Database was the only option.

In order to fix that issue, just open the database project file (.sqlproj) in an XML editor and manually update the DSP tag with the following.

<DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>
That’s it. Now we should be able to publish to Azure SQL Database.
Happy Coding 🙂

Read and Write/Update the XML file in Powershell

In many cases, we find ourselves in a situation where we need to read and update XML files.

It’s quite straightforward in PowerShell.

Sample XML file:

<App>
<Secret></Secret>
</App>

PowerShell script to read and write the XML file:

$xmlFileName = "Path to XML File"
[xml]$xmlDoc = Get-Content $xmlFileName
$xmlDoc.App.Secret = "Some Value"
$xmlDoc.Save($xmlFileName)
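We can also add new elements through the underlying System.Xml API. The sketch below is self-contained: it writes the sample file to a temp path first, and the <ApiKey> element name is my own illustration, not part of the original sample.

```powershell
# Create a working copy of the sample XML in the temp folder.
$xmlFileName = Join-Path ([System.IO.Path]::GetTempPath()) 'sample.xml'
Set-Content -Path $xmlFileName -Value '<App><Secret></Secret></App>'

[xml]$xmlDoc = Get-Content $xmlFileName
$xmlDoc.App.Secret = 'Some Value'

# Add a new <ApiKey> element under the root <App> node.
$newNode = $xmlDoc.CreateElement('ApiKey')
$newNode.InnerText = 'Some Key'
$null = $xmlDoc.DocumentElement.AppendChild($newNode)

$xmlDoc.Save($xmlFileName)
```

AppendChild returns the inserted node, so it is assigned to $null to keep the pipeline output clean.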

Install Azure Blob Storage module in Sitecore 9.3 on prem

By default, we can store Blobs in the SQL database, and earlier we had the option to store Blobs in the file system. From Sitecore 9.3, we have the Blob Storage concept, which gives us the freedom to configure storage providers as we like. Yes, we can configure Sitecore to store Blobs anywhere we like :).

We can install the Sitecore Azure Blob Storage module to configure Sitecore to store Blobs in Azure Storage. Do you need to store Blobs somewhere else, maybe in Google Cloud Storage or AWS storage? In that case, we can implement our own storage provider by implementing Sitecore.Framework.Data.Blobs.Abstraction.

Let’s take a look at how we can configure the Sitecore Azure Blob Storage module in an on-prem Sitecore 9.3 instance. Please refer to the Sitecore docs for installing Sitecore 9.3.

1. Download the Sitecore Azure Blob Storage module from here

2. Create an Azure Storage Account. Refer Microsoft docs for more information

3. Create a container

4. Copy the Storage connection string. Refer Microsoft docs

5. Use MsDeploy to install Sitecore Azure Blob Storage WDP.

"<FolderPathOfMsDeploy>\msdeploy.exe" -verb:sync -source:package="<FilePathOfWDP>" -dest:auto="<RootUrlOfSitecoreInstance>" -setParam:"IIS Web Application Name"="<IISWebAppName>" -setParam:"AzureStorageConnectionString"="<AzureStorageConnectionString>" -setParam:"AzureStorageContainerName"="<AzureStorageBlobContainerName>" -setParam:"DefaultProvider"="azure" -enableRule:DoNotDeleteRule -verbose

Parameters:

  • FilePathOfWDP: File path to Azure Blob Storage WDP
  • RootUrlOfSitecoreInstance: URL of the Sitecore instance. In my case, "https://sc93xpcm/".
  • IISWebAppName: IIS Web App name, e.g. "sc93xpCM"
  • AzureStorageConnectionString: The Azure Storage connection string we copied in step 4.
  • AzureStorageContainerName: Azure Storage container name

The above msdeploy command installs the module in the Sitecore instance.

We have to do an extra step for an on-premises installation: we need to update ConnectionStrings.config. We can do it manually or by using an XDT transformation.

  1. Manual step

Add the below node, containing the Azure Storage connection string, to ConnectionStrings.config.

<add name="azureblob" connectionString="<Azure Storage Connection String>" />

  2. XDT transform

Download the Microsoft.Web.Xdt DLL from NuGet.

Execute the following PowerShell script:

function XmlDocTransform($xml, $xdt)
{
    if (!$xml -or !(Test-Path -Path $xml -PathType Leaf)) {
        throw "File not found. $xml"
    }
    if (!$xdt -or !(Test-Path -Path $xdt -PathType Leaf)) {
        throw "File not found. $xdt"
    }
    $scriptPath = (Get-Variable MyInvocation -Scope 1).Value.InvocationName | Split-Path -Parent
    Add-Type -LiteralPath "$scriptPath\Microsoft.Web.XmlTransform.dll"
    $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
    $xmldoc.PreserveWhitespace = $true
    $xmldoc.Load($xml)
    $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt)
    if ($transf.Apply($xmldoc) -eq $false)
    {
        throw "Transformation failed."
    }
    $xmldoc.Save($xml)
}
XmlDocTransform -xml "<PhysicalFolderOfSitecoreWebApp>\App_Config\ConnectionStrings.config" -xdt "<PhysicalFolderOfSitecoreWebApp>\App_Data\Transforms\AzureBlobStorageProvider\Xdts\App_Config\ConnectionStrings.config.xdt"

Update PhysicalFolderOfSitecoreWebApp in the script before running. Make sure Microsoft.Web.XmlTransform.dll is in the same location from which the script is executed.

This will add a connection string node to ConnectionStrings.config.

Now Sitecore is configured to store Blobs in Azure Blob Storage, so when we create new media items, the Blobs are stored in Azure Storage :).

Let’s see how we can implement custom providers to store Blobs in any other storage in later posts. 🙂

Azure Web App – Request Timeout Issue- 500 Error

Recently I was facing an issue with request timeouts in a web app in Azure App Services. It was a synchronous file upload which took more than 4 seconds. (Yes, of course, the synchronous way is not the optimum solution.)

I investigated this issue and found that Azure App Service (Web Apps) has a default timeout of 230 seconds. If a request takes more than this time, it results in a 500 error, but the request is still allowed to continue in the background on the server.

So we should keep this in mind and design our applications in a reactive way.

So if you are getting a request timeout in Azure web app this could be the issue.

More readings:

https://feedback.azure.com/forums/169385-web-apps/suggestions/36572656-make-web-app-timeout-of-230-seconds-configurable

https://social.msdn.microsoft.com/Forums/azure/en-US/560dc2a9-43e1-4c68-830c-6e1defe2f72d/azure-web-app-request-timeout-issue?forum=WAVirtualMachinesforWindows

https://docs.microsoft.com/en-us/azure/app-service/faq-availability-performance-application-issues#why-does-my-request-time-out-after-230-seconds

https://www.edureka.co/community/22010/azure-asp-net-webapp-the-request-timed-out

The “Using” Statement In Powershell

When we write code in C#, we have the using statement to dispose of our objects, so we don’t have to :). What about PowerShell? Can we do the same there?

Here is a PowerShell function which behaves like the using statement. 🙂


Function Using-Object(
    [System.IDisposable]
    $InputObject,
    [scriptblock]
    $ScriptBlock = { throw "ScriptBlock is mandatory, please provide a value." })
{
    try
    {
        . $ScriptBlock
    }
    finally
    {
        if ($null -ne $InputObject -and $InputObject -is [System.IDisposable])
        {
            $InputObject.Dispose()
        }
    }
}

So whenever we are dealing with an object that should be disposed we can use this function as below.


# $Connection object will be disposed.
Using-Object ($Connection = New-Object System.Data.SQLClient.SQLConnection($ConnectionString)) {
    # code goes here.
}


Isn’t that cool? 🙂