โŒ
This page contains Windows bias

About This Page

This page is part of the Azure documentation. It contains code examples and configuration instructions for working with Azure services.

Bias Analysis

Bias Types:
โš ๏ธ windows_first
โš ๏ธ powershell_heavy
โš ๏ธ missing_linux_example
โš ๏ธ windows_tools
Summary:
The documentation demonstrates a strong Windows bias by exclusively using Visual Studio (a Windows-centric IDE) and the .NET SDK for all code and project setup. All instructions and screenshots assume a Windows environment, with references to Notepad, Visual Studio, and SQL Server Management Studio (SSMS)โ€”all Windows tools. There are no examples or guidance for Linux users (e.g., using VS Code, .NET CLI, or cross-platform editors), nor are there any alternative command-line instructions (such as Bash or Azure CLI). Package installation is shown only via the NuGet Package Manager Console (PowerShell-based), with no mention of dotnet CLI or other cross-platform methods.
Recommendations:
  • Provide alternative instructions for Linux/macOS users, such as using VS Code or JetBrains Rider instead of Visual Studio.
  • Include .NET CLI commands (e.g., 'dotnet new console', 'dotnet add package ...') for project and package management, which work cross-platform.
  • Mention cross-platform editors (e.g., VS Code, Vim, Emacs) for editing files instead of only Notepad and Visual Studio.
  • Show how to run and debug the application using the dotnet CLI, not just Visual Studio menus.
  • Reference Azure CLI or Bash scripts for resource creation and management, in addition to or instead of PowerShell/Windows tools.
  • Suggest cross-platform SQL clients (e.g., Azure Data Studio, sqlcmd) for verifying results, not just SSMS.
  • Explicitly state that the tutorial can be followed on Linux/macOS with appropriate tools, and link to relevant setup guides.
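To make the .NET CLI recommendation concrete, here is a minimal sketch of the cross-platform workflow. The project name *ADFv2Tutorial* and the package *Microsoft.Azure.Management.DataFactory* are taken from the flagged documentation; the exact commands are an illustration, assuming the .NET SDK is installed on Linux, macOS, or Windows:

```shell
# Create the console project without Visual Studio
# (replaces File > New Project in the Visual Studio IDE).
dotnet new console -n ADFv2Tutorial
cd ADFv2Tutorial

# Install the package without the NuGet Package Manager Console
# (replaces the PowerShell-based Install-Package step).
dotnet add package Microsoft.Azure.Management.DataFactory

# Build and run from any terminal (replaces the Build/Debug menus).
dotnet build
dotnet run
```

These commands behave identically on all three platforms, which is why the recommendations above suggest listing them alongside the Visual Studio instructions.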

Scan History

| Date | Scan ID | Status | Bias Status |
| --- | --- | --- | --- |
| 2025-07-12 23:44 | #41 | in_progress | ❌ Biased |
| 2025-07-12 00:58 | #8 | cancelled | ✅ Clean |
| 2025-07-10 05:06 | #7 | processing | ✅ Clean |

Flagged Code Snippets

2. Use a tool such as [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/) to create the *adfv2tutorial* container, and to upload the *inputEmp.txt* file to the container.

#### Create a sink SQL table

Next, create a sink SQL table:

1. Use the following SQL script to create the *dbo.emp* table in your Azure SQL Database.
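A cross-platform alternative to Azure Storage Explorer for this step is the Azure CLI, one of the tools the recommendations above suggest. A hedged sketch, where `<storageaccount>` is a placeholder for your storage account name:

```shell
# Create the tutorial container with the Azure CLI instead of Storage Explorer.
az storage container create \
  --name adfv2tutorial \
  --account-name <storageaccount>

# Upload the input file to the container.
az storage blob upload \
  --container-name adfv2tutorial \
  --file inputEmp.txt \
  --name inputEmp.txt \
  --account-name <storageaccount>
```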
2. Allow Azure services to access SQL Database. Ensure that you allow access to Azure services in your server so that the Data Factory service can write data to SQL Database. To verify and turn on this setting, follow these steps:
   1. Go to the [Azure portal](https://portal.azure.com) to manage your SQL server. Search for and select **SQL servers**.
   2. Select your server.
   3. Under the SQL server menu's **Security** heading, select **Firewalls and virtual networks**.
   4. In the **Firewall and virtual networks** page, under **Allow Azure services and resources to access this server**, select **ON**.

## Create a Visual Studio project

Using Visual Studio, create a C# .NET console application.

1. Open Visual Studio.
2. In the **Start** window, select **Create a new project**.
3. In the **Create a new project** window, choose the C# version of **Console App (.NET Framework)** from the list of project types. Then select **Next**.
4. In the **Configure your new project** window, enter a **Project name** of *ADFv2Tutorial*. For **Location**, browse to and/or create the directory to save the project in. Then select **Create**. The new project appears in the Visual Studio IDE.

## Install NuGet packages

Next, install the required library packages using the NuGet package manager.

1. In the menu bar, choose **Tools** > **NuGet Package Manager** > **Package Manager Console**.
2. In the **Package Manager Console** pane, run the following commands to install packages. For information about the Azure Data Factory NuGet package, see [Microsoft.Azure.Management.DataFactory](https://www.nuget.org/packages/Microsoft.Azure.Management.DataFactory/).
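The portal-only firewall steps in this snippet can also be done from the command line with the Azure CLI: the **Allow Azure services and resources to access this server** toggle corresponds to a firewall rule whose start and end addresses are both 0.0.0.0. A sketch, with `<resource-group>` and `<server-name>` as placeholders:

```shell
# Turn on "Allow Azure services and resources to access this server"
# from any shell, instead of clicking through the Azure portal.
az sql server firewall-rule create \
  --resource-group <resource-group> \
  --server <server-name> \
  --name AllowAzureServices \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0
```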
## Run the code

Build the application by choosing **Build** > **Build Solution**. Then start the application by choosing **Debug** > **Start Debugging**, and verify the pipeline execution.

The console prints the progress of creating a data factory, linked service, datasets, pipeline, and pipeline run. It then checks the pipeline run status. Wait until you see the copy activity run details with the data read/written size. Then, using tools such as SQL Server Management Studio (SSMS) or Visual Studio, you can connect to your destination Azure SQL Database and check whether the destination table you specified contains the copied data.

### Sample output
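This run-and-verify step is another place the scan flags a Windows-only path. A hedged cross-platform sketch using the .NET CLI and `sqlcmd` (available on Linux and macOS as well as Windows); the connection values in angle brackets are placeholders, and *dbo.emp* is the sink table the tutorial creates:

```shell
# Build and run without the Visual Studio Build/Debug menus.
dotnet build
dotnet run

# Verify the copied rows without SSMS, using the cross-platform sqlcmd tool.
sqlcmd -S <server-name>.database.windows.net \
  -d <database-name> -U <user> -P <password> \
  -Q "SELECT * FROM dbo.emp"
```

Azure Data Studio is another cross-platform option for the verification step, as noted in the recommendations above.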