Questions tagged [azure-sql-data-warehouse]

1 vote, 1 answer, 71 views

Azure SQL Data Warehouse - Max concurrent queries

I have to decide whether to use Azure SQL Data Warehouse or a SQL data warehouse based on Microsoft SQL Server virtualized on a VM. What I do not understand is the max concurrent queries limit of 32. The same limit for Azure SQL Database is 6,400. To be honest, when I want to use the Azure D...
STORM B.
1 vote, 2 answers, 32 views

JOOQ with SQL DataWarehouse?

Does JOOQ support a dialect for 'SQL DataWarehouse'? Any pointers?
Sid
1 vote, 1 answer, 72 views

Bulk data processing using dapper and data warehouse

I am using Dapper in the front end to process the data and insert it into the data warehouse. I have a scenario where I need to send bulk data from Dapper to the data warehouse and perform a few operations on it. I can do that using a DataTable: I can create a DataTable, fill it with data, and then pass tha...
Jai
1 vote, 1 answer, 83 views

Loading only the latest files' data to Azure SQL Data Warehouse

Step #1: We are supposed to copy the CSV files from an on-premises file server to Azure Blob Storage (say, a 'Staging' container in Blob Storage). Step #2: Using PolyBase, we will load these files' data into Azure SQL Data Warehouse. We are maintaining the same file names (in sync with the staging DB tables), e...
Koushik
1 vote, 2 answers, 78 views

Azure Data Warehouse: load a CSV with an external table

I can't find a complete example of how to load a CSV file directly into SQL Data Warehouse with an external table. The file is on a storage account, https://tstodummy.blob.core.windows.net/, blob container referencedata-in, folder csv-uploads, file something.csv. This is my code: CREAT...
Harry Leboeuf
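
A minimal sketch of the PolyBase objects such a load usually needs, reusing the storage account, container, folder, and file named in the question; the credential, data source, file format, schema, and column list are assumptions, not the asker's actual code:

-- Requires a database master key to already exist.
-- Credential holding the storage account key (name and secret are placeholders).
CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'tstodummy', SECRET = '<storage-account-key>';

-- External data source pointing at the referencedata-in container.
CREATE EXTERNAL DATA SOURCE TstoDummyBlob
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://referencedata-in@tstodummy.blob.core.windows.net',
      CREDENTIAL = BlobStorageCredential);

-- Delimited-text format; FIRST_ROW = 2 skips a header row.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"', FIRST_ROW = 2));

-- External table over /csv-uploads/something.csv (columns are illustrative).
CREATE EXTERNAL TABLE ext.Something
(
    Id   int,
    Name varchar(100)
)
WITH (LOCATION = '/csv-uploads/something.csv',
      DATA_SOURCE = TstoDummyBlob,
      FILE_FORMAT = CsvFormat);

-- Materialize into an internal table with CTAS.
CREATE TABLE dbo.Something
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM ext.Something;
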
1 vote, 1 answer, 67 views

How to add a partition boundary only when not exists in SQL Data Warehouse?

I am using Azure SQL Data Warehouse Gen 1, and I create a partitioned table like this: CREATE TABLE [dbo].[StatsPerBin1]( [Bin1] [varchar](100) NOT NULL, [TimeWindow] [datetime] NOT NULL, [Count] [int] NOT NULL, [Timestamp] [datetime] NOT NULL) WITH ( DISTRIBUTION = HASH ( [Bin1] ), CLUSTERED INDEX([Bi...
Lucas Yang
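
One possible approach, sketched under the assumption that the usual partitioning catalog views are available and that the new boundary is a literal date; the boundary value below is illustrative, not from the question:

-- Check whether the boundary already exists for StatsPerBin1, then split only if it is missing.
IF NOT EXISTS (
    SELECT 1
    FROM sys.tables t
    JOIN sys.indexes i                  ON i.object_id = t.object_id AND i.index_id <= 1
    JOIN sys.partition_schemes ps       ON ps.data_space_id = i.data_space_id
    JOIN sys.partition_range_values prv ON prv.function_id = ps.function_id
    WHERE t.name = 'StatsPerBin1'
      AND CAST(prv.value AS datetime) = '2019-01-01'
)
BEGIN
    ALTER TABLE dbo.StatsPerBin1 SPLIT RANGE ('2019-01-01');
END;
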
1 vote, 1 answer, 35 views

How to delete rows from a heap with a batch size of 10000

Not supported: DELETE TOP(10000) FROM dataArchival.MyTable WHERE DateLocal BETWEEN '2018-03-01' AND '2018-10-01'; delete dataArchival.MyTable from dataArchival.MyTable d, #myTemp d2 where d.DateLocal=d2.DateLocal; delete d from dataArchival.MyTable d ( SELECT *, RN = ROW_NUMBER() OVER(ORDER BY (SELECT...
Alivia
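
Since DELETE TOP is not supported, a common workaround is to keep the rows you want with CTAS and swap the tables; a hedged sketch reusing the table and date filter from the question (the distribution and index options are assumptions):

-- Rebuild the table without the rows that fall in the archival window.
CREATE TABLE dataArchival.MyTable_Trimmed
WITH (DISTRIBUTION = ROUND_ROBIN, HEAP)
AS
SELECT *
FROM dataArchival.MyTable
WHERE DateLocal NOT BETWEEN '2018-03-01' AND '2018-10-01';

-- Swap the new table in and drop the old one.
RENAME OBJECT dataArchival.MyTable TO MyTable_Old;
RENAME OBJECT dataArchival.MyTable_Trimmed TO MyTable;
DROP TABLE dataArchival.MyTable_Old;
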
1 vote, 1 answer, 32 views

How to fix the ROLLUP grouping function error in SQL DW?

I'm getting an error that ROLLUP is not a function name, but the documentation says it should work: Msg 104162, Level 16, State 1, Line 2 'ROLLUP' is not a recognized built-in function name. I've tried GROUP BY GROUPING SETS, but it told me the syntax was wrong; that's when I saw that grouping sets doe...
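
Since neither ROLLUP nor GROUPING SETS is recognized here, the usual workaround is to UNION ALL the detail rows with a separately computed total; a minimal sketch with hypothetical table and column names:

-- Detail rows per group.
SELECT Region, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY Region

UNION ALL

-- Grand total row that ROLLUP would have produced.
SELECT 'All regions' AS Region, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales;
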
0 votes, 0 answers, 4 views

How to achieve the string_agg function in Azure Data Warehouse

I'm using the string_agg function in some queries but am trying to convert them for Azure SQL Data Warehouse. How could I achieve this? Can anyone help me with this issue?
pythonUser
1 vote, 3 answers, 648 views

Convert timezone in Azure SQL Data Warehouse

I have a DATETIME column from the US/Pacific timezone, but it is not encoded as such. How can I convert this to the UTC timezone in Azure SQL Data Warehouse? The AT TIME ZONE T-SQL function seems like the closest fit, but it is not supported by Azure SQL Data Warehouse. https://docs.microsoft.com/en-us/s...
Erik Shilts
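
Without AT TIME ZONE, one hedged workaround is to maintain a small lookup table of US/Pacific UTC-offset ranges (covering the daylight-saving switches) and join to it; the table and column names below are hypothetical:

-- Offsets are negative for US/Pacific (-8 standard, -7 daylight), so UTC = local - offset.
SELECT e.EventTimeLocal,
       DATEADD(HOUR, -o.UtcOffsetHours, e.EventTimeLocal) AS EventTimeUtc
FROM dbo.Events e
JOIN dbo.PacificUtcOffsets o
  ON e.EventTimeLocal >= o.LocalRangeStart
 AND e.EventTimeLocal <  o.LocalRangeEnd;
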
1 vote, 1 answer, 238 views

Debug SQL database scoped credential failure

I created a scoped credential in an Azure SQL Data Warehouse database to create an external table over some files in an Azure Data Lake Store. When I try creating the external table, I get the message: Msg 105061, Level 16, State 1, Line 35 Unable to find any valid credential associated with the specifi...
Molotch
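
For reference, the documented pattern for a PolyBase credential against Azure Data Lake Store uses a service principal; a sketch with placeholder values, since the client id, tenant, secret, and account name are not in the question:

-- Service-principal credential for ADLS Gen1 (placeholders in angle brackets).
CREATE DATABASE SCOPED CREDENTIAL AdlsCredential
WITH IDENTITY = '<client-id>@https://login.microsoftonline.com/<tenant-id>/oauth2/token',
     SECRET = '<client-secret>';

-- External data source that references the credential by name.
CREATE EXTERNAL DATA SOURCE AdlsSource
WITH (TYPE = HADOOP,
      LOCATION = 'adl://<account>.azuredatalakestore.net',
      CREDENTIAL = AdlsCredential);
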
1 vote, 1 answer, 131 views

Error while using the write method of a DataFrame object on Databricks

I am trying to write some data to Azure SQL Data Warehouse using Azure Databricks and Python. The code I am using is as follows: salesdf.write\ .format('com.databricks.spark.sqldw')\ .option('url', sqlDwUrlSmall)\ .option('dbtable', 'StgSales')\ .option( 'forward_spark_azure_storage_credentials','T...
Brian Custer
1 vote, 1 answer, 99 views

Visual Studio Deploy to SQL Data Warehouse Incorrect Syntax Near 'ANSI_NULLS'

I would like to deploy a database project in Visual Studio 2017 to Azure SQL Data Warehouse. Unfortunately, each attempt fails with the following error message: Failed to import target model [database name]. Detailed message: Parse Error at line: 1, column: 5: Incorrect Syntax near 'ANSI_NULLS'. Connec...
lennard
1 vote, 2 answers, 0 views

Strategies to prevent duplicate data in Azure SQL Data Warehouse

At the moment I am setting up an Azure SQL Data Warehouse. I am using Databricks for the ETL process with JSON files from Azure Blob Storage. What is the best practice to make sure not to import duplicate dimensions or facts into the Azure SQL Data Warehouse? This could happen for facts, e.g. in th...
Thomas Hahn
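
One common guard on the warehouse side is to land the batch in a staging table and only insert keys that are not already present; a minimal sketch with hypothetical staging and fact table names:

-- Insert only fact rows whose business key is not already in the target.
INSERT INTO dbo.FactSales (SalesKey, OrderDate, Amount)
SELECT s.SalesKey, s.OrderDate, s.Amount
FROM stg.FactSales AS s
WHERE NOT EXISTS
(
    SELECT 1
    FROM dbo.FactSales AS f
    WHERE f.SalesKey = s.SalesKey
);
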
1 vote, 1 answer, 0 views

How to update an Azure SQL Database/Data Warehouse table from Azure Databricks?

I have a requirement in my project where I am implementing an SCD Type 2 table in Azure SQL DW. I am able to insert new records using the JDBC connector, but I need to mark old records as 'expired' and update other records with the updated values.
shubham nayak
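
A hedged sketch of the 'expire' step, run as plain T-SQL after the Databricks load has landed the changed rows in a staging table; all table and column names here are assumptions:

-- Close out the current dimension rows for customers that arrived in staging.
UPDATE dbo.DimCustomer
SET IsCurrent = 0,
    EndDate   = GETDATE()
WHERE IsCurrent = 1
  AND EXISTS
(
    SELECT 1
    FROM stg.CustomerChanges AS c
    WHERE c.CustomerId = dbo.DimCustomer.CustomerId
);
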
1 vote, 3 answers, 0 views

CTE with DELETE - Alternative for SQL Data Warehouse

I would like to delete all rows in a table where the batchId (a running number) is older than the previous two. I could probably do this in a SQL Database with the query: WITH CTE AS( SELECT *, DENSE_RANK() OVER(ORDER BY BATCHID DESC) AS RN FROM MyTable ) DELETE FROM CTE WHERE RN>2 But the same is not...
user2263025
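
One alternative that avoids deleting through a CTE is to compute the cutoff batch in a subquery and delete with a plain predicate; a sketch reusing MyTable and BATCHID from the question, assuming scalar subqueries are allowed in the DELETE's WHERE clause:

-- Keep the two most recent batch ids; delete everything older than the smaller of the two.
DELETE FROM MyTable
WHERE BATCHID <
(
    SELECT MIN(BATCHID)
    FROM (SELECT DISTINCT TOP 2 BATCHID
          FROM MyTable
          ORDER BY BATCHID DESC) AS latest
);
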
1 vote, 1 answer, 19 views

Azure SQL Data Warehouse equivalent of AWS Redshift's “UNLOAD” statement

Is there any command in Azure SQL Database/Data Warehouse that is similar to the UNLOAD statement in Redshift? I am looking for a SQL statement in Azure that will create a file in Azure Blob Storage.
Noor
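
The closest equivalent in SQL Data Warehouse is CREATE EXTERNAL TABLE AS SELECT (CETAS), which writes the query result out to blob storage through PolyBase; a minimal sketch assuming an external data source and file format already exist under the names shown:

-- Writes the SELECT result as files under /exports/sales/ in the blob container.
CREATE EXTERNAL TABLE ext.SalesExport
WITH
(
    LOCATION    = '/exports/sales/',
    DATA_SOURCE = MyBlobDataSource,
    FILE_FORMAT = MyCsvFileFormat
)
AS
SELECT SalesKey, OrderDate, Amount
FROM dbo.FactSales;
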
1 vote, 2 answers, 37 views

Is there any alternative to CREATE TYPE, as CREATE TYPE is not supported in Azure SQL Data Warehouse?

I am trying to execute this query, but user-defined types (CREATE TYPE) are not supported in Azure SQL Data Warehouse, and I want to use one in a stored procedure. CREATE TYPE DataTypeforCustomerTable AS TABLE( PersonID int, Name varchar(255), LastModifytime datetime ); GO CREATE PROCEDURE usp_upsert_cus...
gaurav modi
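
Since table types are unavailable, a hedged alternative is to have the procedure stage rows in a temporary table with the same shape as the type from the question; the distribution choice is an assumption:

-- Temporary table standing in for the table type DataTypeforCustomerTable.
CREATE TABLE #CustomerStage
(
    PersonID       int,
    Name           varchar(255),
    LastModifytime datetime
)
WITH (DISTRIBUTION = ROUND_ROBIN, HEAP);

-- The upsert procedure can then read from #CustomerStage instead of a table-valued parameter.
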
1 vote, 1 answer, 435 views

Azure Data Warehouse Insert to Huge Table

The agreed pattern to insert new data into an already existing table in Azure Data Warehouse seems to be... create table dbo.MyTable_New with (distribution = round_robin) as select Col1 ,Col2 from dbo.MyTable union all select Col1 ,Col2 from dbo.MyNewStuff; Now, what we are seeing is that on really...
m1nkeh
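
An alternative to rebuilding the whole table is a plain append, which only touches the new rows; a sketch reusing the column and table names from the excerpt, leaving aside whether it addresses the size problem described:

-- Append only the new rows instead of re-creating the full table with CTAS.
INSERT INTO dbo.MyTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.MyNewStuff;
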
1 vote, 1 answer, 87 views

Disable columnstore in Azure SQL Data Warehouse

I am writing a DataFrame from Azure Databricks to a SQL Data Warehouse with res.write \ .format('jdbc') \ .option('url', url) \ .option('dbtable', table) \ .option('user', user) \ .option('password', password) \ .save() With this I am getting an error: Column 'username' has a data type th...
Dee
2 votes, 1 answer, 49 views

Column name also appears as a row when querying the external table for a specific column

I have a file in Azure Data Lake Store. I am using PolyBase to move data from the Data Lake Store to the data warehouse. I followed all the steps mentioned here. Let's say I have created an external table named External_Emp, which has 3 columns: ID, Name, Dept. When I run the following query: selec...
Jai
3 votes, 1 answer, 524 views

Azure data factory copy activity performance tuning

https://docs.microsoft.com/en-us/azure/data-factory/data-factory-load-sql-data-warehouse. According to this link, with 1000 DWU and PolyBase I should get 200 MBps throughput, but I am getting 4.66 MBps. I have added the user to the xlargerc resource class to achieve the best possible throughput from azure sql datawa...
vidyak
2 votes, 1 answer, 828 views

Why is Polybase slow for large compressed files that span 1 billion records?

What would cause Polybase performance to degrade when querying larger datasets in order to insert records into Azure Data Warehouse from Blob storage? For example, a few thousand compressed (.gz) CSV files with headers partitioned by a few hours per day across 6 months worth of data. Querying these...
Fastidious
7 votes, 2 answers, 613 views

Table Variables in Azure Data Warehouse

In a SQL Server database, one can use table variables like this: declare @table as table (a int) In an Azure Data Warehouse, that throws an error. Parse error at line: 1, column: 19: Incorrect syntax near 'table' In an Azure Data Warehouse, you can use temporary tables: create table #table (a int)...
Dan Bracuk
2 votes, 1 answer, 757 views

Aggregate strings in Azure SQL Data Warehouse

Is there a way to aggregate strings in Azure SQL Data Warehouse similar to the string_agg function in SQL Server? I have records with strings that I want to concatenate into a single string. SELECT string_agg(string_col, ',') FROM table1 https://docs.microsoft.com/en-us/sql/t-sql/functions/string-ag...
Erik Shilts
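
There is no STRING_AGG here; one hedged workaround, workable only when the number of values per group is small and bounded, numbers the rows and concatenates a fixed set of slots. string_col and table1 come from the question, while group_col and the three-slot cap are assumptions:

-- Cap of three values per group in this sketch; extend the CASE slots as needed.
WITH numbered AS
(
    SELECT group_col,
           string_col,
           ROW_NUMBER() OVER (PARTITION BY group_col ORDER BY string_col) AS rn
    FROM table1
)
SELECT group_col,
       MAX(CASE WHEN rn = 1 THEN string_col ELSE '' END)
     + MAX(CASE WHEN rn = 2 THEN ',' + string_col ELSE '' END)
     + MAX(CASE WHEN rn = 3 THEN ',' + string_col ELSE '' END) AS string_agg_col
FROM numbered
GROUP BY group_col;
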
2 votes, 1 answer, 56 views

Panic during sql data warehouse bulkcopy

I'm writing data to Azure SQL Data Warehouse using the go-mssql driver. I'm getting a panic thrown at random (at least I haven't been able to reliably replicate this issue) when using the bulk copy functionality to write some data. The error is: panic: runtime error: slice bounds out of range gorou...
Thihara
2 votes, 1 answer, 599 views

Not able to install SSDT for Visual Studio 2017 Professional

I need your help. I have successfully installed VS 2017 on my computer, but when I tried to install SQL Server Data Tools 15.6.0 or 15.5.1, it gave me the error below: Setup failed. The configuration registry key could not be opened (0x800703F3). Thanks all!
user7854107
2 votes, 1 answer, 241 views

Access Azure Data Lake Analytics Tables from SQL Server Polybase

I need to export a multi-terabyte dataset processed via Azure Data Lake Analytics (ADLA) onto a SQL Server database. Based on my research so far, I know that I can write the result of ADLA output to a Data Lake Store or WASB using built-in outputters, and then read the output data from SQL Server...
user796246
1 vote, 2 answers, 544 views

Real-time streaming data into Azure Data Warehouse from SQL Server

I'm trying to build a real-time reporting service on top of Microsoft Azure Data Warehouse. Currently I have a SQL Server with about 5 TB of data. I want to stream the data to the data warehouse and use Azure DW's computation power to generate real-time reporting based on the data. Is there any ready-to...
taffarel
1 vote, 1 answer, 241 views

Cast binary column to string in Azure SQL Data Warehouse

I currently have functions in Postgres and Redshift that take a randomly generated string, hash it, then use part of the hash to generate a random number between 0 and 99. I am trying to replicate this functionality in Azure SQL Data Warehouse such that I get the same value in SQL DW as I do in Postgr...
Erik Shilts
2 votes, 2 answers, 1k views

Azure Data Lake - HDInsight vs Data Warehouse

I'm in a position where we're reading from our Azure Data Lake using external tables in Azure Data Warehouse. This enables us to read from the data lake using well-known SQL. However, another option is using Data Lake Analytics, or some variation of HDInsight. Performance-wise, I'm not seeing muc...
MMartin
4 votes, 0 answers, 174 views

Why is the CTAS statement so fast in Azure SQL DW?

I have noticed that Create Table As Select (CTAS) statements in SQL Data Warehouse are extremely fast compared to SELECT INTO statements. I want to know what magic Microsoft did to make them so fast.
HimalayanNinja