Could I have ordered a negative number of pizzas at a corner store? For that I would probably have to bake the pizzas myself, bring them over to the store, and ask the proprietor to pay me back. "Hey, there's some pizza for you guys! No delivery charges."
Yesterday a user discovered an unusually high number of items on the daily back order report. When I looked into the issue, I found that all those items had negative ordered counts. I still have not figured out why that happened. But there is one thing I am sure about: had I created a constraint, today I would have one less thing to worry about.
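Something as simple as a field-level rule would have caught the bad data at the source. A minimal sketch in VFP (the table and field names are hypothetical):
* Reject negative quantities at write time.
ALTER TABLE order_item ;
	ALTER COLUMN qty_ordered ;
	SET CHECK qty_ordered >= 0 ;
	ERROR "Ordered quantity cannot be negative"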
Thursday 22 November 2012
Tuesday 20 November 2012
Another wheel invented, or creating memo columns in cursors
In the SQL SELECT below, the last column is populated with the output of a function. Occasionally the length of the output exceeds 255 characters, which requires the column to be of the Memo data type.
SELECT;
product_id,;
cat.category_id,;
category_name,;
GetCategoryFullpath(product_id, cat.category_id);
FROM category cat;
INNER JOIN product_to_category ptc ON ptc.category_id = cat.category_id
Using PADR() to extend the GetCategoryFullpath() return value beyond 255 characters results in a "String is too long to fit" error. The same happens when PADR() is applied directly in the SQL SELECT.
In VFP 8 this obstacle seems to be impassable (or "unpassable", in modern English). VFP 9 added the CAST() function, which solves the task.
SELECT;
product_id,;
cat.category_id,;
category_name,;
CAST(GetCategoryFullpath(product_id, cat.category_id) as M) as category_fullpath;
FROM category cat;
INNER JOIN product_to_category ptc ON ptc.category_id = cat.category_id
Thanks to Naomi and Sergey on UniversalThread for pointing me to CAST(), I stopped short of wasting more time on reinventing the wheel.
Tuesday 13 November 2012
Moving SQL Server database to SQL Azure
The task was simple: a database -- the structure and also the data -- had to be copied from a physical SQL Server up to the cloud, in this case SQL Azure.
The first tool I tested was the SQL Azure Migration Wizard. It is a very decent and handy tool, I must say. With just a few minor glitches, I had a 240 MB database moved up in two successfully completed tries. The first one took 40+ minutes, the second 80. I assume that SQL Azure responsiveness may vary, especially for trial accounts like the one I was using.
The migration tool uploads the structure and the data in a single package, which simplifies the process significantly.
You have to be aware of the differences between SQL Server 2008 R2 and SQL Azure. To name a few: DBCC CHECKIDENT is unsupported, which should probably make you review some of the stored procedures in your database. Note that the migration tool moves all stored procedures to the cloud regardless of whether they will continue to work properly.
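For example, a typical identity reseed like the following, often found in maintenance procedures, will fail on SQL Azure (the table name is hypothetical):
-- Reseeds the identity counter; DBCC CHECKIDENT is not supported
-- on SQL Azure, so any procedure containing it needs rework.
DBCC CHECKIDENT ('dbo.order_item', RESEED, 0)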
A less subtle difference is that there is no FILESTREAM in SQL Azure. My database initially contained some FILESTREAM data, and SQL Azure compatibility forced me to review and alter the structure, switching from FILESTREAM to varbinary(max).
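Since the FILESTREAM attribute cannot simply be altered away with ALTER COLUMN, the switch boils down to adding a plain varbinary(max) column, copying the data, and dropping the old column. A rough sketch (table and column names hypothetical):
-- Add a plain varbinary(max) column next to the FILESTREAM one.
ALTER TABLE dbo.document ADD content_new varbinary(max) NULL
-- Copy the binary data over.
UPDATE dbo.document SET content_new = content
-- Drop the FILESTREAM column and put the new one in its place.
ALTER TABLE dbo.document DROP COLUMN content
EXEC sp_rename 'dbo.document.content_new', 'content', 'COLUMN'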
The second thing I wanted to try was creating an empty SQL Azure database by running a script generated in SQL Server Management Studio. As I found, generating and applying such a script took minimal time and effort.
The ability to connect to a SQL Azure database in Management Studio is a really great thing. While the online SQL Azure database management tool (Silverlight) works quite decently, it is not yet quite on par.
You operate in a familiar environment, though some features are disabled and some might not work as they used to. For example, each of my attempts to open a stored procedure or a trigger for modification failed. Disabling a trigger also failed when invoked from the context menu, but succeeded when executed as a query.
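For the record, the query that did succeed is the standard one-liner (the trigger and table names are hypothetical):
-- Worked as a plain query where the context-menu command failed.
DISABLE TRIGGER trg_order_item_upd ON dbo.order_item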
Part of my present project is the conversion of data stored in a Visual FoxPro database to a SQL Server database. To transfer the data up to the cloud, I intended to use the same routines I had used to convert the data to the local SQL Server database.
First I modified the remote connection in the VFP database container. For both the local and the cloud connection I already had SQL Server data sources created. The following command creates a connection to my local SQL Server using the data source name.
CREATE CONNECTION MyDbConnection DATASOURCE "LocalSqlServer"
Another command creates a connection to the SQL Azure server.
CREATE CONNECTION MyDbConnection DATASOURCE "SqlAzure" USERID "[my Azure login]" PASSWORD "[my Azure password]"
Since both connections share the name "MyDbConnection", the VFP database really "sees no difference"; it has no awareness (and needs none) of what kind of database the target is. At any time, by issuing a CREATE CONNECTION command in the VFP command window, I can reconnect the VFP database to either the local or the cloud SQL server.
I also made small changes in my conversion routine, giving it a means of recognizing whether the target database is SQL Server or SQL Azure. That was accomplished by adding a remote view, the last column of which carries the required information: SERVERPROPERTY('EngineEdition') returns 5 for SQL Azure and 3 for my SQL Server 2008 R2.
CREATE SQL VIEW SqlServerProperty REMOTE AS ;
	SELECT ;
		CAST(SERVERPROPERTY('productversion') as varchar(50)) as product_version, ;
		CAST(SERVERPROPERTY('productlevel') as varchar(50)) as product_level, ;
		CAST(SERVERPROPERTY('edition') as varchar(100)) as edition, ;
		CAST(SERVERPROPERTY('EngineEdition') as int) as engine_edition
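A minimal sketch of how the conversion routine can branch on that view (the branch bodies are placeholders):
* Open the remote view and read the engine edition reported by the server.
USE SqlServerProperty IN 0 ALIAS srvprop
IF srvprop.engine_edition = 5
	* Target is SQL Azure
ELSE
	* Target is a local SQL Server
ENDIF
USE IN srvprop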
So far so good. I set the connection to SQL Azure and started the data conversion routine. Noticeably slower, it was running for some time without glitches, while I occasionally monitored the progress by launching short queries in Management Studio. Unfortunately, the routine then failed with a fairly non-descriptive message informing of a failed data connection: a "[DBNETLIB] ConnectionWrite (send())" connectivity error.
Partially that was caused by my designing the conversion routine with a local SQL Server in mind. Once the routine hits anything that may result in incompletely converted data, it simply throws an exception and exits, without trying to repeat the failed action.
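In hindsight, a cloud-aware version should retry transient connection failures instead of exiting at once. A hypothetical sketch of such a wrapper around SQLEXEC():
* Hypothetical retry wrapper: attempts a statement a few times
* before giving up, to survive transient cloud connection drops.
LPARAMETERS tnHandle, tcCommand
LOCAL lnTry, lnResult
lnResult = -1
FOR lnTry = 1 TO 3
	lnResult = SQLEXEC(tnHandle, tcCommand)
	IF lnResult > 0
		EXIT && statement succeeded
	ENDIF
	WAIT "" TIMEOUT 2 && short pause before the next attempt
ENDFOR
RETURN lnResult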
. . .
Based on my brief tests, I am now more inclined to have the VFP to SQL Server data conversion completed locally, sending the converted data up to the cloud with the SQL Azure Migration Wizard or a similar tool.