Could I have ordered a negative number of pizzas at a corner store? For that I would probably have to bake the pizzas myself, bring them over to the store, and ask the proprietor to pay me: "Hey, here's some pizza for you guys! No delivery charges."
Yesterday a user discovered an unusually high number of items on the daily back-order report. When I looked into the issue, I found that all those items had negative ordered counts. I still have not figured out why that happened, but I am sure about one thing: had I created a constraint, today I would have one less thing to worry about.
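Whichever database holds the table, the idea is the same; with hypothetical table and column names, in T-SQL it is a one-liner:
-- forbid negative ordered quantities at the database level
ALTER TABLE order_item
    ADD CONSTRAINT CK_order_item_qty_ordered CHECK (qty_ordered >= 0);
With that in place, the bad data would have been rejected at insert time instead of surfacing on a report.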
Thursday 22 November 2012
Tuesday 20 November 2012
Another wheel invented, or creating Memo columns in cursors
In the SQL SELECT below, the last column is populated with the output of a function. Occasionally the length of that output exceeds 255 characters, which requires the column to be of the Memo data type.
SELECT;
product_id,;
cat.category_id,;
category_name,;
GetCategoryFullpath(product_id, cat.category_id);
FROM category cat;
INNER JOIN product_to_category ptc ON ptc.category_id = cat.category_id
Using PADR() to extend the GetCategoryFullpath() return value beyond 255 characters results in a "String is too long to fit" error. The same happens when PADR() is applied directly in the SQL SELECT.
In VFP 8 this obstacle seems to be impassable (or "unpassable", in modern English). VFP 9 added the CAST() function, which solves the task.
SELECT;
product_id,;
cat.category_id,;
category_name,;
CAST(GetCategoryFullpath(product_id, cat.category_id) as M) as category_fullpath;
FROM category cat;
INNER JOIN product_to_category ptc ON ptc.category_id = cat.category_id
Thanks to Naomi and Sergey on UniversalThread for pointing me to CAST(), I stopped short of wasting more time on reinventing the wheel.
Tuesday 13 November 2012
Moving SQL Server database to SQL Azure
The task was simple: a database -- the structure as well as the data -- had to be copied from a physical SQL Server up to the cloud, in this case SQL Azure.
The first tool I tested was the SQL Azure Migration Wizard. This is a very decent and handy tool, I must say. With just a few minor glitches, I had a 240 MB database moved up in two successfully completed tries; the first took 40+ minutes and the second 80 minutes. I assume that SQL Azure responsiveness may vary, especially for trial accounts like the one I was using.
The migration tool uploads the structure and the data in a single package, and simplifies the process significantly.
You have to be aware of the differences between SQL Server 2008 R2 and SQL Azure. To name a couple of them: DBCC CHECKIDENT is not supported, which should probably make you review some stored procedures in your database. Note that the migration tool moves all stored procedures to the cloud regardless of whether they will continue to work properly.
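A quick way to find candidates for review is to search the procedure definitions for the unsupported command; a minimal T-SQL sketch (the LIKE pattern is only an illustration):
-- list stored procedures whose definition mentions DBCC CHECKIDENT
SELECT name
FROM sys.procedures
WHERE OBJECT_DEFINITION(object_id) LIKE '%CHECKIDENT%';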
A less subtle difference is that SQL Azure has no FILESTREAM. My database initially contained some FILESTREAM data, so SQL Azure compatibility forced me to review and alter the structure, switching from FILESTREAM to plain varbinary(max).
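Sketched with hypothetical names, one common way to drop the FILESTREAM attribute is to copy the data into a plain varbinary(max) column and drop the original:
-- hypothetical table "document" with FILESTREAM column "content"
ALTER TABLE document ADD content_tmp varbinary(max) NULL;
UPDATE document SET content_tmp = content;
ALTER TABLE document DROP COLUMN content;
EXEC sp_rename 'document.content_tmp', 'content', 'COLUMN';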
The second thing I wanted to try was creating an empty SQL Azure database by running a script generated in SQL Server Management Studio. As I found, generating and applying such a script took minimal time and effort.
The ability to connect to a SQL Azure database from Management Studio is a really great thing. The online SQL Azure management portal (Silverlight) works quite decently, but it is not yet quite on par.
You operate in a familiar environment, though some features are disabled and some might not work as they used to. For example, every attempt of mine to open a stored procedure or a trigger for modification failed. Disabling a trigger also failed when invoked from the context menu, but succeeded when executed as a query.
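For reference, the query route is just the plain T-SQL statement (trigger and table names are placeholders):
DISABLE TRIGGER trg_my_trigger ON my_table;
-- and to turn it back on
ENABLE TRIGGER trg_my_trigger ON my_table;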
Part of my present project is converting data stored in a Visual FoxPro database to a SQL Server database. To transfer the data up to the cloud, I intended to use the same routines I had used to convert the data to the local SQL Server database.
First I modified the remote connection in the VFP database container. For both local and cloud connections I already had SQL Server data sources created. The following command creates a connection to my local SQL Server using the data source name.
CREATE CONNECTION MyDbConnection DATASOURCE "LocalSqlServer"
Another command creates a connection to the SQL Azure server.
CREATE CONNECTION MyDbConnection DATASOURCE "SqlAzure" USERID "[my Azure login]" PASSWORD "[my Azure password]"
Since both connections share the name "MyDbConnection", the VFP database really "sees no difference" -- it has no awareness (and needs none) of what kind of database the target is. At any time, by running the CREATE CONNECTION command in the VFP command window, I can repoint the VFP database to either the local or the cloud SQL server.
I also made small changes in my conversion routine, giving it a means of recognizing whether the target database is SQL Server or SQL Azure. That was accomplished by adding a remote view, the last column of which carries the required information (5 for Azure, 3 for SQL Server 2008 R2).
CREATE SQL VIEW SqlServerProperty REMOTE as SELECT cast(SERVERPROPERTY('productversion') as varchar(50)) as product_version, cast(SERVERPROPERTY('productlevel') as varchar(50)) as product_level, cast(SERVERPROPERTY('edition') as varchar(100)) as edition, cast(SERVERPROPERTY('EngineEdition') as int) as engine_edition
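A rough sketch of how the conversion routine can read that view; the variable name is arbitrary, and the check relies on the engine_edition values mentioned above:
USE SqlServerProperty IN 0   && the view queries through MyDbConnection
SELECT SqlServerProperty
llTargetIsAzure = (engine_edition = 5)   && 5 = SQL Azure, 3 = SQL Server 2008 R2 in my case
USE IN SqlServerProperty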
So far so good. I set the connection to SQL Azure and started the data conversion routine. Noticeably slower, it ran for some time without glitches while I occasionally monitored the progress by running short queries in Management Studio. Unfortunately, the routine then failed with a fairly non-descriptive message about a failed data connection -- "connectivity error dbnetlib connectionwrite send()".
Partially that was caused by the conversion routine having been designed with a local SQL Server in mind: once it hits anything that might result in incompletely converted data, it simply throws an exception and exits without retrying the failed action.
. . .
Based on my brief tests, I am now more inclined to have the VFP to SQL Server data conversion completed locally, and then send the converted data up to the cloud with the SQL Azure Migration Wizard or a similar tool.
Thursday 27 September 2012
Testing Telerik ZIPLibrary
The ZIP library I've been testing came with the Telerik WPF controls bundle. I assume that its interface and usage do not differ significantly, if at all, in the Telerik Silverlight, WinForms and ASP.NET libraries.
To start using the library, add a reference to Telerik.Windows.Zip.
Files can be compressed by calling either the Add or the AddStream method of the ZipPackage class. The second method allows choosing between compression algorithms.
private static void TestCompress()
{
using (var compressedFile = File.Create(@"c:\temp\documents.zip"))
{
using (var zipPackage = ZipPackage.Create(compressedFile))
{
//zipPackage.Add(
// Directory.GetFiles(@"c:\temp").Where(
// item => new[] { ".doc", ".docx" }.Contains(Path.GetExtension(item))));
foreach (var file in Directory.GetFiles(@"c:\temp").Where(
item => new[] { ".doc", ".docx" }.Contains(Path.GetExtension(item))))
{
zipPackage.AddStream(
File.OpenRead(file),
Path.GetFileName(file),
ZipCompression.Deflate64,
DateTime.Now);
}
}
}
}
File extraction is performed by accessing the ZipPackageEntries collection of the ZipPackage.
private static void TestDecompress()
{
const string SourceCompressedFile = @"C:\temp\documents.zip";
using (var sourceStream = File.OpenRead(SourceCompressedFile))
{
using (var zipPackage = ZipPackage.Open(sourceStream))
{
foreach (var zipPackageEntry in
zipPackage.ZipPackageEntries.OrderByDescending(item => item.FileNameInZip))
{
Console.WriteLine(zipPackageEntry.FileNameInZip);
using (var openInputStream = zipPackageEntry.OpenInputStream())
{
using (var outputStream =
File.Create(@"c:\temp\aa\" + zipPackageEntry.FileNameInZip))
{
openInputStream.CopyTo(outputStream);
}
}
}
}
}
}
The compression ratio is acceptable -- slightly below WinZip's level and almost on par with the built-in Windows compression.
Dec. 21, 2012: The following pair of methods shows how to work with strings. Note that Base64 conversion is used to handle zero bytes; because of that, do not expect any significant gain, or even any gain at all, on strings shorter than 1K.
private static string CompressString(string source)
{
using (var memoryStream =
new MemoryStream())
using (var outputStream =
new ZipOutputStream(memoryStream, ZipCompression.Default))
using (var writer =
new StreamWriter(outputStream))
{
writer.Write(source);
writer.Flush();
return Convert.ToBase64String(memoryStream.ToArray());
}
}
private static string DecompressString(string source)
{
using (var memoryStream =
new MemoryStream(Convert.FromBase64String(source)))
using (var inputStream =
new ZipInputStream(memoryStream))
using (var reader =
new StreamReader(inputStream, Encoding.UTF8))
{
return reader.ReadToEnd();
}
}
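A quick round-trip check of the two methods (just a usage sketch; the sample text is arbitrary):
var original = new string('x', 10000);   // long and repetitive, so it compresses well
var packed = CompressString(original);
Console.WriteLine("{0} -> {1} chars", original.Length, packed.Length);
Console.WriteLine(DecompressString(packed) == original);   // should print True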
Monday 10 September 2012
Creating SQL Server database backup using the Microsoft.SqlServer.Management.Smo
Assemblies to be referenced can be found in C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies. The "100" part in the path may need to be changed to reflect the SQL Server version in use.
Microsoft.SqlServer.ConnectionInfo
Microsoft.SqlServer.Management.Sdk.Sfc
Microsoft.SqlServer.Smo
Microsoft.SqlServer.SmoExtended
A backup can be created in either sync or async mode. Below is a basic console implementation built around the Backup class of the library.
using System;
using Microsoft.SqlServer.Management.Smo;
namespace SqlServerBackupStudy
{
public class Program
{
public static void Main(string[] args)
{
TestCreateBackup(
".",
"MyDatabase",
@"c:\backup\test_backup.bak");
Console.Write("\n\nAny key...");
Console.ReadKey();
}
private static void TestCreateBackup(
string serverName,
string databaseName,
string backupPath)
{
try
{
var server = new Server(serverName);
var backup = new Backup
{
Database = databaseName,
Initialize = true
};
backup.Devices.Add(new BackupDeviceItem(
backupPath,
DeviceType.File));
backup.SqlBackup(server);
}
catch (Exception ex)
{
Console.WriteLine(
"{0}\n{1}",
ex.GetType(),
ex.Message);
}
}
}
}
Microsoft.SqlServer.Management.Smo.FailedOperationException can be caused by a non-existent or unreachable server or database, as well as by an invalid backup path. My guess is that insufficient disk space would cause the same exception.
It is important to understand that, in the case of DeviceType.File, the destination path is a path on the machine that runs SQL Server. In other words, do not expect to create a backup of a remote database on your local drive. This may change eventually.
The database restore routine uses the Restore class and otherwise looks almost identical to the backup routine.
private static void TestRestoreFromBackup(
string serverName,
string databaseName,
string backupPath)
{
try
{
var server = new Server(serverName);
var restore = new Restore
{
Database = databaseName
};
restore.Devices.Add(new BackupDeviceItem(
backupPath,
DeviceType.File));
restore.SqlRestore(server);
}
catch (Exception ex)
{
Console.WriteLine(
"{0}\n{1}",
ex.GetType(),
ex.Message);
}
}
The exception may have one or more levels of inner exceptions. The lowest one is the most instructive. Here is an example with three levels:
- Restore failed for server [server]
- An exception occurred while executing a Transact-SQL statement or batch.
- Cannot open backup device '...'. Operating system error 3(The system cannot find the path specified.).\r\nRESTORE DATABASE is terminating abnormally.
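A small helper along these lines digs out that lowest-level message (just a sketch; the helper name is arbitrary):
private static string GetInnermostMessage(Exception ex)
{
    // walk the InnerException chain down to the most specific message
    while (ex.InnerException != null)
    {
        ex = ex.InnerException;
    }
    return ex.Message;
}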
One also has to ensure that the target database is not in use; otherwise the following exception pops up.
Exclusive access could not be obtained because the database is in use.\r\nRESTORE DATABASE is terminating abnormally.
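When dropping the existing connections is acceptable, SMO's Server.KillAllProcesses can clear them right before the restore; a rough sketch of the two lines to place in TestRestoreFromBackup (an assumption to verify, not part of the routine above):
// disconnect all sessions from the target database, then restore
server.KillAllProcesses(databaseName);
restore.SqlRestore(server);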
Tuesday 14 August 2012
Telerik WPF RadDatePicker, XML serialization, handling null values
To support blank values, an instance of RadDatePicker must be bound to a value of the DateTime? (nullable) type. Otherwise, wiping out the control's value does not change the bound value at all; even more, it displays a nasty red border around the control.
If the bound value is a member of a class intended to be serialized and deserialized, it should be attributed in a specific way.
a) Apparently a member of the DateTime? type cannot be stored in an XML attribute; only an XML element should be used for it. An example is below.
private DateTime? _startDate;
[XmlElementAttribute(
Form = System.Xml.Schema.XmlSchemaForm.Unqualified,
IsNullable = true)]
public DateTime? StartDate
{
get
{
return this._startDate;
}
set
{
this._startDate = value;
this.OnPropertyChanged("StartDate");
}
}
The IsNullable parameter has to be either omitted or set to True.
b) Optionally you may include a ShouldSerialize method, as below. Note that the method's name must be "ShouldSerialize" followed by the member's name.
public bool ShouldSerializeStartDate()
{
return this.StartDate.HasValue;
}
With such a method present, an element with a null value is completely omitted from the serialization output.
With no ShouldSerialize method implemented, such an element stays in the output but holds no value:
<SampleEntity>
<StartDate xsi:nil="true" />
<EndDate>2012-07-31T00:00:00</EndDate>
</SampleEntity>
I conclude this post with two generic serialization routines.
public static string SerializeInstance<T>(this T instance)
{
var serializer = new XmlSerializer(typeof(T));
var stringBuilder = new StringBuilder();
var settings = new XmlWriterSettings
{
ConformanceLevel = ConformanceLevel.Document,
OmitXmlDeclaration = true
};
using (var writer = XmlWriter.Create(stringBuilder, settings))
{
serializer.Serialize(writer, instance);
return stringBuilder.ToString();
}
}
public static T DeserializeInstance<T>(string content)
{
var serializer = new XmlSerializer(typeof(T));
using (var reader = new StringReader(content ?? string.Empty))
{
var instance = (T)serializer.Deserialize(reader);
return instance;
}
}
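A quick usage sketch, assuming a SampleEntity class like the one in the sample output above, with StartDate and EndDate properties:
var entity = new SampleEntity
{
    StartDate = null,
    EndDate = new DateTime(2012, 7, 31)
};
string xml = entity.SerializeInstance();   // extension method defined above
var restored = DeserializeInstance<SampleEntity>(xml);   // and back to an instance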
I didn't change anything and now it's not working!
That's an explanation any developer dealing with end users has very likely heard. In fact, most of the time it means: "the stupid thing is not working, I have changed something I'm not aware of, I cannot find a suitable person or entity to blame, fix it fast because I'm already two days late with my report!"
Very nice; today, then, I am such an end user. The data part of a .NET WinForms application throws an exception. One of the following postulates must surely be false:
- "this morning everything worked"
- "I swear I did not change anything"
To make it short: in app.config I had created the appSettings section preceding the connectionStrings section. It is not my first visit to this little trap.
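For illustration only -- the actual config file is not reproduced here -- keeping connectionStrings ahead of appSettings looks roughly like this:
<configuration>
  <connectionStrings>
    <add name="MyDb" connectionString="..." />
  </connectionStrings>
  <appSettings>
    <add key="SomeSetting" value="..." />
  </appSettings>
</configuration>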