Import/Export Business, Est. Rube Goldberg and Co.
I want to get local access to my blog data. The problem is that it lives somewhere on a server, and while I can probably open a connection to it, that is not something I want to deal with on a regular basis. So I decided to export the data locally and work on it from there.
After a few failed attempts, I came to the realization that trying to copy that much data over a WAN is probably not a good idea. So I started looking at exporting the data on the server itself, into a file I could then download using standard HTTP.
As it turns out, Subtext actually has this ability built in: it can push everything out as BlogML. Except that it didn't work in my case; it choked on some missing image, apparently. Realizing that I was on my own, I girded my loins and faced the inevitable: I would have to write code to do this.
It works like this:
<%@ Page Language="C#" %>
<%@ Import Namespace="System.Data.SqlClient" %>
<%@ Import Namespace="System.Data" %>
<%@ Import Namespace="System.IO" %>
<%@ Import Namespace="System.IO.Compression" %>
<%@ Import Namespace="System.Configuration" %>
<%
string commands = @"
SELECT * FROM Subtext_Config;
SELECT * FROM Subtext_Content;
SELECT * FROM Subtext_Feedback;";
string connectionString = ConfigurationManager.ConnectionStrings["SubtextData"].ConnectionString;

// Fill a DataSet with one table per SELECT statement
DataSet ds = new DataSet();
using (SqlConnection con = new SqlConnection(connectionString))
{
    SqlDataAdapter adapter = new SqlDataAdapter(commands, con);
    adapter.Fill(ds);
}
ds.DataSetName = "Export";
ds.Tables[0].TableName = "Subtext_Config";
ds.Tables[1].TableName = "Subtext_Content";
ds.Tables[2].TableName = "Subtext_Feedback";

// Serialize the DataSet as XML straight through a GZip stream
string fileName = MapPath("~/theFileName.zip");
using (FileStream fs = File.Create(fileName))
using (GZipStream zip = new GZipStream(fs, CompressionMode.Compress))
{
    ds.WriteXml(zip);
    zip.Flush();
    fs.Flush();
}
Response.Write("Successfully created file");
Response.End();
%>
That done, I had a ~6 MB file (~30 MB uncompressed) that I could play with. I decided to use SQL CE for the local database, and ran this piece of code:
static void Main()
{
    // Start from a clean SQL CE database, with the schema created by ActiveRecord
    File.Delete("Blog.sdf");
    string connectionString = ConfigurationManager.ConnectionStrings["Blog"].ConnectionString;
    using (SqlCeEngine engine = new SqlCeEngine(connectionString))
        engine.CreateDatabase();
    ActiveRecordStarter.Initialize();
    ActiveRecordStarter.CreateSchema();

    // Load the exported DataSet back from the gzipped XML
    DataSet ds = new DataSet();
    using (FileStream st = File.Open("theFileName.zip", FileMode.Open))
    using (GZipStream zip = new GZipStream(st, CompressionMode.Decompress))
        ds.ReadXml(zip);

    // Replay every row of every table as a parameterized INSERT
    using (SqlCeConnection connection = new SqlCeConnection(connectionString))
    {
        connection.Open();
        foreach (DataTable table in ds.Tables)
        {
            SqlCeCommand command = CreateCommand(table);
            command.Connection = connection;
            foreach (DataRow row in table.Rows)
            {
                for (int i = 0; i < row.ItemArray.Length; i++)
                {
                    command.Parameters[i].Value = row.ItemArray[i] ?? DBNull.Value;
                }
                command.ExecuteNonQuery();
            }
        }
    }

    // Sanity-check the resulting database
    using (SqlCeEngine engine = new SqlCeEngine(connectionString))
    {
        bool verified = engine.Verify();
        Console.WriteLine("Success: {0}", verified);
    }
}

private static SqlCeCommand CreateCommand(DataTable table)
{
    // Build "INSERT INTO table ([col], ...) VALUES (@col, ...)" from the table schema,
    // adding one parameter per column in column order
    SqlCeCommand insert = new SqlCeCommand();
    StringBuilder sb = new StringBuilder();
    sb.Append("INSERT INTO ").Append(table.TableName).Append(" (");
    foreach (DataColumn column in table.Columns)
    {
        sb.Append("[").Append(column.ColumnName).Append("], ");
    }
    sb.Remove(sb.Length - 2, 2);
    sb.Append(") VALUES (");
    foreach (DataColumn column in table.Columns)
    {
        sb.Append("@").Append(column.ColumnName).Append(", ");
        SqlCeParameter parameter = new SqlCeParameter();
        parameter.ParameterName = "@" + column.ColumnName;
        insert.Parameters.Add(parameter);
    }
    sb.Remove(sb.Length - 2, 2);
    sb.Append(");");
    insert.CommandText = sb.ToString();
    Console.WriteLine(insert.CommandText);
    return insert;
}
I actually tried to use a DbCommandAdapter here, but I quickly realized that this is way too complicated for me, and just rolled my own quick & dirty solution.
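For the record, the adapter route I abandoned would look something like the sketch below. This is untested and hypothetical; it assumes SqlCeCommandBuilder can derive the INSERT command from a plain SELECT, which requires every table to expose a primary key, and it reuses the `ds` and `connectionString` from the code above.

```csharp
// Hypothetical sketch of the DbCommandBuilder approach: let the
// command builder generate the INSERT command for each table, then
// push the imported rows through adapter.Update().
using (SqlCeConnection connection = new SqlCeConnection(connectionString))
{
    connection.Open();
    foreach (DataTable table in ds.Tables)
    {
        SqlCeDataAdapter adapter = new SqlCeDataAdapter(
            "SELECT * FROM [" + table.TableName + "]", connection);
        SqlCeCommandBuilder builder = new SqlCeCommandBuilder(adapter);
        adapter.InsertCommand = builder.GetInsertCommand();

        // Rows read via ReadXml are Unchanged; mark them Added
        // so that Update() issues INSERTs for them
        foreach (DataRow row in table.Rows)
            row.SetAdded();

        adapter.Update(table);
    }
}
```

In my case the hand-rolled INSERT builder was simply less to get wrong.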
Interestingly, the DB file is ~60 MB, but it compressed to ~6 MB as well. Not sure why the binary representation is bigger than the XML one.
This is throw-away code, most probably. I just need a reasonable data set for a sample. I think I ended up spending more time on getting the data than on the sample itself, though :-)