
Silverlight 3 beta 1 and Virtual Earth part 1 (GEO Data)

I recently did some playing with Silverlight 3 beta 1 and the Virtual Earth (VE) CTP. I wanted to map out the congressional districts for Colorado on the VE map. It was a little tricky, but not bad once I figured out a few things.

The first thing to figure out was where to get the geo mapping data to overlay on the VE CTP map. The place with a lot of this information is the U.S. Census web site (you can find all the congressional district data here). I decided to download and work with the shapefile (.shp). Once you have it downloaded, the next trick is figuring out how to get the data into SQL Server 2008. The tool I used was Shape2SQL from SharpGIS. It works great and is pretty simple to use. There is also a CodePlex SQL Server spatial tools project that may have some helpful tools.
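If you want a quick sanity check that the import worked before writing any service code, a small throwaway query does the job. The sketch below is not from the post; it assumes Shape2SQL created a [Districts] table with Name and GEOM columns (your table and column names may differ) and uses a placeholder connection string.

// Sanity-check sketch (not from the original post). Assumes Shape2SQL produced a
// [Districts] table with Name (nvarchar) and GEOM (geometry) columns.
using System;
using System.Data.SqlClient;

class ImportCheck
{
    static void Main()
    {
        string connString = ""; // your connection string here (placeholder)

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Name, [GEOM].STAsText() FROM [Districts]", conn))
        {
            conn.Open();
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    // Print each district name and the start of its well-known text.
                    string wkt = dr.GetString(1);
                    Console.WriteLine("{0}: {1}...",
                        dr.GetValue(0),
                        wkt.Substring(0, Math.Min(60, wkt.Length)));
                }
            }
        }
    }
}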

Now that we have our spatial data imported into SQL Server, we can start using it. The first thing to be aware of is that .NET 3.5 does not natively understand the SqlGeometry type (it seems silly, but it is true). To overcome this I simply added a reference to the Microsoft.SqlServer.Types assembly in my project. I then created a web service my Silverlight application can use to query for the spatial data it needs.

   1: [ServiceContract(Namespace = "")]
   2: [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
   3: public class SpatialTypeService
   4: {
   5:     [OperationContract]
   6:     public Dictionary<string, List<string>> GetCongDists()
   7:     {
   8:         string connString = ""; // your connection string here
   9:  
  10:         //Build the SQL statement to select each district's name and simplified geometry
  11:         StringBuilder sb = new StringBuilder();
  12:         sb.Append("SELECT Name, [GEOM].Reduce(.002) FROM [Districts]");
  13:  
  14:         //Connect to database   
  15:         SqlConnection SQLConn = new SqlConnection(connString);
  16:         SQLConn.Open();
  17:  
  18:         //Run the query and read the results
  19:         SqlCommand SQLCMD = new SqlCommand(sb.ToString(), SQLConn);
  20:         SqlDataReader dr = SQLCMD.ExecuteReader();
  21:  
  22:         Dictionary<string,List<string>> locations = new Dictionary<string,List<string>>();
  23:  
  24:         while (dr.Read())
  25:         {
  26:             List<string> location = new List<string>();
  27:             SqlGeometry geo = (SqlGeometry)dr.GetValue(1);
  28:             // Start at 1 because STPointN uses 1-based indexing (passing it zero is invalid)
  29:             for (int i = 1; i < geo.STNumPoints(); i++)
  30:             {
  31:                 location.Add(geo.STPointN(i).STY.ToString() + "," + geo.STPointN(i).STX.ToString());
  32:             }
  33:             locations.Add(dr.GetValue(0).ToString(),location);
  34:         }
  35:         dr.Close();
  36:         SQLConn.Close();
  37:  
  38:         return locations;
  39:     }
  40: }

Most of this code is just a standard WCF service. Let's look at the couple of key areas that deal with our GEO data. First, notice that at line 12 we have the SQL statement that selects the data.

  12:         sb.Append("SELECT Name, [GEOM].Reduce(.002) FROM [Districts]");

Here we are selecting the Name and GEOM data for all of the districts. Notice the “Reduce(.002)” call. The Reduce function is key to keeping the mapping performance fast: it simplifies each geometry so SQL returns far fewer points. Reduce takes a tolerance (with this latitude/longitude data, useful values sit between 0 and 1), and the larger the tolerance, the fewer mapping points come back. I used the SharpGIS SQL Spatial tools to quickly play with the number and see what the returned shape looks like (screenshots below). The key is to reduce as much as possible without changing the shape of what is returned. In my case .002 takes me from over a thousand points per district to around 100 points, which makes zooming in and out on the VE CTP perform A LOT better.
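To see the effect in numbers rather than in a viewer, something like the quick sketch below works. It is only a rough illustration, not part of the original service: it assumes the same [Districts] table, a placeholder connection string, and the Microsoft.SqlServer.Types reference mentioned above.

// Rough sketch (not from the original post) comparing point counts before and
// after Reduce(), using the same [Districts] table as the service above.
using System;
using System.Data.SqlClient;
using Microsoft.SqlServer.Types;

class ReduceComparison
{
    static void Main()
    {
        string connString = ""; // your connection string here (placeholder)

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Name, [GEOM], [GEOM].Reduce(.002) FROM [Districts]", conn))
        {
            conn.Open();
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    SqlGeometry full = (SqlGeometry)dr.GetValue(1);
                    SqlGeometry reduced = (SqlGeometry)dr.GetValue(2);
                    // How many points did the simplification remove?
                    Console.WriteLine("{0}: {1} -> {2} points",
                        dr.GetValue(0),
                        full.STNumPoints().Value,
                        reduced.STNumPoints().Value);
                }
            }
        }
    }
}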

Here is what my areas look like at .002:

[Screenshot: district shapes at Reduce(.002)]

Here is what it looks like at .1. You can see that the shapes have changed a lot, and in my case too much. Depending on what you are trying to do, just play with the Reduce tolerance until you get the point count as low as you can.

[Screenshot: district shapes at Reduce(.1)]
Now for the work we do to get all the values into a form our Silverlight app can use (Lines 24 – 34).

Dictionary<string,List<string>> locations = new Dictionary<string,List<string>>();
 
while (dr.Read())
{
    List<string> location = new List<string>();
    SqlGeometry geo = (SqlGeometry)dr.GetValue(1);
    // Start at 1 because STPointN uses 1-based indexing (passing it zero is invalid)
    for (int i = 1; i < geo.STNumPoints(); i++)
    {
        location.Add(geo.STPointN(i).STY.ToString() + "," + geo.STPointN(i).STX.ToString());
    }
    locations.Add(dr.GetValue(0).ToString(),location);
}

I created a dictionary to hold the name of each district along with a list of its points. Once I run my select I loop through the data reader and build a collection of points for each record. To do this I cast the GEOM column to a SqlGeometry. I can then ask STNumPoints() how many points the geometry has and use STPointN() to get the X and Y value of each one. I load each point into the list as a simple string in 'latitude,longitude' (STY,STX) format.
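On the consuming side each entry in the list is just a comma-separated pair, so pulling the numbers back out is trivial. This is a hypothetical snippet (the coordinate value is made up) rather than code from the actual Silverlight app:

// Hypothetical example of parsing one 'latitude,longitude' string from the service.
string point = "39.7392,-104.9903"; // made-up sample value near Denver
string[] parts = point.Split(',');
double latitude = double.Parse(parts[0]);
double longitude = double.Parse(parts[1]);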

This gets you to the point where your Silverlight application can consume the data. In my next post I will cover what the Silverlight application is doing.

Here is a screenshot of where we are going, though:

[Screenshot: congressional district overlays on the Virtual Earth map]
