Monday, August 27, 2018
Often I find I retain my local Git branches long after their usefulness has expired. Deleting them one at a time is a hassle, which is probably the reason I don't clean them up more often.
Fortunately, someone found a better way. One line (OK, it's actually three commands piped together), and *poof*, all my local branches (EXCEPT master) are deleted.
Use this with caution, as you will lose any work in those branches that hasn't been merged to master yet.
git branch | grep -v "master" | xargs git branch -D
You can confirm the branches are gone by listing all your local branches:
$ git branch
* master
Wednesday, November 29, 2017
Microsoft has decided to separate the queue/topic send/receive functionality from the queue/topic management functionality. Some of these separations make sense, while others, like removing the ability to auto-provision new queues/topics, do not.
In any event, we can still create these objects using the service bus REST API, but it requires some special handling, especially for authorization.
The send/receive client library uses a connection string for authentication. This is great: it's easy to use and can be stored as a secret. No fuss, no muss. The REST API endpoints, however, require a SAS token for authorization. You would think there would be a provider to produce a SAS token for a resource, given the resource path and connection string. You would be wrong. Finding working samples of the token generation using .NET Core 2.x was surprisingly difficult. In any event, after (too) much researching, I've come up with this:
using System;
using System.Security.Cryptography;
using System.Text;
using System.Web;
using Microsoft.Extensions.Logging;

public interface ISasTokenCredentialProvider
{
    string GenerateTokenForResourceFromConnectionString(string resourcePath, string connectionString, TimeSpan? expiresIn = null);
}

public class SasTokenCredentialProvider : ISasTokenCredentialProvider
{
    private readonly ILogger<SasTokenCredentialProvider> logger;

    public SasTokenCredentialProvider(ILogger<SasTokenCredentialProvider> logger)
    {
        this.logger = logger;
    }

    public string GenerateTokenForResourceFromConnectionString(string resourcePath, string connectionString, TimeSpan? expiresIn = null)
    {
        if (string.IsNullOrEmpty(resourcePath))
        {
            throw new ArgumentNullException(nameof(resourcePath));
        }
        if (string.IsNullOrEmpty(connectionString))
        {
            throw new ArgumentNullException(nameof(connectionString));
        }
        // parse the connection string into useful parts
        var connectionInfo = new ServiceBusConnectionStringInfo(connectionString);
        // concatenate the service bus URI and resource path to form the full resource URI
        var fullResourceUri = new Uri(new Uri(connectionInfo.ServiceBusResourceUri), resourcePath);
        // ensure it's URL-encoded
        var fullEncodedResource = HttpUtility.UrlEncode(fullResourceUri.ToString());
        // default to a 10 minute token
        expiresIn = expiresIn ?? TimeSpan.FromMinutes(10);
        var expiry = this.ComputeExpiry(expiresIn.Value);
        // generate the signature hash
        var signature = this.GenerateSignedHash($"{fullEncodedResource}\n{expiry}", connectionInfo.KeyValue);
        // assemble the token
        var keyName = connectionInfo.KeyName;
        var token = $"SharedAccessSignature sr={fullEncodedResource}&sig={signature}&se={expiry}&skn={keyName}";
        this.logger.LogDebug($"Generated SAS token for resource: {resourcePath} service bus: {connectionInfo.ServiceBusResourceUri} token: {token}");
        return token;
    }

    private long ComputeExpiry(TimeSpan expiresIn)
    {
        return DateTimeOffset.UtcNow.Add(expiresIn).ToUnixTimeSeconds();
    }

    private string GenerateSignedHash(string text, string signingKey)
    {
        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(signingKey)))
        {
            return HttpUtility.UrlEncode(Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(text))));
        }
    }
}
And here's the code for parsing the connection string:
using System;
using System.Linq;

public class ServiceBusConnectionStringInfo
{
    private const string ServiceBusConnectionStringProtocol = "sb://";
    private const string ServiceBusUriSuffix = ".servicebus.windows.net/";

    public ServiceBusConnectionStringInfo(string connectionString)
    {
        this.ConnectionString = connectionString;
        this.ServiceBusNamespace = this.ParseNamespace(connectionString);
        this.ServiceBusResourceUri = this.ParseResourceUri(connectionString);
        this.KeyName = this.ParseKeyName(connectionString);
        this.KeyValue = this.ParseKeyValue(connectionString);
    }

    private string ParseResourceUri(string input)
    {
        var start = input.IndexOf(ServiceBusConnectionStringProtocol, StringComparison.InvariantCultureIgnoreCase) + ServiceBusConnectionStringProtocol.Length;
        var stop = input.IndexOf(ServiceBusUriSuffix, StringComparison.InvariantCultureIgnoreCase) + ServiceBusUriSuffix.Length;
        return $"https://{input.Substring(start, stop - start)}";
    }

    private string ParseNamespace(string input)
    {
        var start = input.IndexOf(ServiceBusConnectionStringProtocol, StringComparison.InvariantCultureIgnoreCase) + ServiceBusConnectionStringProtocol.Length;
        var stop = input.IndexOf(ServiceBusUriSuffix, StringComparison.InvariantCultureIgnoreCase);
        return input.Substring(start, stop - start);
    }

    private string ParseKeyName(string input)
    {
        return this.ParsePart(input, "SharedAccessKeyName");
    }

    private string ParseKeyValue(string input)
    {
        return this.ParsePart(input, "SharedAccessKey");
    }

    private string ParsePart(string input, string partName)
    {
        var parts = input.Split(';').Select(i => i.Trim()).ToArray();
        var keyPart = parts.FirstOrDefault(i => i.StartsWith(partName + "=", StringComparison.InvariantCultureIgnoreCase))
                      ?? parts.FirstOrDefault(i => i.StartsWith(partName + " =", StringComparison.InvariantCultureIgnoreCase));
        // ReSharper disable once UseNullPropagation
        if (keyPart != null)
        {
            var start = keyPart.IndexOf("=", StringComparison.InvariantCultureIgnoreCase) + 1;
            return keyPart.Substring(start).Trim();
        }
        return null;
    }

    public string ServiceBusResourceUri { get; }
    public string ServiceBusNamespace { get; }
    public string ConnectionString { get; }
    public string KeyName { get; }
    public string KeyValue { get; }
}
Generating a token is now trivial:
var token = sasTokenCredentialProvider.GenerateTokenForResourceFromConnectionString("MyQueue", <my sb connection string>);
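For comparison, and as a quick way to sanity-check tokens outside .NET, the same scheme (URL-encode the full resource URI, sign "<encoded-resource>\n<expiry>" with HMAC-SHA256 using the shared access key, and assemble the SharedAccessSignature header) can be sketched in Python. The namespace, key name, and key below are made-up placeholders, not real credentials:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, key_name, key, expires_in_seconds=600):
    # URL-encode the full resource URI, as the C# version does
    encoded = urllib.parse.quote_plus(resource_uri)
    expiry = int(time.time()) + expires_in_seconds
    # sign "<encoded-resource>\n<expiry>" with the shared access key
    digest = hmac.new(key.encode("utf-8"),
                      f"{encoded}\n{expiry}".encode("utf-8"),
                      hashlib.sha256).digest()
    # base64 the digest, then URL-encode it for the token
    signature = urllib.parse.quote_plus(base64.b64encode(digest).decode("utf-8"))
    return f"SharedAccessSignature sr={encoded}&sig={signature}&se={expiry}&skn={key_name}"

# placeholder namespace and key, not real credentials
token = generate_sas_token("https://mynamespace.servicebus.windows.net/MyQueue",
                           "RootManageSharedAccessKey", "not-a-real-key")
```

The structure of the result (sr, sig, se, skn) should match what the C# provider emits for the same inputs.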
Wednesday, November 22, 2017
I use Swagger to document my API endpoints. I like the descriptive nature, and find the Swagger UI to be a great place for quick testing and discovery.
The Swagger UI works great out of the box for unsecured API endpoints, but doesn't seem to have any built-in support for requiring users to supply an access token when an endpoint requires one.
Based on my research, it appears we can add an operation filter to inject the parameter into the Swagger UI. Using the code at https://github.com/domaindrivendev/Swashbuckle/issues/290 as a guide, I've ported the filter to .NET Core (2.0) as:
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc.Authorization;
using Swashbuckle.AspNetCore.Swagger;
using Swashbuckle.AspNetCore.SwaggerGen;

/// <summary>
/// This swagger operation filter
/// inspects the filter descriptors to look for authorization filters
/// and if found, will add a non-body operation parameter that
/// requires the user to provide an access token when invoking the api endpoints
/// </summary>
public class AddAuthorizationHeaderParameterOperationFilter : IOperationFilter
{
    #region Implementation of IOperationFilter

    /// <summary>
    /// Adds an Authorization header parameter to operations that require authorization.
    /// </summary>
    /// <param name="operation">the operation being documented</param>
    /// <param name="context">the filter context</param>
    public void Apply(Operation operation, OperationFilterContext context)
    {
        var descriptor = context.ApiDescription.ActionDescriptor;
        var isAuthorized = descriptor.FilterDescriptors
            .Any(i => i.Filter is AuthorizeFilter);
        var allowAnonymous = descriptor.FilterDescriptors
            .Any(i => i.Filter is AllowAnonymousFilter);
        if (isAuthorized && !allowAnonymous)
        {
            if (operation.Parameters == null)
            {
                operation.Parameters = new List<IParameter>();
            }
            operation.Parameters.Add(new NonBodyParameter
            {
                Name = "Authorization",
                In = "header",
                Description = "access token",
                Required = true,
                Type = "string"
            });
        }
    }

    #endregion
}
and add it to the Swagger middleware:
services.AddSwaggerGen(c =>
{
    …
    c.OperationFilter<AddAuthorizationHeaderParameterOperationFilter>();
});
That's it! Now when an endpoint requires an access token, the Swagger UI will render a parameter for it:

Thursday, October 26, 2017
You can configure .NET Core to automatically push your NuGet package to the package server of your choice by adding a Target to your project file.
1) If your package server requires an API key, you can set it by calling:
nuget.exe SetApiKey <YourKey>
2) Add the following target to your csproj file. This sample is configured to only fire on Release builds.
<Target Name="PushTarget" AfterTargets="Pack" Condition=" '$(Configuration)' == 'Release'">
  <Message Importance="High" Text="This is a test After Build Target-->$(TargetPath)" />
  <GetAssemblyIdentity AssemblyFiles="$(TargetPath)">
    <Output TaskParameter="Assemblies" ItemName="AssemblyVersion" />
  </GetAssemblyIdentity>
  <Exec Command="dotnet nuget push $(TargetDir)..\$(TargetName).$(AssemblyVersion).nupkg -s https://www.nuget.org/api/v2/package" />
</Target>
OR
Here’s a version that will ensure releases with a .0 revision number are properly pushed.
<Target Name="PushPackageTarget" AfterTargets="Pack" Condition=" '$(Configuration)' == 'Release'">
  <GetAssemblyIdentity AssemblyFiles="$(TargetPath)">
    <Output TaskParameter="Assemblies" ItemName="AssemblyVersion" />
  </GetAssemblyIdentity>
  <PropertyGroup>
    <vMajor>$([System.Version]::Parse(%(AssemblyVersion.Version)).Major)</vMajor>
    <vMinor>$([System.Version]::Parse(%(AssemblyVersion.Version)).Minor)</vMinor>
    <vBuild>$([System.Version]::Parse(%(AssemblyVersion.Version)).Build)</vBuild>
    <vRevision>$([System.Version]::Parse(%(AssemblyVersion.Version)).Revision)</vRevision>
  </PropertyGroup>
  <Message Importance="High" Text="Property Group MajorVersion: $(vMajor).$(vMinor).$(vBuild)" />
  <Exec Command="dotnet nuget push $(TargetDir)..\$(TargetName).$(vMajor).$(vMinor).$(vBuild).nupkg -s https://www.nuget.org/api/v2/package" />
</Target>
Friday, October 7, 2016
A simple one-liner to stop all the containers and then remove them:
docker rm -f $(docker ps -a -q)
And similar for all images:
docker rmi $(docker images -q)
Wednesday, September 28, 2016
Sometimes Visual Studio updates don't go as planned. I recently tried applying SP3 to my VS 2015 install, only to have it die. Attempts to remove Visual Studio resulted in multiple failures; even when VS appeared to uninstall correctly, attempts to reinstall failed.
As I made preparations for a complete rebuild of my development PC, a friend of mine sent me a link to the Visual Studio Uninstaller. Written by the folks at Microsoft, it is the scorched-earth, last-ditch option for uninstalling Visual Studio.
It took a while to run, and then to re-install Visual Studio (with the SP), but it was successful in solving my problem and saved me from the dreaded "machine rebuild".
Here's the link:
https://github.com/Microsoft/VisualStudioUninstaller/releases
Monday, July 25, 2016
Putting it here because I can never remember it: remove all dangling Docker images.
docker rmi -f $(docker images -f "dangling=true" -q)
Thursday, June 25, 2015
The following query returns all the column information for columns in the specified table that do not participate in a FK relationship.
You can modify the query to return PK information by changing the constraint_type filter.
select * from information_schema.columns
where table_name = <TableName>
  and table_schema = <Schema>
  and column_name not in
  (SELECT Col.Column_Name
   FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS Tab,
        INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE Col
   WHERE Col.Constraint_Name = Tab.Constraint_Name
     AND Col.Table_Name = Tab.Table_Name
     AND Constraint_Type = 'FOREIGN KEY'
     AND Col.Table_Name = <TableName>)
Wednesday, July 30, 2014
Generate a lower-case GUID from the VB immediate window:
? lcase(Mid$(CreateObject("Scriptlet.TypeLib").GUID, 2, 36))
Wednesday, July 2, 2014
I'm a big fan of the attribute routing in WebApi 2. It's much more descriptive than using an abstracted routing table defined in my WebApi config startup. Recently I had an issue where my controller endpoints were not getting invoked. Instead, the client requests were returned with a 404 response.
I had this defined in my controller:
[RoutePrefix("api/auth/applications")]
public class ApplicationController : BusinessServicesControllerBase
{
    /// <summary>
    /// The method servicing the HTTP GET verb returning a <see cref="Models.Auth.Application"/>.
    /// </summary>
    /// <returns><see cref="Models.Auth.Application"/></returns>
    [Route("{id}")]
    [ResponseType(typeof(Models.Auth.Application))]
    public IHttpActionResult Get(int id)
    {
        // code
    }
}
No matter the route address or prefix, my routes were not getting hit. UNTIL…
Apparently there was a name conflict with my controllers. I actually have two different controllers named ApplicationController, in two different namespaces. But the routing table doesn't include the namespace when evaluating routes. Having duplicate controller names prevented the request from being routed to a single controller, so ASP.NET returned a 404 error.
The answer was simple: I renamed the controller from ApplicationController to ApplicationAuthController (a unique controller name) and my problem was solved.
Wednesday, April 16, 2014
Reinstall all packages in ALL PROJECTS of the current solution:
Update-Package -Reinstall
Reinstall all packages in a SPECIFIC PROJECT of the current solution:
Update-Package -ProjectName 'ProjectNameGoesHere' -Reinstall
Thursday, February 6, 2014
I love ReSharper's Find In Solution Explorer command. It locates the currently active code file in the solution explorer. Using my keyboard layout I can activate it with Shift+Alt+L, but most times I prefer to select it from the context menu of my code window tab, you know, this thing:

Here’s how I added it:
- From the Tools menu select Customize
- On the Customize dialog click the Commands tab
- Select the Context Menu Radio button
- In the Context Menu drop down, scroll to the bottom and locate Other Context Menus | Easy MDI Document Window
- Scroll to the bottom of the list and click the Add Command button
- In the Add Command dialog, select ReSharper from the Categories list.
- From the Commands list, select Resharper_LocateInSolutionExplorer

- Click Ok.
- Click the Move Up / Move Down buttons to place the command where you want it within the context menu.
- Optional – If you want a separator line above the command, with the command selected click the Modify Selection button and check Begin a Group
- Click the Close button.
That’s it! Now your active source file is one click away!
Saturday, February 1, 2014
Assuming you have a data context with a property of IDbSet<Person>, and Person has a property called Id, you can issue:
var person = context.PersonDbSet.FirstOrDefault(i => i.Id == 3);
Here's how you do it with reflection, assuming you have a reference to your data set, called dbSet.
Begin by defining the predicate (the where clause):
ParameterExpression parameter = Expression.Parameter(entityType, "i");
MemberExpression property = Expression.Property(parameter, "Id");
ConstantExpression rightSide = Expression.Constant(refId);
BinaryExpression operation = Expression.Equal(property, rightSide);
Type delegateType = typeof(Func<,>).MakeGenericType(entityType, typeof(bool));
LambdaExpression predicate = Expression.Lambda(delegateType, operation, parameter);
Now get a reference to the FirstOrDefault extension method:
var method = typeof(System.Linq.Queryable).GetMethods(BindingFlags.Static | BindingFlags.Public)
    .FirstOrDefault(m => m.Name == "FirstOrDefault" && m.GetParameters().Count() == 2);
MethodInfo genericMethod = method.MakeGenericMethod(entityType);
Finally execute the method:
object retVal = genericMethod.Invoke(null, new object[] {dbSet, predicate});
Tuesday, January 22, 2013
USE [CatalogWithMessageBroker]
GO
-- message count and receive status for each Service Broker queue
select q.name as QueueName, p.rows as MsgCount, case sq.is_receive_enabled when 0 then 'Disabled' else 'Enabled' end as QueueEnabled
from sys.objects as o
join sys.partitions as p on p.object_id = o.object_id
join sys.objects as q on o.parent_object_id = q.object_id
join sys.service_queues sq on sq.name = q.name
where p.index_id = 1
Friday, December 7, 2012
While troubleshooting a blocked transaction issue recently, I found this code online. My apologies for not citing its source, but it's lost in my browser history somewhere.
While the transaction is executing and blocked, open a connection to the database containing the transaction and run the following to return both the SQL statement being blocked (the Victim), as well as the statement that's causing the block (the Culprit):
-- prepare a table so that we can filter the sp_who2 results
DECLARE @who TABLE(BlockedId INT,
Status VARCHAR(MAX),
LOGIN VARCHAR(MAX),
HostName VARCHAR(MAX),
BlockedById VARCHAR(MAX),
DBName VARCHAR(MAX),
Command VARCHAR(MAX),
CPUTime INT,
DiskIO INT,
LastBatch VARCHAR(MAX),
ProgramName VARCHAR(MAX),
SPID_1 INT,
REQUESTID INT)
INSERT INTO @who EXEC sp_who2
--select the blocked and blocking queries (if any) as SQL text
SELECT
(
SELECT TEXT
FROM sys.dm_exec_sql_text(
(SELECT handle
FROM (
SELECT CAST(sql_handle AS VARBINARY(128)) AS handle
FROM sys.sysprocesses WHERE spid = BlockedId
) query)
)
) AS 'Blocked Query (Victim)',
(
SELECT TEXT
FROM sys.dm_exec_sql_text(
(SELECT handle
FROM (
SELECT CAST(sql_handle AS VARBINARY(128)) AS handle
FROM sys.sysprocesses WHERE spid = BlockedById
) query)
)
) AS 'Blocking Query (Culprit)'
FROM @who
WHERE BlockedById != ' .'
Monday, December 3, 2012
SELECT 'Checking Broker Service Status...'
IF (select Top 1 is_broker_enabled from sys.databases where name = 'NWMESSAGE')=1
SELECT ' Broker Service IS Enabled' -- Should return a 1.
ELSE
SELECT '** Broker Service IS DISABLED ***'
/* If Is_Broker_enabled returns 0, uncomment and run this code
ALTER DATABASE NWMESSAGE SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
Alter Database NWMESSAGE Set enable_broker
GO
ALTER DATABASE NWMESSAGE SET MULTI_USER
GO
*/
SELECT 'Checking For Disabled Queues....'
-- ensure the queues are enabled
-- 0 indicates the queue is disabled.
Select '** Receive Queue Disabled: '+name from sys.service_queues where is_receive_enabled = 0
--select [name], is_receive_enabled from sys.service_queues;
/*If the queue is disabled, to enable it
alter queue QUEUENAME with status=on; – replace QUEUENAME with the name of your queue
*/
-- Get General information about the queues
--select * from sys.service_queues
-- Get the message counts in each queue
SELECT 'Checking Message Count for each Queue...'
select q.name, p.rows
from sys.objects as o
join sys.partitions as p on p.object_id = o.object_id
join sys.objects as q on o.parent_object_id = q.object_id
join sys.service_queues sq on sq.name = q.name
where p.index_id = 1
-- Ensure all the queue activation sprocs are present
SELECT 'Checking for Activation Stored Procedures....'
SELECT '** Missing Procedure: '+q.name
From sys.service_queues q
Where NOT Exists(Select * from sysobjects where xtype='p' and name='activation_'+q.name)
and q.activation_procedure is not null
DECLARE @sprocs Table (Name Varchar(2000))
Insert into @sprocs Values ('Echo')
Insert into @sprocs Values ('HTTP_POST')
Insert into @sprocs Values ('InitializeRecipients')
Insert into @sprocs Values ('sp_EnableRecipient')
Insert into @sprocs Values ('sp_ProcessReceivedMessage')
Insert into @sprocs Values ('sp_SendXmlMessage')
SELECT 'Checking for required stored procedures...'
SELECT '** Missing Procedure: '+s.name
From @sprocs s
Where NOT Exists(Select * from sysobjects where xtype='p' and name=s.name)
GO
-- Check the services
Select 'Checking Recipient Message Services...'
Select '** Missing Message Service:' + r.RecipientName +'MessageService'
From Recipient r
Where not exists (Select * from sys.services s where s.name COLLATE SQL_Latin1_General_CP1_CI_AS= r.RecipientName+'MessageService')
DECLARE @svcs Table (Name Varchar(2000))
Insert into @svcs Values ('XmlMessageSendingService')
SELECT '** Missing Service: '+s.name
From @svcs s
Where NOT Exists(Select * from sys.services where name=s.name COLLATE SQL_Latin1_General_CP1_CI_AS)
GO
/*** To Test a message send Run:
sp_SendXmlMessage 'TSQLTEST', 'CommerceEngine','<Root><Text>Test</Text></Root>'
*/
Select CAST(message_body as XML) as xml, * From XmlMessageSendingQueue
/*** clean out all queues
declare @handle uniqueidentifier
declare conv cursor for
select conversation_handle from sys.conversation_endpoints
open conv
fetch next from conv into @handle
while @@FETCH_STATUS = 0
Begin
END Conversation @handle with cleanup
fetch next from conv into @handle
End
close conv
deallocate conv
**********************/
Wednesday, September 26, 2012
We recently underwent an upgrade that required us to change our database columns from varchar to nvarchar, to support Unicode characters.
Digging through the internet, I found a base script, which I modified to handle reserved-word table names and maintain the NULL/NOT NULL constraint of the columns.
I ran this script:
use NWOperationalContent -- your catalog name here
GO
SELECT 'ALTER TABLE ' + isnull(schema_name(syo.id), 'dbo') + '.[' + syo.name + ']'
     + ' ALTER COLUMN [' + syc.name + '] NVARCHAR('
     + case syc.length when -1 then 'MAX' else convert(nvarchar(10), syc.length) end + ') '
     + case syc.isnullable when 1 then ' NULL' else ' NOT NULL' end + ';'
FROM sysobjects syo
JOIN syscolumns syc ON
syc.id = syo.id
JOIN systypes syt ON
syt.xtype = syc.xtype
WHERE
syt.name = 'varchar'
and syo.xtype='U'
which produced a series of ALTER statements which I could then execute against the tables. In some cases I had to drop indexes, alter the tables, and re-create the indexes. There might have been a better way to do that, but manually dropping them got the job done.
use NWMerchandisingContent
GO
ALTER TABLE Locale Drop Constraint PK_Locale
ALTER TABLE Country DROP CONSTRAINT PK_Country
GO
ALTER TABLE dbo.[Campaign] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[BundleLocalization] ALTER COLUMN [Locale] NVARCHAR(8) NOT NULL;
ALTER TABLE dbo.[BundleLocalization] ALTER COLUMN [UnitOfmeasure] NVARCHAR(200) NULL;
ALTER TABLE dbo.[BundleLocalization] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[BundleComponentLocalization] ALTER COLUMN [Locale] NVARCHAR(8) NOT NULL;
ALTER TABLE dbo.[BundleComponentLocalization] ALTER COLUMN [Imperative] NVARCHAR(MAX) NULL;
ALTER TABLE dbo.[BundleComponentLocalization] ALTER COLUMN [Instructions] NVARCHAR(MAX) NULL;
ALTER TABLE dbo.[BundleComponentLocalization] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[BundleComponent] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[Bundle] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[Banner] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[Video] ALTER COLUMN [Link] NVARCHAR(512) NOT NULL;
ALTER TABLE dbo.[Video] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[ProductUsage] ALTER COLUMN [VideoLink] NVARCHAR(512) NOT NULL;
ALTER TABLE dbo.[ProductUsage] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[Thumbnail] ALTER COLUMN [ActorKey] NVARCHAR(200) NOT NULL;
ALTER TABLE dbo.[SkuLocalization] ALTER COLUMN [Locale] NVARCHAR(8) NOT NULL;
ALTER TABLE dbo.[SkuLocalization] ALTER COLUMN [UnitOfMeasure] NVARCHAR(150) NOT NULL;
ALTER TABLE dbo.[SkuLocalization] ALTER COLUMN [SwatchColor] NVARCHAR(50) NOT NULL;
etc..
GO
ALTER TABLE Locale ADD CONSTRAINT PK_Locale PRIMARY KEY (LocaleId)
ALTER TABLE Country ADD CONSTRAINT PK_Country PRIMARY KEY (CountryId)
Note that this alter is non-destructive to the data.
Hope this helps.
Thursday, September 6, 2012
We use public static methods decorated with [WebMethod] to support our Ajax postbacks.
Recently, I received an error from a UI developer stating he was receiving the following error when attempting his postback:
{
"Message": "Operation is not valid due to the current state of the object.",
"StackTrace": " at System.Web.Script.Serialization.ObjectConverter.ConvertDictionaryToObject(IDictionary`2 dictionary, Type type, JavaScriptSerializer serializer, Boolean throwOnError, Object& convertedObject)\r\n at System.Web.Script.Serialization.ObjectConverter.ConvertObjectToTypeInternal(Object o, Type type, JavaScriptSerializer serializer, Boolean throwOnError, Object& convertedObject)\r\n at System.Web.Script.Serialization.ObjectConverter.ConvertObjectToTypeMain(Object o, Type type, JavaScriptSerializer serializer, Boolean throwOnError, Object& convertedObject)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeInternal(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeDictionary(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeInternal(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeDictionary(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.DeserializeInternal(Int32 depth)\r\n at System.Web.Script.Serialization.JavaScriptObjectDeserializer.BasicDeserialize(String input, Int32 depthLimit, JavaScriptSerializer serializer)\r\n at System.Web.Script.Serialization.JavaScriptSerializer.Deserialize(JavaScriptSerializer serializer, String input, Type type, Int32 depthLimit)\r\n at System.Web.Script.Serialization.JavaScriptSerializer.Deserialize[T](String input)\r\n at System.Web.Script.Services.RestHandler.GetRawParamsFromPostRequest(HttpContext context, JavaScriptSerializer serializer)\r\n at System.Web.Script.Services.RestHandler.GetRawParams(WebServiceMethodData methodData, HttpContext context)\r\n at System.Web.Script.Services.RestHandler.ExecuteWebServiceCall(HttpContext context, WebServiceMethodData methodData)",
"ExceptionType": "System.InvalidOperationException"
}
Googling this error brought me little support. All the results talked about increasing the aspnet:MaxJsonDeserializerMembers value to handle larger payloads. Since 1) I'm not using the ASP.NET AJAX model and 2) the payload is very small, this clearly was not the cause of my issue.
Here’s the payload the UI developer was sending to the endpoint:
{
"FundingSource": {
"__type": "XX.YY.Engine.Contract.Funding.EvidenceBasedFundingSource, XX.YY.Engine.Contract",
"MeansType": 13,
"FundingMethodName": "LegalTender",
},
"AddToProfile": false,
"ProfileNickName": "",
"FundingAmount": 0
}
By tweaking the JSON, I found the culprit.
Apparently the default JavaScriptSerializer doesn't like the assembly name in the __type value. Removing the assembly portion of the type name resolved my issue.
{
"FundingSource": {
"__type": "XX.YY.Engine.Contract.Funding.EvidenceBasedFundingSource",
"MeansType": 13,
"FundingMethodName": "LegalTender",
},
"AddToProfile": false,
"ProfileNickName": "",
"FundingAmount": 0
}
Thursday, April 26, 2012
The following query will find all tables in my catalog with a column name like 'city':
SELECT t.name AS table_name,
SCHEMA_NAME(schema_id) AS schema_name,
c.name AS column_name
FROM sys.tables AS t
INNER JOIN sys.columns c ON t.OBJECT_ID = c.OBJECT_ID
WHERE c.name LIKE '%City%'
ORDER BY schema_name, table_name;
Thursday, March 1, 2012
Did you know you can set your DNS servers from the command prompt using the NETSH command (available in Win 2003 and later)?
You can even specify the order in which the servers are registered for DNS lookup:
netsh interface ip set dns "Local Area Connection" static 8.8.8.8
netsh interface ip add dnsservers "Local Area Connection" 8.8.4.4 index=2
netsh interface ip add dnsservers "Local Area Connection" 10.0.8.100 index=3
If you want to configure DNS to use DHCP:
netsh interface ip set dns "Local Area Connection" dhcp
If you want to clear the table entirely:
netsh interface ip set dns "Local Area Connection" address=none
You'll need to replace "Local Area Connection" with your LAN connection name, should it differ.
Wednesday, February 29, 2012
Getting a summary aggregation of rows in T-SQL is easy thanks to the Sum operator:
Select Sum(Qty) From Table
Why is there no Product() aggregation operation in T-SQL? Sometimes I want the values multiplied, not added.
Luckily, someone who is much smarter at math than I observed:
log(A * B) = log(A) + log(B)
So summing the logs, and converting back via the exponential, yields the product.
Select CAST(EXP(SUM(LOG(Qty))) as int) as ExtendedQTY
Happy Calculating!
UPDATE: The above expression seems to calculate the wrong value when the value being multiplied is a large number. For example:
Select CAST(EXP(LOG(11111)) as int) yields 11110, not 11111.
Try this instead:
DECLARE @ExtendedQty FLOAT
Select @ExtendedQTY = COALESCE(@ExtendedQTY, 1) * Table.Qty From Table
Select @ExtendedQty
As always, your feedback is welcome.
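The rounding problem above is easy to reproduce outside SQL Server. Here is a small Python sketch (with made-up quantities) showing that the EXP(SUM(LOG())) round trip is only approximate in floating point, which is why truncating with CAST(... as int) can come up one short, while rounding to the nearest integer is safe for modest products:

```python
import math
from functools import reduce

qtys = [3, 7, 11111]
exact = reduce(lambda a, b: a * b, qtys)              # exact integer product
via_logs = math.exp(sum(math.log(q) for q in qtys))   # the EXP(SUM(LOG())) trick

# The round trip through log/exp is only approximate, so the float can land
# just below the true integer; truncating (like CAST(... as int)) may then
# lose 1, while rounding to the nearest integer recovers the product.
print(exact, via_logs, round(via_logs))
```

The same reasoning suggests wrapping the SQL expression in ROUND() before the CAST, rather than truncating.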
Saturday, January 28, 2012
If your connection hangs while attempting to start the SQL Server broker service, it's likely caused by the system trying to gain exclusive access to your database. Some people recommend stopping and restarting the SQL Server instance. I find that a little heavy-handed, like swatting a fly with a sledgehammer. Instead, switch the database into single-user mode, enable the broker service, and restore the database to multi-user mode.
1) Set the database to single user mode:
ALTER DATABASE [DBNAME] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
2) Enable Broker Service on the database
ALTER DATABASE [DBNAME] SET ENABLE_BROKER;
3) Restore the database to multi-user mode
ALTER DATABASE [DBNAME] SET MULTI_USER
Of course you’ll need proper permissions, but enabling the service this way prevents interruption to any other databases running on your server.
Also make sure the broker service is enabled:
SELECT is_broker_enabled FROM sys.databases WHERE name = 'DBNAME';
It should return 1 if it's enabled.
-- Enable Service Broker:
ALTER DATABASE [DBNAME] SET ENABLE_BROKER;
-- Disable Service Broker:
ALTER DATABASE [DBNAME] SET DISABLE_BROKER;
More useful SSBS Queries
If your activation stored procedure isn't firing, it might be because the queue is disabled. Check the status of the queues with:
select [name], is_receive_enabled from sys.service_queues; -- 0 indicates the queue is disabled.
To enable the queue:
alter queue QUEUENAME with status=on; -- replace QUEUENAME with the name of your queue
Here's a link to a great site with lots of useful broker service queries:
http://myadventuresincoding.wordpress.com/2007/11/22/sql-server-service-broker-tips-and-tricks/
Wednesday, October 26, 2011
Tired of ISP DNS service errors? Switch to Google's. They are FAST and ALWAYS available.
Primary: 8.8.8.8
Secondary: 8.8.4.4
var daysOfWeek = new[] { "Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday" };
var workDays = daysOfWeek.Except(new[] { "SUNDAY", "SaTURdaY" }); // case-sensitive comparison; nothing matches, so all seven days are returned
workDays = daysOfWeek.Except(new[] { "SUNDAY", "SaTURdaY" }, StringComparer.OrdinalIgnoreCase); // case-insensitive comparison yields Monday-Friday
The Except operator takes a comparer that tells it how to evaluate the two lists. Nice one!
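The same idea, sketched in Python for comparison: `casefold()` plays the role of StringComparer.OrdinalIgnoreCase by normalizing both sides before the membership test:

```python
days_of_week = ["Sunday", "Monday", "Tuesday", "Wednesday",
                "Thursday", "Friday", "Saturday"]
weekend = ["SUNDAY", "SaTURdaY"]

# case-sensitive difference: no element of weekend matches, all seven days survive
case_sensitive = [d for d in days_of_week if d not in weekend]

# case-insensitive difference: normalize both sides before comparing
skip = {w.casefold() for w in weekend}
work_days = [d for d in days_of_week if d.casefold() not in skip]
```

As with the LINQ version, only the normalized comparison actually filters out the weekend entries.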
Thursday, June 30, 2011
Sometimes working with the JS serializer is easy, sometimes it's not. When I attempt to serialize an object that is derived from a base, the serializer decides whether or not to include the type name.
When it's present, the type name is represented by a __type attribute in the serialized JSON, like this:
{"d":{"__type":"Commerce.Integration.Surfaces.OrderCreationRequest"
,"RepId":0}}
The missing type name is a problem if I intend to ship the object back into a web method that needs to deserialize it. Without the type name, deserialization will fail and result in an ugly web exception.
The solution, which feels more like a work-around, is to explicitly tell the serializer to ALWAYS generate the type name for each derived type. You make this declaration by adding a [GenerateScriptType(typeof(...))] attribute for each derived type to the top of the web page's code-behind.
For example, assuming I had 3 derivations of OrderCreationRequest (PersonalOrderCreationRequest, CompanyOrderCreationRequest, InternalOrderCreationRequest), the code-behind for my web page would be decorated as follows:
[GenerateScriptType(typeof(PersonalOrderCreationRequest))]
[GenerateScriptType(typeof(CompanyOrderCreationRequest))]
[GenerateScriptType(typeof(InternalOrderCreationRequest))]
public partial class OrderMethods : Page
{
...
}
With the type names generated in the serialized JSON, the serializer can successfully deserialize instances of any of these types passed into a web method.
Hope this helps you as much as it did me.