
DRNJ

Light at the end of the Technology Tunnel


Category: Uncategorized

AutoMapper and “Could not load type ‘SqlGuidCaster'” Error

April 7, 2025

Background

I had a VS 2022 solution utilising .NET Core 3. I wanted to upgrade it to .NET 9, so I “upgraded” the projects within the solution and updated the NuGet packages to their corresponding “9” versions. So far so good.

I ran up the WPF UI – this worked correctly: entities were read from SQL Server via Entity Framework and all was good and sunny.

Then I ran my unit tests. All good.

Then I ran my integration tests. They failed! To be fair, they are fairly noddy tests, but at least they test reading and writing entities to the database to sanity-check my code. So I had a problem.

The Problem

The error returned from the tests (which use xUnit) was:

System.Reflection.ReflectionTypeLoadException : Unable to load one or more of the requested types.
        Could not load type 'SqlGuidCaster' from assembly 'Microsoft.Data.SqlClient, Version=5.0.0.0,

Hmm. I checked my solution. I don’t reference Microsoft.Data.SqlClient. So what was going on?

The Solution

The errant line of code throwing the exception was some initialisation code for AutoMapper:

        public static void ConfigureAutoMapperServices(this IServiceCollection services)
        {
            services.AddAutoMapper(AppDomain.CurrentDomain.GetAssemblies());
        }

It would appear that AppDomain.CurrentDomain.GetAssemblies() was causing the issue. After some searching I found this article and this one. So I think that there is a bug in version 5.x of Microsoft.Data.SqlClient and that this was causing the problem.

The solution? To explicitly add a NuGet package reference for version 6.x of Microsoft.Data.SqlClient to the project.
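As a sketch, the explicit reference looks something like this in the affected project’s .csproj (the 6.x version shown is illustrative; use the latest available):

<ItemGroup>
  <!-- Explicit 6.x reference so the transitive 5.x Microsoft.Data.SqlClient is overridden -->
  <PackageReference Include="Microsoft.Data.SqlClient" Version="6.0.1" />
</ItemGroup>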

Uncategorized .NET Core

OpenVPN on Docker and the Strange Error Message Saga

December 29, 2024

Background

A “dockerised” version of the OpenVPN server had been running on a Linux server without issue for years (set up as per these instructions).

The Problem

However, a fresh install of the OpenVPN client on an iPad led to error messages about an insecure hash:

You are using insecure hash algorithm in CA signature.
Please regenerate CA with other hash algorithm

After some head-scratching it was realised that the home-generated certificates (the CA and others) used an insecure (i.e. obsolete or unsafe) hash algorithm. So it was decided to regenerate the certificates. But the first step (which turns out to have been the mistake) was to run

apt-get upgrade

to ensure that the latest versions of openssl etc. were installed.

An attempt was made to use the previous method of certificate generation; however, the CA.sh script no longer works. So the certificates were generated by hand (with thanks to these guys). The OpenVPN server config was updated for the new certificates and the docker container restarted.
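For reference, regenerating a CA with a modern hash boils down to OpenSSL commands along these lines (illustrative only; key size, validity and the server/client certificate and signing steps are as per the guide linked above):

openssl genrsa -out ca.key 4096
openssl req -new -x509 -sha256 -days 3650 -key ca.key -out ca.crt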

The Saga and its Solution

Hmm, the VPN did not appear to work. Had I made an error in the openvpn.conf file whilst making the fix? No, the problem turned out to be a strange issue where the container (which has run for years) was complaining about two things.

Firstly

docker container start -a my-openvpn-container

gave a clue.

Checking IPv6 Forwarding
Sysctl error for default forwarding, please run docker with '--sysctl net.ipv6.conf.default.forwarding=1'
Sysctl error for all forwarding, please run docker with '--sysctl net.ipv6.conf.all.forwarding=1'

This was a new error to me. The original container had been created via a command

docker run -v $PWD/vpn-data:/etc/openvpn -d -p 3000:1194/udp --cap-add=NET_ADMIN myownvpn

So I tried running the container with the --sysctl options:

docker run --sysctl net.ipv6.conf.default.forwarding=1 --sysctl net.ipv6.conf.all.forwarding=1 -v $PWD/vpn-data:/etc/openvpn -p 3000:1194/udp --cap-add=NET_ADMIN myownvpn

This solved the first issue, although it turns out not to be a real issue with the docker run command at all: the complaint only appeared for a container that had been created and stop/started prior to my Linux apt-get upgrade. As our American cousins say, “go figure”. So this IPv6 “issue” appears to be a red herring (but is documented here for completeness).

The second issue (which turns out to be the real one) was pulled from openvpn.log (I have a log-append /etc/openvpn/logs/openvpn.log line in the openvpn.conf). The error message was

ERROR: Cannot open TUN/TAP dev /dev/net/tun:

After some searching I found this page which suggested adding

--device=/dev/net/tun

to the docker run command:

docker run -d --device=/dev/net/tun --sysctl net.ipv6.conf.default.forwarding=1 --sysctl net.ipv6.conf.all.forwarding=1 -v $PWD/vpn-data:/etc/openvpn -p 3000:1194/udp --cap-add=NET_ADMIN myownvpn

And, voila, everything works again.

Conclusion

I have no idea what happened with the Linux upgrade. Firstly, something obviously changed that stopped an existing container (one that had been stop/started many times) from running correctly, though I have no idea what or why. Secondly, I have no idea what changed to necessitate the addition of --device=/dev/net/tun to the docker run command, but adding it solved the issue.

Docker, Uncategorized Docker, OpenVPN

Raspberry Pi 4 and Ubuntu and SSD

November 16, 2021

The Problem

I made the mistake of trying to install Ubuntu on an SSD on an already working Raspberry Pi 4. The device had Raspbian on it and had been working normally for over six months. However, I was playing with Docker and wanted some features not available in Raspbian. Rather than flash an SD card I thought that I would use a Samsung 840 Pro 500GB SSD that I had lying around.

Following the instructions from Ubuntu, I tried to install Ubuntu Server 21 (64-bit). The SSD was written correctly, but when I booted the Pi 4 the boot took ages and I got many messages of the type

EXT4-fs error (device nvme0n1p7): ext4_find_entry:1463: inode #525023

on the console. I tried Ubuntu 20 instead; this would not boot. I tried the workstation 21 image and it showed the same issues as the server image. So I reverted to Ubuntu 21 Server.

Investigation

I searched for Pi 4 Ubuntu SSD issues and found this article. So at least I was not alone in my problems.

After more searching I found an article that suggested adding

pcie_aspm=force

or

nvme_core.default_ps_max_latency_us=5500

to the grub configuration.

Great… apart from the fact that Ubuntu on a Pi 4 doesn’t seem to use grub (perhaps I am wrong), so I couldn’t update any grub boot configuration.

A bit of searching turned up the fact that on the Pi you can use “quirks” mode for the USB controller. All you need is the device ID….

Following this article I found out how to get this ID for my USB enclosure.
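For reference, the vendor:product ID pair can be read with lsusb on any Linux box with the enclosure plugged in (the output below is illustrative for a JMicron-based bridge; your IDs will differ):

$ lsusb
Bus 002 Device 002: ID 152d:0578 JMicron Technology Corp. / JMicron USA Technology Corp.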

The Solution

I flashed the SSD with a clean install of Ubuntu Server 21 (on my Windows 10 desktop). The SSD has a boot partition which is readable from Windows. I edited cmdline.txt and added

usb-storage.quirks=152d:0578:u

to the end of the single line in that file

dwc_otg.lpm_enable=0 console=serial0,115200 console=tty1 root=LABEL=writable rootfstype=ext4 elevator=deadline rootwait fixrtc quiet splash usb-storage.quirks=152d:0578:u

The 152d:0578 is the ID of my USB controller.

Plug the SSD into the Pi 4 and, voila, it boots and works.

Uncategorized

.Net Core WebAPI Request and Response Logging

July 6, 2020

The Problem

In .NET 4.x it was straightforward to read the request body of an incoming HTTP request, log the information and also log the outgoing response back to the caller. I thought it would be just as easy in .NET Core; how wrong I was, as Microsoft do their utmost to hide the request body from you.

The Solution

Accessing the request and response bodies must be done in HTTP middleware and, after reading much on StackOverflow, I came up with the following solution (my thanks to all the original authors and people who helped me). I cannot take credit for this code; I just adapted it from ideas on StackOverflow.

Remember 

For .Net Core 2.1 use context.HttpContext.Request.EnableRewind();
For .Net Core 3.1 use context.HttpContext.Request.EnableBuffering();

 

using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Microsoft.IO;                     // RecyclableMemoryStreamManager

/// <summary>
/// Want to be able to read the Request.Body to get the incoming JSON.
/// In .NET 4 this was more or less simple; in Core it has changed.
/// Need to be able to "rewind" the Request.Body - this has to be done
/// in request pipeline middleware. Thanks to
/// https://stackoverflow.com/questions/40494913/how-to-read-request-body-in-a-asp-net-core-webapi-controller
///
/// For ASP.NET Core 2.1
///     context.HttpContext.Request.EnableRewind();
/// For ASP.NET Core 3.1
///     context.HttpContext.Request.EnableBuffering();
///
/// See also https://stackoverflow.com/questions/43403941/how-to-read-asp-net-core-response-body/43404745
/// </summary>
public class LogRequestResponseMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ILogger _logger;

    private readonly RecyclableMemoryStreamManager _recyclableMemoryStreamManager;
    private const int ReadChunkBufferLength = 4096;

    private bool logRequest;
    private bool logResponse;

    public LogRequestResponseMiddleware(RequestDelegate next, ILoggerFactory loggerFactory, bool requestLogging, bool responseLogging)
    {
        //--------------------------------------------------------------------
        // Config                                                             |
        //--------------------------------------------------------------------
        this.logRequest = requestLogging;
        this.logResponse = responseLogging;

        _next = next;
        _logger = loggerFactory.CreateLogger<LogRequestResponseMiddleware>();

        //--------------------------------------------------------------------
        // Create stream to use in place of HttpContext.Response.Body stream  |
        // which is write-only                                                |
        //--------------------------------------------------------------------
        _recyclableMemoryStreamManager = new RecyclableMemoryStreamManager();
    }

    public virtual async Task Invoke(HttpContext context)
    {
        //-----------------------------------------------------
        // Request                                             |
        //-----------------------------------------------------
        if (this.logRequest)
        {
            //-------------------------------------------------
            // Turn on Buffering/Rewind                        |
            //-------------------------------------------------
            context.Request.EnableBuffering();

            //-------------------------------------------------
            // Log Request                                     |
            //-------------------------------------------------
            await this.LogRequestAsync(context);
        }

        //-----------------------------------------------------
        // Response                                            |
        //-----------------------------------------------------
        if (this.logResponse)
        {
            //-------------------------------------------------
            // Log Response (this also fires next in chain)    |
            //-------------------------------------------------
            await this.LogResponseAsync(context);
        }
        else
        {
            //-------------------------------------------------
            // Fire next in chain                              |
            //-------------------------------------------------
            await _next.Invoke(context);
        }
    }

    /// <summary>
    /// Log the incoming request
    /// </summary>
    /// <param name="context"></param>
    /// <returns></returns>
    private async Task LogRequestAsync(HttpContext context)
    {
        using (var requestStream = _recyclableMemoryStreamManager.GetStream())
        {
            var origPosition = context.Request.Body.Position;
            context.Request.Body.Position = 0;
            await context.Request.Body.CopyToAsync(requestStream);
            context.Request.Body.Position = origPosition;

            _logger.LogInformation($"Http Request Information:{Environment.NewLine}" +
                $"Schema:{context.Request.Scheme} " +
                $"Host: {context.Request.Host} " +
                $"Path: {context.Request.Path} " +
                $"QueryString: {context.Request.QueryString} " +
                $"Request Body: {ReadStreamInChunks(requestStream)}");
        }
    }

    private async Task LogResponseAsync(HttpContext context)
    {
        //--------------------------------------------------------
        // Keep reference to original body                        |
        //--------------------------------------------------------
        var originalBody = context.Response.Body;

        using (var responseStream = _recyclableMemoryStreamManager.GetStream())
        {
            //-----------------------------------------------------
            // Replace existing with read/write stream             |
            //-----------------------------------------------------
            context.Response.Body = responseStream;

            //-----------------------------------------------------
            // Fire next in chain                                  |
            //-----------------------------------------------------
            await _next.Invoke(context);

            //-----------------------------------------------------
            // Get Result Back                                     |
            //-----------------------------------------------------
            string resp = ReadStreamInChunks(responseStream);

            //-----------------------------------------------------
            // Write it back to original Response Body             |
            // (NB: assumes ASCII response content)                |
            //-----------------------------------------------------
            byte[] response = Encoding.ASCII.GetBytes(resp);
            await originalBody.WriteAsync(response, 0, response.Length);

            //-----------------------------------------------------
            // Log It                                              |
            //-----------------------------------------------------
            _logger.LogInformation($"Http Response Information:{Environment.NewLine}" +
                $"Schema:{context.Request.Scheme} " +
                $"Host: {context.Request.Host} " +
                $"Path: {context.Request.Path} " +
                $"QueryString: {context.Request.QueryString} " +
                $"Response Body: {resp}");
        }

        //-----------------------------------------------------------
        // Restore                                                   |
        //-----------------------------------------------------------
        context.Response.Body = originalBody;
    }

    private static string ReadStreamInChunks(Stream stream)
    {
        stream.Seek(0, SeekOrigin.Begin);
        string result;
        using (var textWriter = new StringWriter())
        using (var reader = new StreamReader(stream))
        {
            var readChunk = new char[ReadChunkBufferLength];
            int readChunkLength;
            // do/while: useful for the last iteration in case readChunkLength < chunkLength
            do
            {
                readChunkLength = reader.ReadBlock(readChunk, 0, ReadChunkBufferLength);
                textWriter.Write(readChunk, 0, readChunkLength);
            } while (readChunkLength > 0);

            result = textWriter.ToString();
        }

        return result;
    }
}
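For completeness, a minimal sketch of how one might wire the middleware into the pipeline (assuming a conventional Startup.Configure; the two booleans switch request and response logging on):

// In Startup.cs (sketch only; adjust to your own pipeline ordering)
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Register early so subsequent middleware and endpoints are captured.
    // The extra arguments are passed to the middleware constructor.
    app.UseMiddleware<LogRequestResponseMiddleware>(true, true);

    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}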
Uncategorized

VMWare Workstation Doesn’t Work on Windows 10

November 22, 2019

The Problem

I recently updated my Windows 10 desktop to the latest “improved” version of Windows 10. I rarely install Windows updates due to my mantra of “if it ain’t broke don’t fix it”; however, I ignored my mantra and opened up a plethora of problems.

One of the big problems is that I run (ran) VMWare Workstation, version 7 to be specific. This is an old version, but my VMs work and I purchased the copy, so why upgrade? After the Windows 10 updates VMWare Workstation would not run:

VMware Workstation Pro can't run on Windows

It would appear that one (or more) of the Windows updates was preventing VMWare from running.

The Solution

I googled and found many articles recommending uninstalling various Windows updates – this seems dangerous and can leave your machine unusable (the voice of experience). I then found this article. Scroll down to

There is no need to replace the entire sysmain.sdb (this action could expose your machine to unpredictable problems).

You can simply run the Compatibility Administrator Toolkit (https://docs.microsoft.com/it-it/windows-hardware/get-started/adk-install), and disable the entry under the “System Database/Applications/VMware Workstation Pro” path. That’s all.

Here it explains about installing the Compatibility Administrator Toolkit (downloadable from Microsoft). Then run the “Compatibility Administrator” (this will have been installed as part of the toolkit) by clicking the taskbar search and typing “Compatibility Administrator”. Within the tool, search for “VMWare Workstation” under System Database -> Applications.

Right click vmware.exe and click “Disable Entry”. Repeat for VMWare Workstation Pro.

VMWare Workstation should then run.

Thanks, Microsoft.

Uncategorized

Docker Hello World on Windows 10 Fails

September 12, 2019

The Problem

I was attempting to follow the Docker “Getting Started” documentation and tried to run the “Hello World” example, and I got:

C:\>docker run hello-world
Unable to find image 'hello-world:latest' locally
latest: Pulling from library/hello-world
docker: no matching manifest for windows/amd64 10.0.16299 in the manifest list entries.
See 'docker run --help'.

What does this mean?

After much head-scratching and searching (mostly unhelpful posts about using Linux containers or “experimental mode”) I came across this article about docker manifests. It did not make much sense at first, but after some more head-scratching I worked out what the error message meant.

I ran

C:\>docker manifest inspect hello-world:latest

And got (not all of the output is listed):

{
  "schemaVersion": 2,
  "mediaType": "application/vnd.docker.distribution.manifest.list.v2+json",
  "manifests": [
    {
      "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
      "size": 524,
      "digest": "sha256:92c7f9c92844bbbb5d0a101b22f7c2a7949e40f8ea90c8b3bc396879d95e899a",
      "platform": {
        "architecture": "amd64",
        "os": "linux"
      }
    },

    ..... text chopped out for this post....

    {
      "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
      "size": 1357,
      "digest": "sha256:f9fea6a66184b4907c157323de7978942eab1815be146c93a578b01a3a43ad6d",
      "platform": {
        "architecture": "amd64",
        "os": "windows",
        "os.version": "10.0.17134.1006"
      }
    },
    {
      "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
      "size": 1046,
      "digest": "sha256:2bff08b2d77710d0a0a1608b35ea4dfdc28f698f6da86991c3da755968fa56d6",
      "platform": {
        "architecture": "amd64",
        "os": "windows",
        "os.version": "10.0.17763.737"
      }
    }
  ]
}

The Answer

All the error means is that the Docker image repository (the place on the internet where the Docker hello-world image is stored) does not have a version built for your version of Windows, i.e. it has one for Windows 10.0.17134.1006 and Windows 10.0.17763.737, but not for my version of Windows: 10.0.16299.
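For reference, the build number Docker is matching against can be checked from a command prompt with ver (the build shown below is illustrative):

C:\>ver
Microsoft Windows [Version 10.0.16299.xxx]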

How hard would an error message to that effect be?

Uncategorized

NHibernate – Generate All Columns in Update SQL

July 9, 2019

The problem

NHibernate maintains internal state for the entities that it has loaded from the database. When saving an entity back to the DB you may want some properties (columns) included in the generated UPDATE SQL even though the property value hasn’t changed. Why? Because the update on a particular column might force a database trigger to run.

Or, to put it another way:

Database triggers are not really compatible with Object Relational Mappers

 

In my naivety I had assumed that NHibernate would contain a “simple” option to turn on “please generate all columns in UPDATE SQL”, but, alas, no.

The Solution

I consulted StackOverflow – many posts, all leading nowhere. Then I stumbled across one post which spoke of NHibernate IInterceptors. I then consulted the NHibernate documentation – to say the documentation on interceptors is brief and concise does not do it justice.

After much messing about I worked out how interceptors work and got a working implementation so that my UPDATE SQL now includes all the columns that I require.

To make an interceptor work you need:

  • Interceptor code (which overrides the FindDirty method)
  • To plug the interceptor into the NHibernate session creation

Interceptor

The “documentation” recommends deriving your interceptor from the NHibernate EmptyInterceptor. The FindDirty method must be overridden in your interceptor; it returns an array of ints whose values are indices into the propertyNames parameter of FindDirty, which lists the properties of the entity being saved. In my case I returned all items (bar a couple of exclusions, see below) so that all the UPDATE SQL columns I wanted were included.

Plug Into NHibernate

In your code (in my case, UnitOfWork/Repository pattern code) you need to pass the interceptor as a parameter to the SessionFactory.OpenSession() method. This registers the interceptor with NHibernate and it will be called for all saves.

Code Example

Interceptor

using System.Collections.Generic;
using NHibernate;
using NHibernate.Type;

public class MyInterceptor : EmptyInterceptor
{
	/// <summary>
	/// Interceptor method
	/// Return array of indices into propertyNames of columns that will be marked
	/// as dirty i.e. SQL will be generated for them
	/// </summary>
	/// <param name="entity"></param>
	/// <param name="id"></param>
	/// <param name="currentState"></param>
	/// <param name="previousState"></param>
	/// <param name="propertyNames"></param>
	/// <param name="types"></param>
	/// <returns></returns>
	public override int[] FindDirty(object entity, object id, object[] currentState,
	object[] previousState, string[] propertyNames, IType[] types)
	{
		List<int> dirty = new List<int>();

		//---------------------------------------------
		// Mark as dirty for all except for ....      |
		//---------------------------------------------
		for (int i = 0; i < propertyNames.Length; i++)
		{
			if (propertyNames[i] != "ColumnIDontwantInSql" && propertyNames[i] != "AnotherColumnIDontwantInSql")
			{
				dirty.Add(i);
			}
		}
		return dirty.ToArray();
	}
}

Wiring Up

CODE SNIPPET FROM UNIT OF WORK



 public UnitOfWork(INHibernateContext context, ISaveInterceptor interceptor)
 {
     this.context = context;
     this.Interceptor = interceptor;

     // See here for why dummy session
     ISession dummy = context.SessionFactory.OpenSession(this.Interceptor);
     this.Session = context.SessionFactory.OpenSession(dummy.Connection, this.Interceptor);
 }
Uncategorized

NHibernate – the given key was not present in the dictionary

May 29, 2019

The Problem

Using NHibernate on a (.NET/C#) project, I came to delete a record from a “collection” and this threw an error:

the given key was not present in the dictionary

After much head scratching it was determined that the problem was down to a subtle interaction between NHibernate and a database C# entity class. The class overrode the GetHashCode() method:

 

public override int GetHashCode()
{
   ...
}

Originally the underlying database table had a composite key utilising two integer columns, and the GetHashCode() method used both of these columns to generate the (unique) hash code. A database modification moved from a composite key to a single key; however, the GetHashCode() method was not updated.

Everything worked fine for reads, but Delete threw an exception. So what was going on?

Upon investigation it was found that, as part of NHibernate initialisation or lazy-loading, the data was first read with the primary key column populated but the defunct key column still zero, and GetHashCode() was called at that point. Later in the lifecycle all the properties of the DB class were populated, so a different hash code was generated (because the defunct column now had a value and GetHashCode() used that value as well as the primary key value, as described earlier). So it looks like items were initially read by NHibernate and added to its internal “collection” with hash codes computed from partially loaded data, whereas the delete was being made for an item whose hash code was computed from fully populated data, and NHibernate could not find that item in the collection. It looked like NHibernate was using the hash code to identify individual items.

The solution

Generate a hash-code (GetHashCode() method) using only the primary key.
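A minimal sketch of the fix (the property name Id is illustrative; use whatever maps to the single primary key):

public override int GetHashCode()
{
    // Hash on the (single) primary key only, so that partially and fully
    // loaded instances of the same row produce the same hash code
    return this.Id.GetHashCode();
}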

 

Uncategorized

.NET DataSet to ADO RecordSet Conversion

July 14, 2018

The Problem

Converting an ADO recordset into a .NET DataSet is straightforward; the following code will achieve this:

Dim da As New System.Data.OleDb.OleDbDataAdapter
da.Fill(ds, rs, "ADODB.RecordSet")

where ds is a DataSet and rs is an ADO Recordset

So you would think that going from a DataSet to an ADO Recordset would be just as simple. Wrong! This conversion is not built into .NET 1 or 2; the only information I could find on how to do this is an article which utilises an XSL stylesheet to transform the data. So I modified the code from this article and have coded up an object which does the conversion.

In addition I can also access the ADO recordset XML as a string from this object. Why would I want to do this? Well, if you want to pass back an ADO recordset as part of web service results you can’t; you get an error as the web service code can’t serialise the ADO recordset. The way round this is to pass the ADO recordset XML representation as a string over the web service and convert it back into an ADO recordset at the receiving end (this could be a legacy ASP app accessing the web service via SOAP or XMLHTTP).

The Solution

'-------------------------------------------------------------------------- 
' File : DataConversion.vb 
' 
' Author : DJ 
' 
' Version : <VSS Version Stamp> 
' 
' Purpose : Routines to convert DataSet to ADO Recordset etc etc etc. 
' 
' This object uses an XSLT to transform a dataset into an ADO recordset. I have 
' modified code from MS KB article to avoid using a temporary file for output. 
' I build up a string of the XML representation of the ADO recordset which can 
' then be sent back to the caller 'as xml' which can be stuffed into an ADO recordset 
' 
' It is a shame that MS did not include Dataset<->Recordset conversion as part 
' of .NET n or .NET n+1 
' 
' Modification History:

' Taken from http://support.microsoft.com/kb/316337 
' 
'-------------------------------------------------------------------------- 
Imports Microsoft.VisualBasic 
Imports System.Data 
Imports System.Xml 
Imports System.Xml.XPath 
Imports System.Xml.Xsl 
Imports System.IO 
Public Class DataConversion

Private xmlstr As String 
Private quote = "'" 
Private error_message As String

Public Sub New() 
Me.error_message = "" 
Me.xmlstr = "" 
End Sub 
Public Property ErrorMessage() 
Get 
Return Me.error_message 
End Get 
Set(ByVal Value) 
'NOP 
End Set 
End Property

'************************************************************************* 
' Class Name : ConvertToRs 
' Description : This class converts a DataSet to a ADODB Recordset. 
'**************************************************************************

'************************************************************************** 
' Method Name : GetADORS 
' Description : Takes a DataSet and converts into a Recordset. The converted 
' ADODB recordset is saved as an XML file. The data is saved 
' to the file path passed as parameter. 
' Output : The output of this method is long. Returns 1 if successful. 
' If not throws an exception. 
' Input parameters: 
' 1. DataSet object 
' 2. Database Name 
' 3. Output file - where the converted should be written. 
'**************************************************************************

Public Function GetXML() 
Return Me.xmlstr 
End Function 
'---------------------------------------------------------------- 
Public Function GetRS() As ADODB.Recordset 
'---------------------------------------------------------------- 
' Purpose 
' Turn XML string into ADO Recordset 
' 
' Input Params 
' Returns 
' 
'---------------------------------------------------------------- 
Dim strm As New ADODB.Stream 
Dim res_rs As New ADODB.Recordset

'------------------------------------------- 
' Open a stream and write XML String to it | 
'------------------------------------------- 
strm.Open() 
strm.WriteText(Me.xmlstr)

'---------------- 
' Reset position | 
'---------------- 
strm.Position = 0

'------------------------- 
' Read it into recordset | 
'------------------------- 
res_rs.Open(strm)

GetRS = res_rs

End Function 
Public Function ConvertDSToXML(ByVal ds As DataSet, ByVal xslfile As String) As Boolean

Me.xmlstr = ""

'Create an xmlwriter object, to write the ADO Recordset Format XML 
Try 
' Dim xwriter As New XmlTextWriter(outputfile, System.Text.Encoding.Default)

'call this Sub to write the ADONamespaces to the XMLTextWriter 
WriteADONamespaces() 
'call this Sub to write the ADO Recordset Schema 
WriteSchemaElement(ds)

Dim TransformedDatastrm As New MemoryStream

TransformedDatastrm = TransformData(ds, xslfile) 
'Pass the Transformed ADO REcordset XML to this Sub 
'to write in correct format. 
HackADOXML(TransformedDatastrm)

Return True

Catch ex As Exception 
'Returns error message to the calling function. 
Me.error_message = ex.Message & " - " & ex.ToString 
Return False 
End Try

End Function

Private Sub WriteADONamespaces() 
'The following is to specify the encoding of the xml file 
'WriteProcessingInstruction("xml", "version='1.0' encoding='ISO-8859-1'")

'The following is the ado recordset format 
'<xml xmlns:s='uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882' 
' xmlns:dt='uuid:C2F41010-65B3-11d1-A29F-00AA00C14882' 
' xmlns:rs='urn:schemas-microsoft-com:rowset' 
' xmlns:z='#RowsetSchema'> 
' </xml>

'Write the root element 
WriteStartElement("xml", "")

'Append the ADO Recordset namespaces 
WriteAttributeString("xmlns", "s", "uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882") 
WriteAttributeString("xmlns", "dt", "uuid:C2F41010-65B3-11d1-A29F-00AA00C14882") 
WriteAttributeString("xmlns", "rs", "urn:schemas-microsoft-com:rowset") 
WriteAttributeString("xmlns", "z", "#RowsetSchema") 
WriteEndStartElement()

End Sub

Private Sub WriteSchemaElement(ByVal ds As DataSet) 
'ADO Recordset format for defining the schema 
' <s:Schema id='RowsetSchema'> 
' <s:ElementType name='row' content='eltOnly' rs:updatable='true'> 
' </s:ElementType> 
' </s:Schema>

'write element schema 
WriteStartElement("s", "Schema") 
WriteAttributeString("id", "RowsetSchema") 
WriteEndStartElement()

'write element ElementTyoe 
WriteStartElement("s", "ElementType")

'write the attributes for ElementType 
WriteAttributeString("name", "row") 
WriteAttributeString("content", "eltOnly") 
WriteAttributeString("rs:updatable", "true")

WriteEndStartElement()

WriteSchema(ds) 
'write the end element for ElementType

WriteFullEndElement("s", "ElementType") 
'write the end element for Schema

WriteFullEndElement("s", "Schema")


End Sub

Private Sub WriteSchema(ByVal ds As DataSet)

Dim i As Int32 = 1 
Dim dc As DataColumn

For Each dc In ds.Tables(0).Columns

dc.ColumnMapping = MappingType.Attribute

WriteStartElement("s", "AttributeType") 
'write all the attributes 
WriteAttributeString("name", dc.ToString) 
WriteAttributeString("rs", "number", i.ToString) 
WriteAttributeString("rs", "baseCatalog", "") 
WriteAttributeString("rs", "baseTable", dc.Table.TableName.ToString) 
WriteAttributeString("rs", "keycolumn", dc.Unique.ToString) 
WriteAttributeString("rs", "autoincrement", dc.AutoIncrement.ToString) 
WriteEndStartElement()

'write child element 
WriteStartElement("s", "datatype") 
'write attributes 
WriteAttributeString("dt", "type", GetDatatype(dc.DataType.ToString)) 
WriteAttributeString("dt", "maxlength", dc.MaxLength.ToString) 
WriteAttributeString("rs", "maybenull", dc.AllowDBNull.ToString) 
WriteEndStartElement() 
WriteFullEndElement("s", "datatype")

'write end element for datatype 
'end element for AttributeType 
WriteFullEndElement("s", "AttributeType")

i = i + 1 
Next 
dc = Nothing

End Sub

'Function to get the ADO compatible datatype 
Private Function GetDatatype(ByVal dtype As String) As String 
Select Case (dtype) 
Case "System.Int32" 
Return "int" 
Case "System.DateTime" 
Return "dateTime" 
End Select 
End Function

'Transform the data set format to ADO Recordset format 
'This only transforms the data 
Private Function TransformData(ByVal ds As DataSet, ByVal xslfile As String) As MemoryStream

Dim instream As New MemoryStream 
Dim outstream As New MemoryStream

'write the xml into a memorystream 
ds.WriteXml(instream, XmlWriteMode.IgnoreSchema) 
instream.Position = 0

'load the xsl document 
Dim xslt As New XslTransform 
xslt.Load(xslfile)

'create the xmltextreader using the memory stream 
Dim xmltr As New XmlTextReader(instream) 
'create the xpathdoc 
Dim xpathdoc As XPathDocument = New XPathDocument(xmltr)

'create XpathNavigator 
Dim nav As XPathNavigator 
nav = xpathdoc.CreateNavigator

'Create the XsltArgumentList. 
Dim xslArg As XsltArgumentList = New XsltArgumentList

'Create a parameter that passes the table name to the stylesheet.

xslArg.AddParam("tablename", "", ds.Tables(0).TableName)

'transform the xml to a memory stream 
xslt.Transform(nav, xslArg, outstream)

instream = Nothing 
xslt = Nothing 
' xmltr = Nothing 
xpathdoc = Nothing


nav = Nothing

Return outstream

End Function

''************************************************************************** 
'' Method Name : ConvertToRs 
'' Description : The XSLT does not transform with full end elements. For example, 
'' <root attr=""/> instead of <root attr=""></root>. ADO Recordset 
'' cannot read this. This method is used to convert the 
'' elements to have fullendelements. 
''************************************************************************** 
Private Sub HackADOXML(ByVal ADOXmlStream As System.IO.MemoryStream)

ADOXmlStream.Position = 0 
Dim rdr As New XmlTextReader(ADOXmlStream) 
Dim outStream As New MemoryStream 
Dim wrt As New XmlTextWriter(outStream, System.Text.Encoding.Default)

rdr.MoveToContent() 
'if the ReadState is not EndofFile, read the XmlTextReader for nodes. 
Do While rdr.ReadState <> ReadState.EndOfFile 
If rdr.Name = "s:Schema" Then 
wrt.WriteNode(rdr, False) 
wrt.Flush() 
ElseIf rdr.Name = "z:row" And rdr.NodeType = XmlNodeType.Element Then 
wrt.WriteStartElement("z", "row", "#RowsetSchema") 
rdr.MoveToFirstAttribute() 
wrt.WriteAttributes(rdr, False) 
wrt.Flush() 
ElseIf rdr.Name = "z:row" And rdr.NodeType = XmlNodeType.EndElement Then 
'The following is the key statement that closes the z:row 
'element without generating a full end element 
wrt.WriteEndElement() 
wrt.Flush() 
ElseIf rdr.Name = "rs:data" And rdr.NodeType = XmlNodeType.Element Then 
wrt.WriteStartElement("rs", "data", "urn:schemas-microsoft-com:rowset") 
ElseIf rdr.Name = "rs:data" And rdr.NodeType = XmlNodeType.EndElement Then 
wrt.WriteEndElement() 
wrt.Flush() 
End If 
rdr.Read() 
Loop

' wrt.WriteEndElement() 
wrt.Flush()


'--------- 
' Kludge | 
'--------- 
Dim byteArray = New Byte(CType(outStream.Length, Integer)) {} 
Dim xstr As String

outStream.Position = 0 
outStream.Read(byteArray, 0, outStream.Length)

xstr = System.Text.Encoding.ASCII.GetString(byteArray) 
' remove trailing null 
xstr = Left(xstr, Len(xstr) - 1) 
Me.xmlstr &= xstr

WriteFullEndElement("xml", "")

End Sub 
Private Sub WriteStartElement(ByVal tag, ByVal val)

Me.xmlstr &= "<" & tag 
If (val <> "") Then Me.xmlstr &= ":" & val 
End Sub

Private Sub WriteAttributeString(ByVal tag, ByVal val) 
Me.xmlstr &= " " & tag & "=" & quote & val & quote

End Sub 
Private Sub WriteAttributeString(ByVal tag1, ByVal tag2, ByVal val) 
Me.xmlstr &= " " & tag1 & ":" & tag2 & "=" & quote & val & quote

End Sub

Private Sub WriteFullEndElement(ByVal tag, ByVal val) 

Me.xmlstr &= "</" & tag 
If (val <> "") Then Me.xmlstr &= ":" & val 
Me.xmlstr &= ">"

End Sub


Private Sub WriteEndStartElement()

Me.xmlstr &= ">" & vbCrLf 
End Sub

Public Function RsToXML(ByVal rs As ADODB._Recordset, ByRef outputXml As String) As Boolean 
'---------------------------------------------------------------- 
' Purpose 
' Convert and ADO Recordset into an XML String 
' Input Params 
' ADO Recordset 
' Returns 
' XML String 
'----------------------------------------------------------------

Try 
Dim streamObj As New ADODB.Stream

rs.Save(streamObj, ADODB.PersistFormatEnum.adPersistXML)

outputXml = streamObj.ReadText()

Return True 
Catch ex As Exception 
Me.error_message = ex.Message & " - " & ex.ToString 
Return False 
End Try

End Function 
Public Function RStoDS(ByVal rs As ADODB._Recordset, ByRef ds As DataSet) As Boolean 
Try 
Dim da As New System.Data.OleDb.OleDbDataAdapter 
da.Fill(ds, rs, "ADODB.RecordSet") 
Return True 
Catch ex As Exception

Me.error_message = ex.Message & " - " & ex.ToString 
Return False 
End Try

End Function 


End Class
The XSL stylesheet used for the transform (this is what gets passed as the xslfile parameter to ConvertDSToXML) is:

<?xml version="1.0"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0"
                xmlns:rs="urn:schemas-microsoft-com:rowset"
                xmlns:z="#RowsetSchema">
  <xsl:output method="xml" indent="yes"/>
  <xsl:param name="tablename"/>
  <xsl:template match="NewDataSet">
    <rs:data>
      <xsl:for-each select="./node()[local-name(.)=$tablename]">
        <z:row>
          <xsl:for-each select="@*">
            <xsl:copy-of select="."/>
          </xsl:for-each>
        </z:row>
      </xsl:for-each>
    </rs:data>
  </xsl:template>
</xsl:stylesheet>

 


Uncategorized

DNS, BIND, root files and forwarders

July 14, 2018

The Problem

Suppose you have a BIND 9 server on an intranet acting as master or slave for the root (‘.’) zone. Then, for an extranet connection, you put a forward zone in named.conf…. Voila: if the server is slave (or master) for root, named will not forward any requests for that zone, no matter what you do.

The Solution

How do you solve it? Well, you have to put a “dummy” delegation in the root zone file for the zone; named will then override this with your forwarders directive.
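As a sketch of what that looks like (all names and addresses illustrative), the forward zone in named.conf:

zone "extranet.example.com" {
    type forward;
    forward only;
    forwarders { 192.0.2.53; };
};

and the “dummy” delegation added to the root zone file:

; dummy delegation so named will consult the forwarders for this zone
extranet.example.com.         IN  NS  ns1.extranet.example.com.
ns1.extranet.example.com.     IN  A   192.0.2.53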

Simple when you know how….

Uncategorized
