How to replace code that uses the now-obsolete System.Data.OracleClient namespace classes?

I've made a "generic" program that converts data from one db to another. It uses configuration files to define the conversion, and code like this:
static DbProviderFactory _srcProvFactory;
static DbProviderFactory _trgtProvFactory;

public static bool DoConversions()
{
    try
    {
        if (!InitConfig())
            return false;

        _srcProvFactory = DbProviderFactories.GetFactory(GetConnectionClassTypeByDatabaseType(Preferences.SourceDatabaseType));
        _trgtProvFactory = DbProviderFactories.GetFactory(GetConnectionClassTypeByDatabaseType(Preferences.TargetDatabaseType));

        using (DbConnection srcCnctn = _srcProvFactory.CreateConnection(),
                            trgtCnctn = _trgtProvFactory.CreateConnection())
        {
            srcCnctn.ConnectionString = Preferences.SourceConnectionString;
            srcCnctn.Open();
            trgtCnctn.ConnectionString = Preferences.TargetConnectionString;
            trgtCnctn.Open();
            //DO STUFF
        }
        return true;
    }
    catch (Exception)
    {
        // log/report the error as appropriate
        return false;
    }
}
The GetConnectionClassTypeByDatabaseType method above returns strings like "System.Data.OracleClient", depending on the config file.
The DO STUFF part calls methods like the one below (there are many of these) to find out database table column properties from the schema. This is needed because Oracle, SQL Server etc. handle these differently.
public static int GetColumnMaxStringLength(DbProviderFactory provFactory, DataRow schemaTableRow)
{
    if (provFactory is OracleClientFactory)
    {
        return Convert.ToInt32(schemaTableRow["LENGTH"]);
    }
    else if // OTHER OPTIONS
    ...
    throw new Exception(string.Format("Unsupported DbProviderFactory -type: {0}", provFactory.GetType().ToString()));
}
So how is this supposed to be fixed now that the build says these classes are obsolete? This was supposed to be a kind of textbook solution when I wrote it (Pro C# 2008 and the .NET 3.5 Platform). Now I'm baffled.
Thanks in advance & Best Regards - Matti

Use ODP.NET or any of the other 3rd-party ADO.NET providers for Oracle:
ref: Comparison of 3rd Party Oracle .NET Providers
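As a rough sketch of the change when moving to the ODP.NET managed driver (this assumes the Oracle.ManagedDataAccess package is installed and registered under its usual invariant name "Oracle.ManagedDataAccess.Client"; verify the schema column names against your provider's GetSchema output):

// Hedged sketch: GetConnectionClassTypeByDatabaseType would now return the
// ODP.NET invariant name instead of "System.Data.OracleClient".
_srcProvFactory = DbProviderFactories.GetFactory("Oracle.ManagedDataAccess.Client");

// ...and the per-provider schema helpers test against the ODP.NET factory type:
public static int GetColumnMaxStringLength(DbProviderFactory provFactory, DataRow schemaTableRow)
{
    if (provFactory is Oracle.ManagedDataAccess.Client.OracleClientFactory)
    {
        // Column names may differ between providers; check GetSchema("Columns")
        // for the ODP.NET version you use.
        return Convert.ToInt32(schemaTableRow["LENGTH"]);
    }
    // OTHER OPTIONS ...
    throw new Exception(string.Format("Unsupported DbProviderFactory -type: {0}", provFactory.GetType()));
}

The rest of the DbProviderFactory/DbConnection code can stay as it is, which is the main benefit of having written against the abstract base classes in the first place.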

Related

Recompile assemblies to separate appdomains in .NET 5

I have a .NET 5.0 console application from which I am trying to compile and execute external code, but I also want to be able to update the code, unload the previously created appdomain, and re-compile everything.
This is my entire static class that handles code compilation and assembly loading:
using System;
using System.IO;
using System.Collections.Generic;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using System.Reflection;
using Microsoft.CodeAnalysis.Emit;
using System.Runtime.Loader;

namespace Scripting
{
    public static class ScriptCompiler
    {
        public static Dictionary<string, AppDomain> _appDomainDict = new();

        public static object CompileScript(string scriptpath)
        {
            var tree = SyntaxFactory.ParseSyntaxTree(File.ReadAllText(scriptpath));

            // Adding basic references
            List<PortableExecutableReference> refs = new List<PortableExecutableReference>();
            var assemblyPath = Path.GetDirectoryName(typeof(object).Assembly.Location);
            refs.Add(MetadataReference.CreateFromFile(Path.Combine(assemblyPath, "mscorlib.dll")));
            refs.Add(MetadataReference.CreateFromFile(Path.Combine(assemblyPath, "System.dll")));
            refs.Add(MetadataReference.CreateFromFile(Path.Combine(assemblyPath, "System.Private.CoreLib.dll")));
            refs.Add(MetadataReference.CreateFromFile(Path.Combine(assemblyPath, "System.Core.dll")));
            refs.Add(MetadataReference.CreateFromFile(Path.Combine(assemblyPath, "System.Runtime.dll")));

            // A single, immutable invocation to the compiler
            // to produce a library
            string hash_name = scriptpath.GetHashCode().ToString();
            if (_appDomainDict.ContainsKey(hash_name))
            {
                AppDomain.Unload(_appDomainDict[hash_name]);
                _appDomainDict.Remove(hash_name);
            }
            AppDomain new_domain = AppDomain.CreateDomain(hash_name);
            _appDomainDict[hash_name] = new_domain;

            var compilation = CSharpCompilation.Create(hash_name)
                .WithOptions(
                    new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary,
                        optimizationLevel: OptimizationLevel.Release,
                        allowUnsafe: true))
                .AddReferences(refs.ToArray())
                .AddSyntaxTrees(tree);

            MemoryStream ms = new MemoryStream();
            EmitResult compilationResult = compilation.Emit(ms);
            ms.Seek(0, SeekOrigin.Begin);

            if (compilationResult.Success)
            {
                // Load the assembly
                Assembly asm = new_domain.Load(ms.ToArray());
                object main_ob = asm.CreateInstance("SomeClass");
                ms.Close();
                return main_ob;
            }
            else
            {
                foreach (Diagnostic codeIssue in compilationResult.Diagnostics)
                {
                    string issue = $"ID: {codeIssue.Id}, Message: {codeIssue.GetMessage()}," +
                                   $" Location: {codeIssue.Location.GetLineSpan()}," +
                                   $" Severity: {codeIssue.Severity}";
                    Callbacks.Logger.Log(typeof(ScriptCompiler), issue, LogVerbosityLevel.WARNING);
                }
                return null;
            }
        }
    }
}
It's all good when I load the assembly into the current domain and execute from the instantiated object. The problem with this approach is that, since I want to make frequent updates to the code, I'll end up loading a ton of unused assemblies into the current domain, even if I make sure the assembly names are different.
This is why I've been trying to create a new domain and load the assembly there. But for some reason I get a PlatformNotSupportedException. Is this not possible to do in .NET 5? Are there any workarounds, or am I doing something wrong here?
OK, it turns out that AppDomain support in .NET Core and later is very limited, and in particular there is only ever one appdomain:
On .NET Core, the AppDomain implementation is limited by design and does not provide isolation, unloading, or security boundaries. For .NET Core, there is exactly one AppDomain. Isolation and unloading are provided through AssemblyLoadContext. Security boundaries should be provided by process boundaries and appropriate remoting techniques.
Source: https://learn.microsoft.com/en-us/dotnet/api/system.appdomain?view=net-6.0
And indeed, when using AssemblyLoadContext and creating object instances through these contexts, everything worked like a charm!
One last note: if the created context is not marked as collectible, it's not possible to unload it. But this can very easily be set during AssemblyLoadContext construction.
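A minimal sketch of that approach, reusing the names from the code above (illustrative only, not the exact code I ended up with):

// Load the emitted assembly into a collectible AssemblyLoadContext instead of a new AppDomain.
var alc = new AssemblyLoadContext(hash_name, isCollectible: true);
ms.Seek(0, SeekOrigin.Begin);
Assembly asm = alc.LoadFromStream(ms);
object main_ob = asm.CreateInstance("SomeClass");

// ...later, when the script is recompiled, drop the old context:
alc.Unload(); // unloading completes once nothing references its types anymore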

Using .NET MVC 4.0 VirtualPathProviders in a web farm scenario

We have an application that relies heavily on a virtual file system that extends the .NET System.Web.Hosting.VirtualPathProvider architecture.
We use chaining to search through...
(A)contentresourceprovider (extends virtualpathprovider)
(B)assemblyresourceprovider (ditto)
(C)physical file system
The VPPs are registered on app initialisation like so...
HostingEnvironment.RegisterVirtualPathProvider(new AssemblyResourceProvider());
HostingEnvironment.RegisterVirtualPathProvider(new ContentResourceProvider());
This problem relates to the ContentResourceProvider (see below):
using System;
using System.Text.RegularExpressions;
using System.Web.Caching;

public class ContentResourceProvider : System.Web.Hosting.VirtualPathProvider
{
    #region Methods

    public override bool FileExists(string virtualPath)
    {
        var result = ContentResourceProvider.IsContentResourcePath(virtualPath)
            ? ((ContentResourceVirtualFile)this.GetFile(virtualPath)).Exists
            : Previous.FileExists(virtualPath);
        return result;
    }

    public override CacheDependency GetCacheDependency(string virtualPath, System.Collections.IEnumerable virtualPathDependencies, DateTime utcStart)
    {
        var result = ContentResourceProvider.IsContentResourcePath(virtualPath)
            ? new ContentResourceCacheDependency(virtualPath)
            : Previous.GetCacheDependency(virtualPath, virtualPathDependencies, utcStart);
        return result;
    }

    public override System.Web.Hosting.VirtualFile GetFile(string virtualPath)
    {
        var result = ContentResourceProvider.IsContentResourcePath(virtualPath)
            ? new ContentResourceVirtualFile(virtualPath)
            : Previous.GetFile(virtualPath);
        return result;
    }

    public override string GetFileHash(string virtualPath, System.Collections.IEnumerable virtualPathDependencies)
    {
        var result = base.GetFileHash(virtualPath, virtualPathDependencies);
        return result;
    }

    public static bool IsContentResourcePath(string virtualPath)
    {
        var pattern = @"~?/\(ContentManagementResource\)";
        var result = Regex.IsMatch(virtualPath, pattern, RegexOptions.CultureInvariant | RegexOptions.IgnoreCase);
        return result;
    }

    #endregion Methods
}
Basically this all works great, and whenever we update content via our CMS tools the virtual files are removed from cache and reopened/compiled into Razor views.
All good so far....
However, we recently deployed this to a server farm, and it seems the whole innate System.Web.Hosting virtual file provider thingy does not scale (obviously): it's tied to the particular server that stores the virtual files in memory, etc.
This means that if User A makes a change to a Razor view via the CMS tools and the operation is carried out on Server A, User B making another change on Server B will then get undesirable results, despite the fact that the content is coming from the same database. This is because the virtual file systems caching the Razor views are unique to the web server they are running on; likewise, the cache dependencies are only triggered on the relevant server.
I've tried using distributed cache packages like AppFabric, Memcached, and NCache. These aren't adequate because they depend on you manually writing code that inserts serialized objects into the distributed cache. Firstly, the .NET VPP memory handling is effectively managed under the hood with strict design patterns enforced; also, the CacheDependency class isn't serializable.
Having done many hours of research I cannot find a solution online.
Surely the whole .NET MVC 4.0 VirtualPathProvider/CacheDependency architecture must have a way of scaling across more than one server?
I guess what I'm asking, from my understanding of the matter, is: is there a way to scale the System.Web.Hosting cache memory across more than one server, so that virtual content and the cache dependencies attached to those files work as intended?
Any help would be greatly appreciated.

Generate Bare Definitions for a Project or Namespace (Visual Studio)

In developing an SDK for use within our product, we want to provide users (developers) a Visual Studio plugin, mainly to provide them Intellisense during their development and ensure their code compiles for them. To do this, we strip the contents of all of our SDK APIs and put them all in a separate project.
For example:
public IEnumerable<string> AvailableConnections(bool querySystem) {
    var connections = ConnectionList();
    if (querySystem)
        connections = connections.Concat(SystemConnections());
    ... // Filter connections somehow
    return connections;
}

public void WriteToStream(Stream strFrom, Stream strTo) {
    byte[] buffer = new byte[32 * 1024]; // 32 KiB
    int len;
    while ((len = strFrom.Read(buffer, 0, buffer.Length)) > 0)
    {
        strTo.Write(buffer, 0, len);
    }
}
Becomes:
public IEnumerable<string> AvailableConnections(bool querySystem) { return null; }
public void WriteToStream(Stream strFrom, Stream strTo) { }
My question: Does a tool exist to automate this, whether for a particular project or particular namespace? Ideally, it would intake a project or namespace and output all of the public classes/functions replacing their definitions with a simple return of the return type's default value. Visual Studio seems to do almost this when you view a class from which you don't have the source (e.g., you'll see IEnumerable<T> [from metadata]).
It sounds like you want to provide interfaces to your API.
You can build this into your project, and essentially you will always have an assembly that shows all the public members without containing your implementation code.
Create a project that contains only the API, and reference it from your main project so that your concrete code (your implementation) implements the interfaces.
The API assembly would contain mostly interfaces and perhaps some abstract classes and helpers, which you could share with developers.
Taking your example, you would have an interface like
public interface IMySdkThing
{
    IEnumerable<string> AvailableConnections(bool querySystem);
    void WriteToStream(Stream strFrom, Stream strTo);
}
Your implementation would be declared like:
public class MySdkThing : IMySdkThing
{
    // all the code you showed, just as it is
}
All that said, it isn't clear how this will be useful to the developer. He or she will need a dll with some actual, executable code in it to use your library. Intellisense and compile-time checking come for free; you don't have to do anything special.
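For illustration (hypothetical consumer code), the developer compiles against the API assembly and simply uses the concrete type your product ships:

// Illustrative usage only: IMySdkThing comes from the shared API assembly,
// MySdkThing from the implementation dll shipped with the product.
IMySdkThing sdk = new MySdkThing();
foreach (var connection in sdk.AvailableConnections(querySystem: true))
    Console.WriteLine(connection);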

Fetch windows setting value

How do I fetch the Measurement System setting value in JavaScript?
I'm guessing that it would be through some WinJS call.
The logical place would be Windows.Globalization, but I'm not seeing it offered there. One pretty simple workaround - faster to write than to research the setting :) - is to create a Windows Runtime Component in C# that calls into System.Globalization:
namespace WindowsRuntimeComponent
{
    public sealed class RegionalSettings
    {
        public bool isMetric()
        {
            return System.Globalization.RegionInfo.CurrentRegion.IsMetric;
        }
    }
}
Then add it as a reference to your JavaScript app and invoke it there:
var r = new WindowsRuntimeComponent.RegionalSettings();
var isMetric = r.isMetric();

Entity Framework 4.3.1 add-migration error: "model backing the context has changed"

I'm getting an error when trying to run the EF 4.3.1 add-migrations command:
"The model backing the ... context has changed since the database was created".
Here's one sequence that gets the error (although I've tried probably a dozen variants which also all fail)...
1) Start with a database that was created by EF Code First (i.e., it already contains a _MigrationHistory table with only the InitialCreate row).
2) The app's code data model and database are in sync at this point (the database was created by CF when the app was started).
3) Because I have four DbContexts in my "Services" project, I didn't run the 'enable-migrations' command (it doesn't handle multiple contexts). Instead, I manually created the Migrations folder in the Services project and the Configuration.cs file (included at the end of this post). [I think I read this in a post somewhere]
4) With the database not yet changed, and the app stopped, I use the VS EDM editor to make a trivial change to my data model (add one property to an existing entity), and have it generate the new classes (but not modify the database, obviously). I then rebuild the solution and all looks OK (but don't delete the database or restart the app, of course).
5) I run the following PMC command (where "App" is the name of one of the classes in Configuration.cs):
PM> add-migration App_AddTrivial -conf App -project Services -startup Services -verbose
... which fails with the "The model ... has changed. Consider using Code First Migrations..." error.
What am I doing wrong? And does anyone else see the irony in the tool telling me to use what I'm already trying to use ;-)
What are the correct steps for setting-up a solution starting with a database that was created by EF CF? I've seen posts saying to run an initial migration with -ignorechanges, but I've tried that and it doesn't help. Actually, I've spent all DAY testing various permutations, and nothing works!
I must be doing something really stupid, but I don't know what!
Thanks,
DadCat
Configuration.cs:
namespace mynamespace
{
    internal sealed class App : DbMigrationsConfiguration<Services.App.Repository.ModelContainer>
    {
        public App()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.App.Repository.Migrations";
        }
        protected override void Seed(Services.App.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Catalog : DbMigrationsConfiguration<Services.Catalog.Repository.ModelContainer>
    {
        public Catalog()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.Catalog.Repository.Migrations";
        }
        protected override void Seed(Services.Catalog.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Portfolio : DbMigrationsConfiguration<Services.PortfolioManagement.Repository.ModelContainer>
    {
        public Portfolio()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.PortfolioManagement.Repository.Migrations";
        }
        protected override void Seed(Services.PortfolioManagement.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Scheduler : DbMigrationsConfiguration<Services.Scheduler.Repository.ModelContainer>
    {
        public Scheduler()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.Scheduler.Repository.Migrations";
        }
        protected override void Seed(Services.Scheduler.Repository.ModelContainer context)
        {
        }
    }
}
When using EF Migrations you should have one data context per database. I know it can grow really large, but by trying to split it you will run into several problems. One is the migration issue you are experiencing. Later on you will probably face problems when trying to make queries joining tables from the different contexts. Don't go that way; it's against how EF is designed.
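As a rough sketch of what that looks like (entity and class names here are illustrative, not taken from your project): one context exposes all the sets that live in the same database, and a single migrations configuration targets it.

// Illustrative only: one DbContext covering everything stored in the one database.
public class ServicesContext : DbContext
{
    public DbSet<AppEntity> AppEntities { get; set; }
    public DbSet<CatalogEntity> CatalogEntities { get; set; }
    // ...portfolio and scheduler sets as well
}

// One migrations configuration for that single context.
internal sealed class ServicesConfiguration : DbMigrationsConfiguration<ServicesContext>
{
    public ServicesConfiguration()
    {
        AutomaticMigrationsEnabled = false;
    }
}

With a single configuration, plain enable-migrations and add-migration should work without needing the -conf switch.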