ServiceTracker in OSGi r4.1

I'm using an org.osgi.util.tracker.ServiceTracker (PrintableServiceTracker implements ServiceTrackerCustomizer and simply prints when a new service is added).
Filter filter = bc.createFilter("(objectClass=se.enea.print.Printable)");
tracker = new ServiceTracker(bc, filter, new PrintableServiceTracker(bc));
I've read about "pseudo registration" in the new ebook "OSGi in Action" and I wonder if I have to do pseudo registration explicitly or if the framework handles this automatically.
(Will already-installed Printable services be caught by the ServiceTracker? Will ServiceTracker.addingService(ServiceReference) be called for each of the pre-installed Printable services?)

Not sure what pseudo registration means, but the addingService method in PrintableServiceTracker will be called not only for newly registered services but also for matching services that were already registered when the tracker was opened.
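For illustration, a minimal sketch of that behaviour, assuming a Printable service was registered before the tracker is opened (bc is the BundleContext and PrintableServiceTracker is the customizer from the question):
Filter filter = bc.createFilter("(objectClass=se.enea.print.Printable)");
ServiceTracker tracker = new ServiceTracker(bc, filter, new PrintableServiceTracker(bc));
// open() scans the existing service registry, so addingService(...) fires once for
// every Printable that is already registered, and again for each one registered later.
tracker.open();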

Related

How to use services before app build in .NET 6.0

I achieved this earlier in .NET Core 3.1, but it doesn't seem possible with .NET 6 because Startup.cs has been removed.
I have registered a few services:
builder.Services.AddControllers();
// Learn more about configuring Swagger/OpenAPI at https://aka.ms/aspnetcore/swashbuckle
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
var appSettings = builder.Configuration.GetSection("AppSettings").Get<AppSettings>();
builder.Services.AddScoped<IEncryption, Encryption>();
//Here I need to get the IEncryption Service, and call the method in this service to encrypt/decrypt the connection string to pass to DBContext Service.
builder.Services.AddDbContext<CatalogDbContext>(options => options.UseNpgsql(
appSettings.ConnectionString));
var app = builder.Build();
Earlier, in .NET Core 3.1, I used BuildServiceProvider() to get the Encryption service and call its methods, which gave me the decrypted connection string that was then passed to the DbContext registration on the next line.
Now, in .NET 6/7, services can only be used after app = builder.Build(), so I can't register the DbContext after Build() has been called.
How can I solve this case? Any recommended approach to do this in .NET 6/7?
You can still use Startup.cs in .NET 6:
var builder = WebApplication.CreateBuilder(args);
var startup = new Startup(builder.Configuration);
startup.ConfigureServices(builder.Services); // calling ConfigureServices method
var app = builder.Build();
startup.Configure(app, builder.Environment); // calling Configure method
You can then use the ConfigureServices and Configure methods to register your services before building.
You didn't need to use BuildServiceProvider in .NET Core 3.1 either. AddDbContext has an overload that provides access to an IServiceProvider instance:
builder.Services.AddDbContext<CatalogDbContext>((services,options) =>{
var myOwnDecrypter=services.GetRequiredService<IMyOwnDecrypter>();
var cns=myOwnDecrypter.Decrypt(appSettings.ConnectionString,key);
options.UseNpgsql(cns);
});
or, if you use the ASP.NET Core Data Protection package:
builder.Services.AddDataProtection();
...
builder.Services.AddDbContext<CatalogDbContext>((services,options) =>{
var protector = services.GetDataProtector("Contoso.Example.v2");
var cns=protector.Unprotect(appSettings.ConnectionString);
options.UseNpgsql(cns);
});
or, if IConfiguration.GetConnectionString is used:
builder.Services.AddDataProtection();
...
builder.Services.AddDbContext<CatalogDbContext>((services,options) =>{
var conn_string=services.GetService<IConfiguration>()
.GetConnectionString("MyConnectionString");
var protector = services.GetDataProtector("Contoso.Example.v2");
var cns=protector.Unprotect(conn_string);
options.UseNpgsql(cns);
});
That said, it's the configuration provider's job to decrypt encrypted settings, not the service's or the context's. ASP.NET Core's configuration allows using multiple different configuration sources in the same host, not just a single settings file. There's nothing special about appsettings.json; that's just the default settings file name.
You can add another settings file with sensitive contents with AddJsonFile. That file could use the file system's encryption, e.g. NTFS encryption, to ensure it's only readable by the web app account.
You can read settings from a key management service, like HashiCorp Vault, Azure Key Vault, AWS Key Management Service, etc.
You can create your own provider that decrypts its input. The answers to this SO question show how to do this, and one of them inherits from JsonConfigurationProvider directly.
Important Caveat: In general, my suggestion below is a bad practice
Do not call BuildServiceProvider
Why is it bad? Calling BuildServiceProvider from application code results in more than one copy of singleton services being created, which might result in incorrect application behavior.
Justification: I think it is safe to call BuildServiceProvider as long as you haven't registered any singletons before calling it. Admittedly not ideal, but it should work.
You can still call BuildServiceProvider() in .NET 6:
builder.Services.AddScoped<IEncryption, Encryption>();
// create a service provider
var provider = builder.Services.BuildServiceProvider();
var encryption = provider.GetService<IEncryption>();
// use the service here
or alternatively
builder.Services.AddScoped<IEncryption, Encryption>();
var provider = builder.Services.BuildServiceProvider();
using (var scope = provider.CreateScope()) {
var encryption = scope.ServiceProvider.GetService<IEncryption>();
// use service here
}
Alternative:
You can still use the classic Startup structure in .NET 6/7. We upgraded our .NET Core 3.1 projects to .NET 6 without having to rewrite/restructure Startup.

Anyone using Serilog.Extras.MSOwin

I was wondering if anyone has seen a demo/example of using the Serilog.Extras.MSOwin package with a Web API project, or an example/tutorial of using Serilog with a Web API project.
Any help greatly appreciated,
Jim
I will take this question as "How do I use Serilog.Extras.MSOwin?" and, given that it is currently a rather small library, answer it here.
This reflects the current library (1.4.102) and is subject to change in the future.
Serilog.Extras.MSOwin provides two things: a Microsoft.Owin.Logging.ILoggerFactory implementation to have OWIN's logging infrastructure write to Serilog (more details about logging in OWIN in this blog post), and a Guid identifier (RequestId) for each web request to aid in associating logged events.
The Logging integration is done with the following:
IAppBuilder app = ...;
Serilog.ILogger logger = ...;
app.SetLoggerFactory( new Serilog.Extras.MSOwin.LoggerFactory( logger ) );
The request id functionality needs to be registered in the OWIN pipeline:
IAppBuilder app = ...;
app.UseSerilogRequestContext("RequestId");
You will want to register that very early in the pipeline because any logging occurring before that pipeline step will not have the request id available.
You will also need to retrieve it from the LogContext using Enrich.FromLogContext() and add that property to what you write to your sinks. For example:
const string DefaultOutputTemplate =
"{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} ({RequestId}) {Message}{NewLine}{Exception}";
ILogger logger =
new LoggerConfiguration().Enrich.FromLogContext()
.WriteTo
.RollingFile(
"log.txt",
outputTemplate: DefaultOutputTemplate)
.CreateLogger();
Serilog.Extras.MSOwin was superseded by SerilogWeb.Owin (which has since also been discontinued).

Best Way to Send Scheduled E-Mail in .NET MVC3 Application using MVCMailer

I am working on a .NET MVC3 C# Application. This application is hosted on our own Server.
Now I want to send scheduled e-mail in my application, e.g. daily (at a specific time), weekly, monthly, and so on.
Currently I am using MVCMailer to send Emails in my application.
I tried Fluent Scheduler to send scheduled e-mails, but it doesn't work with MVCMailer. It works fine if I send mails without MVCMailer, and for other scheduled jobs.
It gives me a NullReferenceException saying HttpContext cannot be null.
What can I do to solve this problem?
Please also suggest the best way to send e-mails in my application:
Windows Service (Having own server)
Scheduler (Fluent Scheduler)
SQL Scheduled jobs
I am attaching an error snapshot.
It could be that MVCMailer depends on an HttpContext, which will not exist on your scheduler's background threads.
You could consider scrapping MvcMailer and implementing your own templating solution. Something like RazorEngine (https://github.com/Antaris/RazorEngine) gives you the full power of Razor without having to run on top of an HTTP stack. You could still source your templates from disk so that your designers could modify them.
Then you could mail the results using the standard classes available in .NET.
For e.g.:
string template = File.ReadAllText(fileLocation); // "Hello @Model.Name, welcome to RazorEngine!"
string emailBody = Razor.Parse(template, new { Name = "World" });
SmtpClient client = new SmtpClient();
client.Host = "mail.yourserver.com";
MailMessage mm = new MailMessage();
mm.Sender = new MailAddress("foo@bar.com", "Foo Bar");
mm.From = new MailAddress("foo@bar.com", "Foo Bar");
mm.To.Add(new MailAddress("foo@bar.com", "Foo Bar"));
mm.Subject = "Test";
mm.Body = emailBody;
mm.IsBodyHtml = true;
client.Send(mm);
Obviously you could clean this all up, but it wouldn't take too much effort to use the above code and create some reusable classes. :)
Since you already have the FluentScheduler code set up, you may as well stick with that, I guess. A Windows service also sounds appealing, but I think that it's your call to make. If it's a simple mail service you are after, I can't think of any reason not to do it via FluentScheduler.
I have created a full example of this available here: https://bitbucket.org/acleancoder/razorengine-email-example/src/dfee804d526ef3cd17fb448970fbbe33f4e4bb79?at=default
You can download the website to run locally here: https://bitbucket.org/acleancoder/razorengine-email-example/downloads
Just make sure to change the Default.aspx.cs file to have your correct mail server details.
Hope this helps.
Since MVC Mailer works best in the HTTP stack (i.e. from controllers), I've found that a very reliable way to accomplish this is by using the Windows Task Scheduler from a server somewhere. You could even spin up a micro instance on Amazon Web Services.
Use "curl" to call the URL of your controller that does the work and sends the emails.
Just set up a Scheduled Task (or cron if you want to use *IX) to call "c:\path_to_curl\curl.exe http://yourserver.com/your_controller/your_action".
You could even spin up a *IX server on AWS to make it even cheaper.

I want my Domino Servlet to get an authenticated user session

It seems like a pretty fundamental question: in a running Servlet hosted on Domino, I want to access Domino resources that I have wisely protected using the very fine security of IBM Notes and Domino.
I want the Servlet to be able to read and write data to Domino whilst keeping that data from the client that called the Servlet (or xAgent) and preventing the client from writing directly.
I'd be happy to be able to get a session that represents the signer of the application. I can get a session for a registered user by calling the Servlet using ?open&login and signing in, but that's not practical.
I've looked here: How can you use SessionAsSigner in a Java Bean called from an XPage?, where Mark Leusink (https://stackoverflow.com/users/1177870/mark-leusink) implies that ExtLib's getCurrentSessionAsSigner() could be used. I've tried it, having signed the whole application with a single user id, and it doesn't return a session. The answer seems to lie in the Servlet's inability to get a FacesContext object.
This feels like the answer should be obvious but it isn't to me. Any ideas?
FacesContext is JSF stuff and can be used from an XAgent (i.e. an XPage).
In a servlet you can do this:
Session session = NotesFactory.createSession(null, "user", "password");
The server ID usually has no password, and doing this will use the server ID:
Session session = NotesFactory.createSession();
Check the source of the WebDAV project on OpenNTF. It has all the code you need.
There have been lots of good answers to the original question. Thanks very much.
The solution I propose is to port the code I have to OSGi plugins. It appears that Java code/Servlets within the NSF context are subject to security controls that are relaxed when the same code runs within the OSGi context. The code:
try {
    NotesThread.sinitThread();
    Session s = NotesFactory.createSession("", "<my username>", "<my password>");
    // .....
    s = null;
} catch (Exception e) {
} finally {
    NotesThread.stermThread();
}
runs fine in the OSGi context, but within an NSF it produces a security exception. Also worth noting:
com.ibm.domino.osgi.core.context.ContextInfo.getUserSession()
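As a rough sketch of how that call can be used (the servlet class name and wiring below are illustrative assumptions, not from the original post), an OSGi servlet can obtain the session of the authenticated web user like this:
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import lotus.domino.NotesException;
import lotus.domino.Session;
import com.ibm.domino.osgi.core.context.ContextInfo;

public class ExampleOsgiServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        try {
            // Session of the user that authenticated the current web request
            Session session = ContextInfo.getUserSession();
            resp.getWriter().println("Effective user: " + session.getEffectiveUserName());
        } catch (NotesException e) {
            throw new IOException(e);
        }
    }
}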
Jason - I assume you basically want the same functionality you would get from a Web Query Save agent that does not have "Run as Web User" selected, in other words running as the signer of the code.
You could try setting up an Internet Site rule to allow basic authentication for the specific application path you wanted to use - it might be worth using a subdomain for this.
Then, within the Servlet, call this URL whilst setting the Basic authorization parameters (username & password).
Something like this.
URL url = new URL(URL_TO_CALL);
String authStr = "USERNAME:PASSWORD";
String authEncoded = Base64.encodeBytes(authStr.getBytes());
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("GET");
connection.setDoOutput(true);
connection.setRequestProperty("Authorization", "Basic " + authEncoded);
InputStream is = connection.getInputStream();

Accessing entire netflix catalog via API v1.5

Netflix recently updated their API methods for obtaining the full Netflix catalog. I'm curious if anyone has had any success accessing these new XML documents and downloading them via API v1.5 (9/2012). Previously, you could download the entire Netflix catalog via one API call (which I had working perfectly). Now, there are supposedly two calls to make: one for DVDs and one for streaming movies.
I cannot make these calls return anything except an empty array. Please don't offer an answer unless you have personally downloaded the entire catalog via these new APIs.
Bonus points if you can tell me how to do it in Ruby.
http://developer.netflix.com/blog/read/Update_Changes_for_the_Public_API
This did it for me (it downloads the Netflix instant catalog). It's in PHP but can probably be easily rewritten in Ruby. This uses JR Collings' OAuthSimple:
$args = Array(
    'max_results' => 20,
    'start_index' => 0
);
// args don't matter, netflix doesn't listen here
// this is the URL path (note the lack of arguments)
$rpath = "http://api-public.netflix.com/catalog/titles/streaming";
// Create the Signature object.
$roauth = new OAuthSimple();
$rsigned = $roauth->sign(Array(
    'path' => $rpath,
    'parameters' => $args,
    'signatures' => Array(
        'consumer_key' => YOURKEY,
        'shared_secret' => YOURSECRET,
    )
));
$getxml = file_get_contents($rsigned['signed_url']);
file_put_contents("streaming.xml", $getxml);
