NLog blob storage extension with dynamic connection string - runtime

The BlobStorage target property is not changed as I expected. I use this code (of course, the connection string is the real Azure Blob Storage one):
LogManager.Configuration.Variables["simple-log-file-name"] = "simple-log.txt";
LogManager.Configuration.Variables["blob-container"] = "logs";
LogManager.Configuration.Variables["blobconst"] = "DefaultEndpointsProtocol=https;AccountName=....";
Target is setup in nlog.config:
<target xsi:type="AzureBlobStorage" name="simple-log-target" blobName="${var:simple-log-file-name}" container="${var:blob-container}" connectionString="${var:blobconst}"....
At this point NLog fails to set up with:
Error AzureBlobStorageTarget(Name=simple-log-target): Failed to create BlobClient with connectionString=. Exception:
System.ArgumentNullException: Value cannot be null. (Parameter 'connectionString')
If I put the Azure connection string in the nlog.config target, it works: it writes to the storage defined by nlog.config, into the container and blob that are set at runtime.
<target xsi:type="AzureBlobStorage" name="simple-log-target" blobName="${var:simple-log-file-name}" container="${var:blob-container}" connectionString="DefaultEndpointsProtocol=https;AccountName=...." ...
Is it possible to define this connectionString at runtime at all?

Instead of using NLog Configuration Variables, try using the GDC (GlobalDiagnosticsContext):
GlobalDiagnosticsContext.Set("simple-log-file-name", "simple-log.txt");
GlobalDiagnosticsContext.Set("blob-container", "logs");
GlobalDiagnosticsContext.Set("blobconst", "DefaultEndpointsProtocol=https;AccountName=....";
And set up NLog.config like this (make sure to assign the GDC values before creating the first NLog Logger object):
<target xsi:type="AzureBlobStorage" name="simple-log-target" blobName="${gdc:simple-log-file-name}" container="${gdc:blob-container}" connectionString="${gdc:blobconst}"
See also: NLog.Extensions.AzureBlobStorage - Azure ConnectionString

Related

Cannot bind environment variable to application.properties

I'm working with Spring Boot and PostgreSQL and failed to bind the database password in application.properties. I have already set DATABASE_PASSWORD in the environment, but it still fails to bind the property:
spring.datasource.url=jdbc:postgresql://${DATABASE_HOST}:${DATABASE_PORT}/${DATABASE_NAME}?reWriteBatchedInserts=true
spring.datasource.username=${DATABASE_USER}
spring.datasource.password=${DATABASE_PASSWORD}
Description:
Failed to bind properties under 'spring.datasource.password' to java.lang.String:
Property: spring.datasource.password
Value: ${DATABASE_PASSWORD}
Origin: class path resource [application.properties]:16:28
Reason: Could not resolve placeholder 'DATABASE_PASSWORD' in value "${DATABASE_PASSWORD}"
If you have set DATABASE_PASSWORD as a system environment variable as you say, then Spring should use it, as the documentation says:
The values in application.properties are filtered through the existing Environment when they are used, so you can refer back to previously defined values (for example, from System properties).
Did you try restarting?
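As a stop-gap while you investigate, Spring's standard placeholder syntax also accepts a default value, so the application can still start (the fallback changeme below is only a placeholder):
spring.datasource.password=${DATABASE_PASSWORD:changeme}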

Databricks - Reading BLOB from mounted file storage

I am using Azure Databricks and I ran the following Python code:
sas_token = "<my sas key>"
dbutils.fs.mount(
  source = "wasbs://<container>@<storageaccount>.blob.core.windows.net",
  mount_point = "/mnt/gl",
  extra_configs = {"fs.azure.sas.<container>.<storageaccount>.blob.core.windows.net": sas_token})
This seemed to run fine. So I then ran:
df = spark.read.text("/mnt/gl/glAgg_LE.csv")
Which gave me the error:
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Not sure what I'm doing wrong though. I'm pretty sure my sas key is correct.
OK, if you are getting this error, double-check both the SAS key and the container name.
Turned out I had pointed it to the wrong container!

NLog environment layout renderer doesn't work when run as a scheduled task

I'm using NLog 4.4.12 with .NET 4.6.2 on Windows Server 2012.
I've configured my app.config with the nlog section as follows:
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<include file="conf/nlog.${environment:MY_ENV}.config"/>
</nlog>
The idea here is that it will bring in an environment specific configuration for setting up NLog.
I set up a system environment variable (not user) called MY_ENV, and set its value to production.
If I run the program as the currently logged in user, the proper config file is found, and logs are written as expected.
However, if I set up a scheduled task through Windows Task Scheduler to run the program with the same user credentials, the environment variable seems not to be found.
I have configured the scheduled task to run whether the user is logged on or not, and I have enabled Run with highest privileges.
When I enable the internal diagnostics of NLog, I see the following in the diagnostic log:
2017-10-19 19:27:42.1744 Error Error when including 'conf/nlog..config'. Exception: System.IO.FileNotFoundException: Included file not found: E:\Utilities\MyApp\conf/nlog..config
at NLog.Config.XmlLoggingConfiguration.ParseIncludeElement(NLogXmlElement includeElement, String baseDirectory, Boolean autoReloadDefault)
2017-10-19 19:27:42.1744 Error Parsing configuration from E:\Utilities\MyApp\MyApp.exe.Config failed. Exception: NLog.NLogConfigurationException: Exception when parsing E:\Utilities\MyApp\MyApp.exe.Config. ---> NLog.NLogConfigurationException: Error when including: conf/nlog..config ---> System.IO.FileNotFoundException: Included file not found: E:\Utilities\MyApp\conf/nlog..config
at NLog.Config.XmlLoggingConfiguration.ParseIncludeElement(NLogXmlElement includeElement, String baseDirectory, Boolean autoReloadDefault)
--- End of inner exception stack trace ---
at NLog.Config.XmlLoggingConfiguration.ParseIncludeElement(NLogXmlElement includeElement, String baseDirectory, Boolean autoReloadDefault)
at NLog.Config.XmlLoggingConfiguration.ParseNLogElement(NLogXmlElement nlogElement, String filePath, Boolean autoReloadDefault)
at NLog.Config.XmlLoggingConfiguration.ParseTopLevel(NLogXmlElement content, String filePath, Boolean autoReloadDefault)
at NLog.Config.XmlLoggingConfiguration.Initialize(XmlReader reader, String fileName, Boolean ignoreErrors)
--- End of inner exception stack trace ---
As you can see, it seems to be getting an empty string or null value for the layout rendering of the path for the include file. This results in the file path getting rendered as: conf/nlog..config.
I'm not sure this is a problem with NLog specifically. I looked into the source code for the Environment Layout Renderer, and it eventually calls Environment.GetEnvironmentVariable(string).
I feel like this might have something more to do with the nature of Scheduled Tasks in Windows, but I'm not sure what my options are.
This task needs to run automatically whether the user is logged in or not.
Can someone explain what is going on here? Why is NLog unable to pull in the proper environment variable? What can I do to fix this problem?
I think this is the problem: when changing an environment variable, you need to restart Taskeng.exe.
Terminate Taskeng.exe, and the next time the scheduled task runs it will pick up the updated environment.
https://superuser.com/questions/331077/accessing-environment-variables-in-a-scheduled-task
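If restarting Taskeng.exe is not practical, a defensive workaround is to give the variable a value for the current process before NLog parses its configuration. A minimal sketch (the fallback value production is only an example):

using System;
using NLog;

class Program
{
    static void Main()
    {
        // A scheduled task can start with a stale environment; make sure
        // MY_ENV has a value in this process before NLog resolves
        // ${environment:MY_ENV} in the include path.
        if (string.IsNullOrEmpty(Environment.GetEnvironmentVariable("MY_ENV")))
        {
            Environment.SetEnvironmentVariable("MY_ENV", "production");
        }

        // The first Logger creation parses the config, including the
        // conf/nlog.production.config include.
        var logger = LogManager.GetCurrentClassLogger();
        logger.Info("Environment resolved");
    }
}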

Setting elasticsearch properties in spark-submit

I'm trying to launch Spark jobs that use Elasticsearch input via the command line using spark-submit, as described in http://www.elasticsearch.org/guide/en/elasticsearch/hadoop/current/spark.html
I'm setting the properties in a file, but when launching spark-submit it gives the following warnings:
~/spark-1.0.1-bin-hadoop1/bin/spark-submit --class Main --properties-file spark.conf SparkES.jar
Warning: Ignoring non-spark config property: es.resource=myresource
Warning: Ignoring non-spark config property: es.nodes=mynode
Warning: Ignoring non-spark config property: es.query=myquery
...
Exception in thread "main" org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed
My config file looks like (with correct values):
es.nodes nodeip:port
es.resource index/type
es.query query
Setting the properties in the Configuration object in the code works, but I need to avoid this workaround.
Is there a way to set those properties via command line?
I don't know if you resolved your issue (if so, how?), but I found this solution:
import org.elasticsearch.spark.rdd.EsSpark
EsSpark.saveToEs(rdd, "spark/docs", Map("es.nodes" -> "10.0.5.151"))
When you pass a config file to spark-submit, it only loads configs that start with 'spark.'
So, in my config I simply use
spark.es.nodes <es-ip>
and in the code itself I have to do
import org.apache.spark.SparkConf
val conf = new SparkConf()
// Copy the spark.-prefixed value into the key elasticsearch-hadoop expects
conf.set("es.nodes", conf.get("spark.es.nodes"))

DefaultDataPath is empty in VS2010 SQL2008 database deployment script

I have a VS2010 database project pointing to a SQL2005 database. When I deploy it, it correctly picks up the DefaultDataPath from the SQL instance and everything works.
Today, I changed the project type from SQL2005 to SQL2008 and changed the deploy properties to point to my SQL2008 server. However, now when I try to deploy, I get this error:
Error SQL01268: .Net SqlClient Data Provider: Msg 5105, Level 16, State 2, Line 1 A file activation error occurred. The physical file name '\AutoDeployedTRS.mdf' may be incorrect. Diagnose and correct additional errors, and retry the operation.
Error SQL01268: .Net SqlClient Data Provider: Msg 1802, Level 16, State 1, Line 1 CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
An error occurred while the batch was being executed.
The reason for this error is that the SQL script created by VS contains these three lines:
:setvar DatabaseName "AutoDeployedTRS"
:setvar DefaultDataPath "\"
:setvar DefaultLogPath "\"
If I check the SQL Instance properties (through the UI or by reading the registry), they are set correctly so it seems like VS2010 can't pick them up for some reason.
Any ideas?
Try going to Schema Objects\Database level objects\Storage\Files.
There you can find two files.
Open [your_database_name].sql and set the parameter
FILENAME = '$(DefaultDataPath)$(DatabaseName).mdf'.
Then open [your_database_name]_log.sql and set the parameter
FILENAME = '$(DefaultDataPath)$(DatabaseName)_log.ldf'.
After that, try to deploy your project. These parameters are now resolved during deployment from the current target database paths. Hope it helps.
