I'm using WebSphere 7.0.0.37 and Jython.
I need to change the 'Container-managed authentication alias', but unfortunately I can't find anything in the API, in the attributes of existing DataSources, or in any example for that task.
I have successfully changed the 'component-managed authentication alias' with:
AdminConfig.modify(DataSourceProvider, '[[name "basename"] [authDataAlias "' + nameNode + '/' + aliasJaas + '"]]')
How can I do that?
Thank you!
Here is some logic which you could use to solve your problem.
# Create new alias
cellName = AdminConfig.showAttribute(AdminConfig.list("Cell"), "name")
security = AdminConfig.getid('/Cell:' + cellName + '/Security:/')
myAlias = 'blahAlias'
user = 'blah'
pswd = 'blah'
jaasAttrs = [['alias', myAlias], ['userId', user], ['password', pswd ]]
print AdminConfig.create('JAASAuthData', security, jaasAttrs)
print "Alias = " + myAlias + " was created."
# Get a reference to your DataSource (assume you know how to do this):
myDS = ...
# Set new alias on DataSource
AdminConfig.create('MappingModule', myDS, '[[authDataAlias ' + myAlias + '] [mappingConfigAlias DefaultPrincipalMapping]]')
Note that if you can figure out how to do a given task in the Admin Console, you can use the "Command Assist" function to get a Jython snippet to do the equivalent via wsadmin. See here.
I am having trouble accessing an Azure container from Azure Databricks.
I followed the instructions from this tutorial, so I started by creating my container and generating a SAS.
Then, in a Databricks notebook, I ran the following command
dbutils.fs.mount(source = endpoint_source, mount_point = mountPoint_folder, extra_configs = {config : sas})
where I replaced endpoint_source, mountPoint_folder, and sas with the following:
container_name = "containertobesharedwithdatabricks"
storage_account_name = "atabricksstorageaccount"
storage_account_url = storage_account_name + ".blob.core.windows.net"
sas = "?sv=2021-06-08&ss=bfqt&srt=o&sp=rwdlacupiytfx&se=..."
endpoint_source = "wasbs://"+ storage_account_url + "/" + container_name
mountPoint_folder = "/mnt/projet8"
config = "fs.azure.sas."+ container_name + "."+ storage_account_url
but I ended with the following exception:
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container $root in account atabricksstorageaccount.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.
I cannot figure out why Databricks cannot find the root container.
Any help would be much appreciated. Thanks in advance.
The storage account and folder exist, as can be seen from this capture, so I am puzzled.
Using the same approach as yours, I got the same error:
Using the following code, I was able to mount successfully. Change the endpoint_source value to the format wasbs://<container-name>@<storage-account-name>.blob.core.windows.net.
endpoint_source = 'wasbs://data@blb2301.blob.core.windows.net'
mp = '/mnt/repro'
config = "fs.azure.sas.data.blb2301.blob.core.windows.net"
sas = "<sas>"
dbutils.fs.mount( source = endpoint_source, mount_point = mp, extra_configs = {config : sas})
My bad... I put a "/" instead of "@" between the container_name and the storage_account_url, and inverted the order, so the right syntax is:
endpoint_source = "wasbs://" + container_name + "@" + storage_account_url
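As a quick sanity check, the corrected URL assembly can be sketched in plain Python (the container and account names below are just the ones from the question):

```python
# Corrected wasbs endpoint: "container@account", not "account/container".
container_name = "containertobesharedwithdatabricks"
storage_account_name = "atabricksstorageaccount"
storage_account_url = storage_account_name + ".blob.core.windows.net"

endpoint_source = "wasbs://" + container_name + "@" + storage_account_url
config = "fs.azure.sas." + container_name + "." + storage_account_url

print(endpoint_source)
# → wasbs://containertobesharedwithdatabricks@atabricksstorageaccount.blob.core.windows.net
```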
I'm using SlackAPIPostOperator in Airflow to send Slack messages on task failures.
I wondered if there's a smart way to add a link to the Airflow UI logs page of the failed task to the Slack message.
The following is an example of what I want to achieve:
http://myserver-uw1.myaws.com:8080/admin/airflow/graph?execution_date=...&arrange=LR&root=&dag_id=MyDAG&_csrf_token=mytoken
The current message is:
def slack_failed_task(context):
    failed_alert = SlackAPIPostOperator(
        task_id='slack_failed',
        channel="#mychannel",
        token="...",
        text=':red_circle: Failure on: ' +
             str(context['dag']) +
             '\nRun ID: ' + str(context['run_id']) +
             '\nTask: ' + str(context['task_instance']))
    return failed_alert.execute(context=context)
You can build the URL to the UI with the config value base_url under the [webserver] section, and then use Slack's message format <http://example.com|stuff> for links.
from airflow import configuration

def slack_failed_task(context):
    link = '<{base_url}/admin/airflow/log?dag_id={dag_id}&task_id={task_id}&execution_date={execution_date}|logs>'.format(
        base_url=configuration.get('webserver', 'BASE_URL'),
        dag_id=context['dag'].dag_id,
        task_id=context['task_instance'].task_id,
        execution_date=context['ts'])  # equal to context['execution_date'].isoformat()
    failed_alert = SlackAPIPostOperator(
        task_id='slack_failed',
        channel="#mychannel",
        token="...",
        text=':red_circle: Failure on: ' +
             str(context['dag']) +
             '\nRun ID: ' + str(context['run_id']) +
             '\nTask: ' + str(context['task_instance']) +
             '\nSee ' + link + ' to debug')
    return failed_alert.execute(context=context)
We can also do this using the log_url attribute on the task instance:
def slack_failed_task(context):
    failed_alert = SlackAPIPostOperator(
        task_id='slack_failed',
        channel="#mychannel",
        token="...",
        text=':red_circle: Failure on: ' +
             str(context['dag']) +
             '\nRun ID: ' + str(context['run_id']) +
             '\nTask: ' + str(context['task_instance']) +
             '\nLogs: <{url}|to Airflow UI>'.format(url=context['task_instance'].log_url)
    )
    return failed_alert.execute(context=context)
This has been available since at least version 1.10.4.
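For what it's worth, the link construction itself is plain string formatting and can be tested outside Airflow. This Python 3 sketch (the function name and sample values are mine, not Airflow's) also URL-encodes the execution date, which the snippets above skip:

```python
from urllib.parse import quote

# Build a Slack-formatted link (<url|label>) to a task's log page.
# base_url, dag_id, task_id and execution_date are hypothetical samples.
def slack_log_link(base_url, dag_id, task_id, execution_date):
    url = "{0}/admin/airflow/log?dag_id={1}&task_id={2}&execution_date={3}".format(
        base_url, dag_id, task_id, quote(execution_date, safe=""))
    return "<{0}|logs>".format(url)

print(slack_log_link("http://localhost:8080", "MyDAG", "my_task",
                     "2019-01-01T00:00:00+00:00"))
# → <http://localhost:8080/admin/airflow/log?dag_id=MyDAG&task_id=my_task&execution_date=2019-01-01T00%3A00%3A00%2B00%3A00|logs>
```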
Is it possible to change a WebSphere datasource IP address? I've tried this script, but it doesn't work:
def updateDataSourceIP(newIP):
    datasources = AdminConfig.getid('/DataSource:/').splitlines()
    for datasource in datasources:
        propertySet = AdminConfig.showAttribute(datasource, 'propertySet')
        propertyList = AdminConfig.list('J2EEResourceProperty', propertySet).splitlines()
        for prop in propertyList:
            if (AdminConfig.showAttribute(prop, 'name') == 'serverName'):
                oldip = AdminConfig.showAttribute(prop, 'value')
                print "Updating serverName attribute of datasource '" + datasource + "' from " + oldip + " to " + newIP
                AdminConfig.modify(prop, '[[value ' + newIP + ']]')
    AdminConfig.reset()
In your example code, you are using
AdminConfig.reset()
at the end of the script, which discards all changes. Try switching to
AdminConfig.save()
I have a Jython script calling wsadmin libraries to configure a WAS server.
I have these functions:
def createWasObject(was_object_type, was_path, object_params):
    if isinstance(was_path, basestring):
        was_path = AdminConfig.getid(was_path)
    str_params = '['
    for k, v in object_params.items():
        str_params = str_params + '[' + k + ' "' + v + '"] '
    str_params = str_params + ']'
    return AdminConfig.create(was_object_type, was_path, str_params)

def createJdbcProviders(was_path, jdbc_providers):
    was_object_type = 'JDBCProvider'
    for jdbc_provider in jdbc_providers:
        jdbc = createWasObject(was_object_type, was_path, jdbc_provider['params'])
        print jdbc
        for datasource in jdbc_provider['datasources']:
            ds = createWasObject('Datasource', jdbc, datasource['params'])
            print ds
The "print jdbc" prints:
Teradata JDBC Provider(cells/jsr-websphere-1Cell01/nodes/jsr-websphere-1Node01/servers/jsr-business|resources.xml#JDBCProvider_1444648929602)
which looks like a correct object ID.
However, when using it to create a datasource, I get the following error:
WASX7017E: Exception received while running file "/root/jsr_auto_deployment/jsr.py"; exception information: com.ibm.ws.scripting.ScriptingException: Invalid object name: "Teradata JDBC Provider(cells/jsr-websphere-1Cell01/nodes/jsr-websphere-1Node01/servers/jsr-business|resources.xml#JDBCProvider_1444648929602)"
I am using Jython 2.7 through a thin client. Reusing an object returned by AdminConfig.create() worked well when the Jython script was run through wsadmin.sh.
My problem was on this line:
was_path = AdminConfig.getid(was_path)
Most of the time I was passing a string, but this time I was already passing an ID. So I removed the AdminConfig.getid() call from the function and added it to the callers only when necessary.
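Incidentally, the attribute-string building inside createWasObject is plain string work and can be checked outside wsadmin. A minimal Python 3 sketch (the function name is mine; keys are sorted only to make the output deterministic):

```python
# Build a wsadmin-style attribute string from a dict,
# e.g. {'name': 'ds1'} -> '[[name "ds1"] ]'.
def build_attr_string(object_params):
    str_params = '['
    for k in sorted(object_params):
        str_params = str_params + '[' + k + ' "' + object_params[k] + '"] '
    return str_params + ']'

print(build_attr_string({'name': 'TeradataDS', 'jndiName': 'jdbc/tera'}))
# → [[jndiName "jdbc/tera"] [name "TeradataDS"] ]
```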
How do I create a service on Windows XP Embedded (sc.exe is not installed)?
You could try using VBScript. I found this code here; the OP says it works perfectly, so it may be worth trying.
' Connect to WMI.
Set objServices = GetObject("winmgmts:root\cimv2")
' Obtain the definition of the Win32_Service class.
Set objService = objServices.Get("Win32_Service")
' Obtain an InParameters object specific to the Win32_Service.Create method.
Set objInParam = objService.Methods_("Create").inParameters.SpawnInstance_()
' Add the input parameters.
objInParam.Properties_.item("Name") = "GPServer" '< - Service Name
objInParam.Properties_.item("DisplayName") = "GPServer" '< - Display Name, what you see in the Services control panel
objInParam.Properties_.item("PathName") = "c:\Server\srvany.exe" '< - Path and Command Line of the executable
objInParam.Properties_.item("ServiceType") = 16
objInParam.Properties_.item("ErrorControl") = 0
objInParam.Properties_.item("StartMode") = "Manual"
objInParam.Properties_.item("DesktopInteract") = True
'objInParam.Properties_.item("StartName") = ".\Administrator" '< - If null, will run as Local System
'objInParam.Properties_.item("StartPassword") = "YourPassword" '< - Only populate if the StartName param is populated
'More parameters and return statuses are listed in MSDN: "Create Method of the Win32_Service Class"
' Execute the method and obtain the return status.
' The OutParameters object in objOutParams is created by the provider.
Set objOutParams = objService.ExecMethod_("Create", objInParam)
wscript.echo objOutParams.ReturnValue