SaltStack user.present does not set uid when creating a user - Windows

My goal is to have a user with a given uid. I try to have a simple user created with this very basic state:
Add Student:
  user.present:
    - name: Student
    - uid: 333123123123
    - allow_uid_change: True
333123123123 is just a dummy value. I'd like something more meaningful later, but this is what I use for testing.
This creates the user perfectly fine, but with a generated uid:
ID: Add Student
Function: user.present
Name: Student
Result: True
Comment: New user Student created
Started: 19:47:33.543457
Duration: 203.157 ms
Changes:
    ----------
    account_disabled:
        False
    account_locked:
        False
    active:
        True
    comment:
    description:
    disallow_change_password:
        False
    expiration_date:
        2106-02-07 07:28:15
    expired:
        True
    failed_logon_attempts:
        0
    fullname:
        Student
    gid:
    groups:
    home:
    homedrive:
    last_logon:
        Never
    logonscript:
    name:
        Student
    passwd:
        None
    password_changed:
        2022-02-21 19:47:33
    password_never_expires:
        False
    profile:
        None
    successful_logon_attempts:
        0
    uid:
        S-1-5-21-3207633127-2685365797-3805984769-1043
Summary
------------
Succeeded: 1 (changed=1)
Failed: 0
------------
Total states run: 1
Total run time: 203.157 ms
Now, if I try running state.apply again, I get the following message:
ID: Add Student
Function: user.present
Name: Student
Result: False
Comment: Encountered error checking for needed changes. Additional info follows:
- Changing uid (S-1-5-21-3207633127-2685365797-3805984769-1043 -> 333123123123) not permitted, set allow_uid_change to True to force this change. Note that this will not change file ownership.
Started: 19:47:45.503643
Duration: 7000.025 ms
Changes:
Summary
------------
Succeeded: 0
Failed: 1
------------
Total states run: 1
Total run time: 7.000 s
So the uid IS being considered, checked, and verified on subsequent runs, but it is not applied while creating the user. The syntax appears to be correct. Why is the uid not applied when the user is created?

On Windows, the uid Salt reports is the account's SID, which Windows assigns when the account is created. It is possible to change a user's SID, but it requires unsupported registry hacking; creating a new user with a specific SID would be even harder. Salt won't do that.
If you need to know the SID of a Windows user, you have to create the user first and then query it. If you need the SID in a following state in the same run, you can use slots.
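As an illustration of the slots approach, here is a minimal sketch (hypothetical state IDs and file path; it assumes Salt 3000 or later, where, to my knowledge, slot results support dot-notation indexing into the returned dictionary - check the Slots documentation for your version):

```yaml
Add Student:
  user.present:
    - name: Student

# A second state in the same run can read the freshly created user's SID
# via a slot; user.info(Student).uid resolves at state-rendering time.
Record Student SID:
  file.managed:
    - name: C:\salt\student_sid.txt
    - contents: __slot__:salt:user.info(Student).uid
    - require:
      - user: Add Student
```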


How to create a random string in Artillery for every connection

I am doing a load test for socket.io using Artillery
# my-scenario.yml
config:
  target: "http://localhost:4000"
  phases:
    - duration: 60
      arrivalRate: 10
  engines:
    socketio-v3: {transports: ['websocket']}
  socketio:
    query:
      address: "{{ $randomString() }}"
scenarios:
  - name: My sample scenario
    engine: socketio-v3
I need the address field to be a different random string of fixed length for every connection.
Currently, $randomString() generates only one random string, which is then reused for all connections, and I also can't control its length.
Thanks in advance!

Failed while loading data from data set SQLQueryDataSet

I am receiving this error:
DataSetError: Failed while loading data from data set SQLQueryDataSet(load_args={}, sql=select * from table)
when I run (within kedro jupyter notebook):
%reload_kedro
c:\users\name.virtualenvs\pipenv_kedro\lib\site-packages\ipykernel\ipkernel.py:283:DeprecationWarning: should_run_async will not call transform_cell automatically in the future. Please pass the result to transformed_cell argument and any exception that happen during the transform in preprocessing_exc_tuple in IPython 7.17 and above.
and should_run_async(code)
2021-04-21 15:29:12,278 - kedro.framework.session.store - INFO - read() not implemented for BaseSessionStore. Assuming empty store.
2021-04-21 15:29:12,696 - root - INFO - ** Kedro project Project
2021-04-21 15:29:12,698 - root - INFO - Defined global variable context, session and catalog
2021-04-21 15:29:12,703 - root - INFO - Registered line magic run_viz
Then this:
catalog.list()
#['table', 'parameters']
catalog.load('table')
where my catalog.yml file contains:
table:
  type: pandas.SQLQueryDataSet
  credentials: secret
  sql: select * from table
  layer: raw
However, I am able to pull back the expected result when I run this (within the same kedro jupyter notebook):
from kedro.extras.datasets.pandas import SQLQueryDataSet

sql = "select * from table"
credentials = {"con": secret}
data_set = SQLQueryDataSet(sql=sql, credentials=credentials)
sql_data = data_set.load()
How can I fix this error?
The discrepancy, I believe, comes from the credentials. In your catalog you had
table:
  type: pandas.SQLQueryDataSet
  credentials: secret
but in the notebook you were testing with
credentials = {"con": secret}
The value mapped in the YAML file should match the name of an entry in credentials.yml, so something like:
# in catalog.yml
table:
  type: pandas.SQLQueryDataSet
  credentials: db_creds

# in credentials.yml
db_creds:
  con: secret
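In other words, the catalog resolves the credentials *name* in catalog.yml to the matching dict in credentials.yml before constructing the dataset. A simplified sketch of that lookup (my own illustration, not kedro's actual code; `db_creds` is a placeholder name):

```python
def resolve_credentials(catalog_entry, credentials):
    """Replace the credentials name in a catalog entry with the matching
    dict from credentials.yml, roughly as kedro does when building the
    DataCatalog."""
    entry = dict(catalog_entry)  # don't mutate the caller's dict
    name = entry.pop("credentials", None)
    if name is not None:
        if name not in credentials:
            raise KeyError(f"Unable to find credentials '{name}'")
        entry["credentials"] = credentials[name]
    return entry
```

With `credentials: secret` in the catalog but no `secret` entry in credentials.yml, the dataset never receives a usable `con`, which matches the load failure you saw.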

How do you start a workflow from another workflow and retrieve the return value of the called workflow

I am testing Google Workflows and would like to call a workflow from another workflow, but as a separate process (not a subworkflow).
I am able to start the execution, but I am currently unable to retrieve the return value. Instead, I receive an instance of the execution:
{
  "argument": "null",
  "name": "projects/xxxxxxxxxxxx/locations/us-central1/workflows/child-workflow/executions/9fb4aa01-2585-42e7-a79f-cfb4b57b22d4",
  "startTime": "2020-12-09T01:38:07.073406981Z",
  "state": "ACTIVE",
  "workflowRevisionId": "000003-cf3"
}
parent-workflow.yaml
main:
  params: [args]
  steps:
    - callChild:
        call: http.post
        args:
          url: 'https://workflowexecutions.googleapis.com/v1beta/projects/my-project/locations/us-central1/workflows/child-workflow/executions'
          auth:
            type: OAuth2
            scope: 'https://www.googleapis.com/auth/cloud-platform'
        result: callresult
    - returnValue:
        return: ${callresult.body}
child-workflow.yaml:
- getCurrentTime:
    call: http.get
    args:
      url: https://us-central1-workflowsample.cloudfunctions.net/datetime
    result: CurrentDateTime
- readWikipedia:
    call: http.get
    args:
      url: https://en.wikipedia.org/w/api.php
      query:
        action: opensearch
        search: ${CurrentDateTime.body.dayOfTheWeek}
    result: WikiResult
- returnOutput:
    return: ${WikiResult.body[1]}
Also, as an added question: how can I create a dynamic URL from a variable? ${} doesn't seem to work there.
As executions are async API calls, you need to POLL the execution to see when the workflow has finished.
You can have the following algorithm:
main:
  steps:
    - callChild:
        call: http.post
        args:
          url: ${"https://workflowexecutions.googleapis.com/v1beta/projects/"+sys.get_env("GOOGLE_CLOUD_PROJECT_ID")+"/locations/us-central1/workflows/http_bitly_secrets/executions"}
          auth:
            type: OAuth2
            scope: 'https://www.googleapis.com/auth/cloud-platform'
        result: workflow
    - waitExecution:
        call: CloudWorkflowsWaitExecution
        args:
          execution: ${workflow.body.name}
        result: workflow
    - returnValue:
        return: ${workflow}

CloudWorkflowsWaitExecution:
  params: [execution]
  steps:
    - init:
        assign:
          - i: 0
          - valid_states: ["ACTIVE","STATE_UNSPECIFIED"]
          - result:
              state: ACTIVE
    - check_condition:
        switch:
          - condition: ${result.state in valid_states AND i<100}
            next: iterate
        next: exit_loop
    - iterate:
        steps:
          - sleep:
              call: sys.sleep
              args:
                seconds: 10
          - process_item:
              call: http.get
              args:
                url: ${"https://workflowexecutions.googleapis.com/v1beta/"+execution}
                auth:
                  type: OAuth2
              result: result
          - assign_loop:
              assign:
                - i: ${i+1}
                - result: ${result.body}
        next: check_condition
    - exit_loop:
        return: ${result}
What you see here is a CloudWorkflowsWaitExecution subworkflow that loops at most 100 times with a 10-second delay between polls; it stops when the called workflow has finished and returns the result.
The output is:
argument: 'null'
endTime: '2020-12-09T13:00:11.099830035Z'
name: projects/985596417983/locations/us-central1/workflows/call_another_workflow/executions/05eeefb5-60bb-4b20-84bd-29f6338fa66b
result: '{"argument":"null","endTime":"2020-12-09T13:00:00.976951808Z","name":"projects/985596417983/locations/us-central1/workflows/http_bitly_secrets/executions/2f4b749c-4283-4c6b-b5c6-e04bbcd57230","result":"{\"archived\":false,\"created_at\":\"2020-10-17T11:12:31+0000\",\"custom_bitlinks\":[],\"deeplinks\":[],\"id\":\"j.mp/2SZaSQK\",\"link\":\"//<edited>/2SZaSQK\",\"long_url\":\"https://cloud.google.com/blog\",\"references\":{\"group\":\"https://api-ssl.bitly.com/v4/groups/Bg7eeADYBa9\"},\"tags\":[]}","startTime":"2020-12-09T13:00:00.577579042Z","state":"SUCCEEDED","workflowRevisionId":"000001-478"}'
startTime: '2020-12-09T13:00:00.353800247Z'
state: SUCCEEDED
workflowRevisionId: 000012-cb8
In the result there is a subkey that holds the result of the external workflow execution.
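The polling pattern in that subworkflow is generic; a minimal Python sketch of the same algorithm (my own helper names, not part of any Workflows API - `fetch_state` stands in for the GET call on the execution resource):

```python
import time

def wait_for_execution(fetch_state, interval=10, max_polls=100,
                       pending_states=("ACTIVE", "STATE_UNSPECIFIED")):
    """Poll fetch_state() until the execution leaves a pending state or
    the poll budget is exhausted, mirroring CloudWorkflowsWaitExecution:
    sleep between polls, re-fetch, count iterations."""
    result = fetch_state()
    polls = 0
    while result["state"] in pending_states and polls < max_polls:
        time.sleep(interval)
        result = fetch_state()
        polls += 1
    return result
```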
The best method is now the workflows.executions.run helper method, which formats the request and blocks until the workflow execution has completed:
- run_execution:
    try:
      call: googleapis.workflowexecutions.v1.projects.locations.workflows.executions.run
      args:
        workflow_id: ${workflow}
        location: ${location}   # Defaults to current location
        project_id: ${project}  # Defaults to current project
        argument: ${arguments}  # Arguments could be specified inline as a map instead.
      result: r1
    except:
      as: e
      steps: ... # handle a failed execution

How do I transfer data from one Oracle database to another Oracle database with Airflow

I'm trying to transfer data from oracle_db1.table1 to oracle_db2.table1. I've already installed the backport provider: https://pypi.org/project/apache-airflow-backport-providers-oracle/.
The import works fine now, but when I try my first task I get the error below. I think it's something about the connection:
Here the error log
[2020-08-18 12:30:15,485] {logging_mixin.py:112} INFO -
[2020-08-18 12:30:15,485] {base_hook.py:84}
INFO - Using connection to: id: DB1234.
Host: 192.168.50.123:1521/testserver, Port: 1521, Schema: blup, Login: blup, Password: xxXXX, extra: None
[2020-08-18 12:30:15,485] {logging_mixin.py:112} INFO -
[2020-08-18 12:30:15,485] {connection.py:342} ERROR - Expecting value: line 1 column 1 (char 0).
And here is my example DAG Task:
T3 = OracleToOracleOperator(
    task_id="insert_data_to_db",
    oracle_destination_conn_id="BCDEFG",
    destination_table="BCDEFG.TEST_BENUTZER3",
    oracle_source_conn_id="DESTINATION_DB",
    source_sql="""
        SELECT * FROM DESTINATION_DB.BENUTZER
    """,
    source_sql_params=None,
    rows_chunk=5000,
)
Thanks in advance
The problem was with the connection: there were inputs in the "Extra" field. I deleted them, and then it worked.
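This is consistent with the error text: `Expecting value: line 1 column 1 (char 0)` is the message Python's `json` module raises when asked to parse something that is not JSON, and the connection's Extra field is expected to hold a JSON string. (My reading of the traceback, not verified against Airflow's source.) A quick demonstration:

```python
import json

# A non-JSON value, like free text typed into the Extra field,
# fails to parse with exactly the message from the Airflow log:
try:
    json.loads("not valid json")
except json.JSONDecodeError as err:
    print(err)  # Expecting value: line 1 column 1 (char 0)
```

So either leave Extra empty or make sure it contains valid JSON, e.g. `{"encoding": "UTF-8"}`.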

Symfony 2 gives more than one validation error message

My validation.yml is given:
task:
  - Email:
      message: The email "{{ value }}" is not a valid email.
  - MinLength: { limit: 50, message: You must be 50 or under to enter. }
My issue is that if I enter "wrong-email" in the task field, it gives two error messages:
The email "wrong-email" is not a valid email.
You must be 50 or under to enter.
Actually, I want to show only one error message at a time.
That is, it should check the MinLength constraint only if the value is a valid email.
Validation sequencing can be done using group sequences. I fixed group sequences for the YAML driver only today, so you might need to wait for the next release of the 2.0 or master branch.
MyEntity:
  group_sequence: [MyEntity, Extra]
  properties:
    task:
      - Email: { message: ... }
      - MinLength: { limit: 50, message: ..., groups: Extra }
Now the constraints in group "Extra" will only be validated if all constraints in group "MyEntity" (i.e. the default group) succeed.
