Azure Databricks API: import entire directory with notebooks

I need to import many notebooks (both Python and Scala) into Databricks using the Databricks REST API 2.0.
My source path (local machine) is ./db_code and my destination (Databricks workspace) is /Users/dmitriy@kagarlickij.com
I'm trying to build a 2.0/workspace/import call, so my body is: { "content": "$SOURCES_PATH", "path": "$DESTINATION_PATH", "format": "SOURCE", "language": "SCALA", "overwrite": true }
However, I'm getting the error Could not parse request object: Illegal character, and per the documentation, content must be "The base64-encoded content".
Should I encode all notebooks to base64?
Are there any examples of importing a directory to Databricks using the API?

After some research I've managed to get it working:
Write-Output "Task: Create Databricks Directory Structure"
Get-ChildItem "$SOURCES_PATH" -Recurse -Directory |
Foreach-Object {
    $DIR = $_.FullName.split("$WORKDIR_PATH/")[1]
    $BODY = @{
        "path" = "$DESTINATION_PATH/$DIR"
    }
    $BODY_JSON = $BODY | ConvertTo-Json
    Invoke-RestMethod -Method POST -Uri "https://$DATABRICKS_REGION.azuredatabricks.net/api/2.0/workspace/mkdirs" -Headers $HEADERS -Body $BODY_JSON | Out-Null
}
Write-Output "Task: Deploy Scala notebooks to Databricks"
Get-ChildItem "$SOURCES_PATH" -Recurse -Include *.scala |
Foreach-Object {
    $NOTEBOOK_NAME = $_.FullName.split("$WORKDIR_PATH/")[1]
    $NOTEBOOK_BASE64 = [Convert]::ToBase64String([IO.File]::ReadAllBytes("$_"))
    $BODY = @{
        "content" = "$NOTEBOOK_BASE64"
        "path" = "$DESTINATION_PATH/$NOTEBOOK_NAME"
        "language" = "SCALA"
        "overwrite" = "true"
        "format" = "SOURCE"
    }
    $BODY_JSON = $BODY | ConvertTo-Json
    Invoke-RestMethod -Method POST -Uri "https://$DATABRICKS_REGION.azuredatabricks.net/api/2.0/workspace/import" -Headers $HEADERS -Body $BODY_JSON | Out-Null
}

Yes, the notebooks must be encoded to base64. You can use PowerShell to achieve this; try the below for a single notebook file.
$BinaryContents = [System.IO.File]::ReadAllBytes("$SOURCES_PATH")
$EncodedContents = [System.Convert]::ToBase64String($BinaryContents)
The body for the REST call would look like this.
{
    "format": "SOURCE",
    "content": "$EncodedContents",
    "path": "$DESTINATION_PATH",
    "overwrite": "true",
    "language": "SCALA"
}
With respect to importing whole directories, you could loop through the files and folders in your local directory using PowerShell and make a REST call for each notebook, as sketched below.
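For example, a minimal sketch of that loop (untested, assuming the same $SOURCES_PATH, $DESTINATION_PATH, $DATABRICKS_REGION and $HEADERS variables as above, and covering only .scala and .py files):
# Sketch: recurse through the local directory, base64-encode each notebook,
# and POST it to /api/2.0/workspace/import. The variables ($SOURCES_PATH,
# $DESTINATION_PATH, $DATABRICKS_REGION, $HEADERS) are assumed from the question.
$SOURCES_ROOT = (Get-Item "$SOURCES_PATH").FullName
Get-ChildItem "$SOURCES_PATH" -Recurse -Include *.scala, *.py |
Foreach-Object {
    # Path relative to the source root, with forward slashes for the workspace
    $NOTEBOOK_NAME = $_.FullName.Substring($SOURCES_ROOT.Length).TrimStart('\', '/') -replace '\\', '/'
    # Guess the language from the file extension (this sketch handles only two)
    $LANGUAGE = if ($_.Extension -eq ".py") { "PYTHON" } else { "SCALA" }
    $BODY = @{
        "content" = [Convert]::ToBase64String([IO.File]::ReadAllBytes($_.FullName))
        "path" = "$DESTINATION_PATH/$NOTEBOOK_NAME"
        "language" = $LANGUAGE
        "overwrite" = "true"
        "format" = "SOURCE"
    }
    $BODY_JSON = $BODY | ConvertTo-Json
    Invoke-RestMethod -Method POST -Uri "https://$DATABRICKS_REGION.azuredatabricks.net/api/2.0/workspace/import" -Headers $HEADERS -Body $BODY_JSON | Out-Null
}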

Related

PowerShell to download Zip file from GitHub API

I would like to write a PowerShell script to download the GitHub repo in ZIP format by following this instruction:
https://docs.github.com/en/rest/reference/repos#contents
$Token = 'MyUserName:MyPAT'
$Base64Token = [System.Convert]::ToBase64String([char[]]$Token)
$Headers = @{
    "Authorization" = 'Basic {0}' -f $Base64Token;
    "accept" = "application/vnd.github.v3+json"
}
$Uri = "https://api.github.com/repos/{owner}/{repo}/zipball"
$r = Invoke-WebRequest -Headers $Headers -Uri $Uri -Method Get | Out-File "D:\MyRepo.zip"
The code did download the zip file but I got this error message when I tried to open the zip file:
D:\MyRepo.zip
The archive is either in unknown format or damaged
I am very new to PowerShell, any help is appreciated!
You may need to look more closely at the download-a-repository-archive-zip instructions. They say the response will have a 302 redirect to the URL for downloading. Invoke-WebRequest will not automatically redirect, but it will provide the response headers.
If you change your last line to be:
$response = Invoke-WebRequest -Headers $Headers -Uri $Uri -Method Get
you can review the $response object's Headers and issue another Invoke-WebRequest with the same headers and the 302 Uri:
$RedirectedResponse = Invoke-WebRequest -Headers $Headers -Uri $RedirectedURI -Method Get
$RedirectedResponse.Content will have the encoded file contents that you can decode and write to your local filesystem.
EDIT: I got to a system where I had GitHub access and tested the script. I found that the first response had a byte array with the zip file contents. This functionality is too useful not to share! Here's a script that works to download a repo:
$user = 'bjorkstromm'
$repo = 'depends'
$uri = "https://api.github.com/repos/$user/$repo/zipball/"
if(!$cred){$cred = Get-Credential -Message 'Provide GitHub credentials' -UserName $user}
$headers = @{
    "Authorization" = "Basic " + [convert]::ToBase64String([char[]] ($cred.GetNetworkCredential().UserName + ':' + $cred.GetNetworkCredential().Password))
    "Accept" = "application/vnd.github.v3+json"
}
$response = Invoke-WebRequest -Method Get -Headers $headers -Uri $uri
$filename = $response.headers['content-disposition'].Split('=')[1]
Set-Content -Path (join-path "$HOME\Desktop" $filename) -Encoding byte -Value $response.Content

PowerShell: Find the currently logged on user and copy a unique file from a shared drive

Title says most of it. I have generated unique files for users who will be running their scripts remotely. The script is supposed to find the name of the currently logged on user and copy that user's unique file to C:\Users\Public. Currently, however, I am running into an issue where the system seems to default to my username. I have tried multiple methods sourced from here and Stack Overflow and cannot seem to get a good result, as everyone ends up with my unique file. I have tried the following:
$env:username
$env:userprofile
$currentuser=[System.Security.Principal.WindowsIdentity]::GetCurrent().Name
The script looks like this:
$currentuser=[System.Security.Principal.WindowsIdentity]::GetCurrent().Name
if ($currentuser = "Domain/username1") {
copy-item -Path "shared network location\username1file" -Destination "C:\Users\Public"
}
elseif ($currentuser = "Domain\username2") {
copy-item -Path "shared network location\username2file" -Destination "C:\Users\Public"
}
elseif ($currentuser = "domain\username3") {
copy-item -Path "shared network location\username3file" -Destination "C:\Users\Public"
}
Can anyone provide me any advice on how to fix this?
To get the currently logged on user, rather than the user currently running the script, you can use WMI Win32_ComputerSystem. Note also that your if conditions use = (assignment) instead of -eq (comparison), so the first branch always runs no matter who is logged on.
Also, I would recommend using switch instead of multiple elseif:
$currentuser = (Get-WmiObject -Class Win32_ComputerSystem).UserName
$file = switch ($currentuser) {
    'Domain\username1' { "shared network location\username1file"; break }
    'Domain\username2' { "shared network location\username2file"; break }
    'Domain\username3' { "shared network location\username3file"; break }
    default { $null }
}
if ($file) {
    Copy-Item -Path $file -Destination "C:\Users\Public"
}
else {
    Write-Host "No file to copy defined for user '$currentuser'"
}

Downloading certain files using PowerShell produces corrupt files

So I have a PowerShell script that I wrote which crawls through a particular website and downloads all of the software hosted on the site to my local machine. The website in question is nirsoft.net, and I will include the full script below. Anyway, while most of the file downloads completed successfully, I noticed something odd: several files were not downloaded successfully, each resulting in a corrupt file of 4KB.
For those of you who are familiar with Nirsoft's software, the tools are very powerful, but they are also constantly misidentified as dangerous because of the password cracking tools. If I were to guess as to why this is happening, I would guess that, because PowerShell's Invoke-WebRequest cmdlet uses Internet Explorer's engine for its core functionality, Internet Explorer is flagging the files as dangerous and refusing to download them, thus causing PowerShell to fail to download the file. I confirmed this by trying to manually download each of the corrupt files using Internet Explorer, which marked them all as malicious. However, this is where things get strange. To bypass this limitation, I attempted a variety of other methods to download the files within my script, like using a pure dotnet object ( (New-Object System.Net.WebClient).DownloadFile("url","file") ) and even some third-party command line tools (wget for Windows, wget in Cygwin, etc.), but no matter what I tried, not a single alternative method was able to download a non-corrupt file. So what I want to know is whether there is a way around this, and why even third-party tools are affected. Is there some kind of rule that any scripting tool has to use Internet Explorer's engine to connect to the internet? Thanks in advance.
And without further ado, here is the script. Thanks again:
$VerbosePreference = "Continue"
$DebugPreference = "Continue"
$present = $true
$subdomain = $null
$prods = (Invoke-WebRequest "https://www.nirsoft.net/utils/index.html").links
Foreach ($thing in $prods)
{
If ($thing.Innertext -match "([A-Za-z]|\s)+v\d{1,3}\.\d{1,3}(.)*")
{
If ($thing.href.Contains("/"))
{
}
$page = Invoke-WebRequest "https://www.nirsoft.net/utils/$($thing.href)"
If ($thing.href -like "*dot_net_tools*")
{
$prodname = $thing.innerText.Trim().Split(" ")
}
Else
{
$prodname = $thing.href.Trim().Split(".")
}
$newlinks = $page.links | Where-Object {$_.Innertext -like "*Download*" -and ($_.href.endswith("zip") -or $_.href.endswith("exe"))}
# $page.ParsedHtml.title
#$newlinks.href
Foreach ($item in $newlinks)
{
$split = $item.href.Split("/")
If ($item.href -like "*toolsdownload*")
{
Try
{
Write-host "https://www.nirsoft.net$($item.href)"
Invoke-WebRequest "https://www.nirsoft.net$($item.href)" -OutFile "$env:DOWNLOAD\test\$($split[-1])" -ErrorAction Stop
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
elseif ($item.href.StartsWith("http") -and $item.href.Contains(":"))
{
Try
{
Write-host "$($item.href)"
Invoke-WebRequest $item.href -OutFile "$env:DOWNLOAD\test\$($split[-1])" -ErrorAction Stop
}
Catch
{
Write-Host "$($item.href)" -ForegroundColor Red
}
}
Elseif ($thing.href -like "*/dot_net_tools*")
{
Try
{
Invoke-WebRequest "https://www.nirsoft.net/dot_net_tools/$($item.href)" -OutFile "$env:DOWNLOAD\test\$($split[-1])" -ErrorAction Stop
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
Else
{
Try
{
Write-Host "https://www.nirsoft.net/utils/$($item.href)"
Invoke-WebRequest "https://www.nirsoft.net/utils/$($item.href)" -OutFile "$env:DOWNLOAD\test\$($item.href)" -ErrorAction Stop
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
If ($item.href.Contains("/"))
{
If (!(Test-Path "$env:DOWNLOAD\test\$($split[-1])"))
{
$present = $false
}
}
Else
{
If (!(Test-Path "$env:DOWNLOAD\test\$($item.href)"))
{
$present = $false
}
}
}
}
}
If ($present)
{
Write-Host "All of the files were downloaded!!!" -ForegroundColor Green
}
Else
{
Write-Host "Not all of the files downloaded. Something went wrong." -ForegroundColor Red
}
You have two separate issues.
For anything Defender flags, it doesn't matter what you use to save it to disk. You could simply add an exclusion for the directory in Defender.
The other issue is the one pointed out by Guenther: you need to provide a referrer on at least some of the downloads. With the following changes I was able to download them all.
$VerbosePreference = "Continue"
$DebugPreference = "Continue"
$present = $true
$subdomain = $null
$path = "c:\temp\downloadtest"
New-Item $path -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
Add-MpPreference -ExclusionPath $path
$prods = (Invoke-WebRequest "https://www.nirsoft.net/utils/index.html").links
Foreach ($thing in $prods)
{
If ($thing.Innertext -match "([A-Za-z]|\s)+v\d{1,3}\.\d{1,3}(.)*")
{
If ($thing.href.Contains("/"))
{
}
$page = Invoke-WebRequest "https://www.nirsoft.net/utils/$($thing.href)"
If ($thing.href -like "*dot_net_tools*")
{
$prodname = $thing.innerText.Trim().Split(" ")
}
Else
{
$prodname = $thing.href.Trim().Split(".")
}
$newlinks = $page.links | Where-Object {$_.Innertext -like "*Download*" -and ($_.href.endswith("zip") -or $_.href.endswith("exe"))}
# $page.ParsedHtml.title
#$newlinks.href
Foreach ($item in $newlinks)
{
$split = $item.href.Split("/")
If ($item.href -like "*toolsdownload*")
{
Try
{
Write-host "https://www.nirsoft.net$($item.href)"
Invoke-WebRequest "https://www.nirsoft.net$($item.href)" -OutFile "$path\$($split[-1])" -ErrorAction Stop -Headers #{Referer="https://www.nirsoft.net$($item.href)"}
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
elseif ($item.href.StartsWith("http") -and $item.href.Contains(":"))
{
Try
{
Write-host "$($item.href)"
Invoke-WebRequest $item.href -OutFile "$path\$($split[-1])" -ErrorAction Stop -Headers @{Referer="$($item.href)"}
}
Catch
{
Write-Host "$($item.href)" -ForegroundColor Red
}
}
Elseif ($thing.href -like "*/dot_net_tools*")
{
Try
{
Invoke-WebRequest "https://www.nirsoft.net/dot_net_tools/$($item.href)" -OutFile "$path\$($split[-1])" -ErrorAction Stop -Headers #{Referer="https://www.nirsoft.net/dot_net_tools/$($item.href)"}
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
Else
{
Try
{
Write-Host "https://www.nirsoft.net/utils/$($item.href)"
Invoke-WebRequest "https://www.nirsoft.net/utils/$($item.href)" -OutFile "$path\$($item.href)" -ErrorAction Stop -Headers #{Referer="https://www.nirsoft.net/utils/$($item.href)"}
}
Catch
{
Write-Host $thing.href -ForegroundColor Red
}
}
If ($item.href.Contains("/"))
{
If (!(Test-Path "$path\$($split[-1])"))
{
$present = $false
}
}
Else
{
If (!(Test-Path "$path\$($item.href)"))
{
$present = $false
}
}
}
}
}
If ($present)
{
Write-Host "All of the files were downloaded!!!" -ForegroundColor Green
}
Else
{
Write-Host "Not all of the files downloaded. Something went wrong." -ForegroundColor Red
}
I'd also recommend turning the download routine into a function that you can pass the relative URL portion, so you don't have to repeat the same code several times.
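For example, a rough sketch of such a helper (untested; the function name is hypothetical):
# Hypothetical helper: one download routine that all the branches above can
# call, reusing the target directory and the referrer workaround.
Function Get-NirsoftFile
{
    Param(
        [string]$Url,   # absolute download URL
        [string]$Path   # local target directory
    )
    Try
    {
        Write-Host $Url
        # Derive the file name from the last URL segment and send the referrer
        Invoke-WebRequest $Url -OutFile (Join-Path $Path ($Url.Split("/")[-1])) -ErrorAction Stop -Headers @{Referer = $Url}
    }
    Catch
    {
        Write-Host $Url -ForegroundColor Red
    }
}
# Each branch then reduces to a single call, e.g.:
# Get-NirsoftFile -Url "https://www.nirsoft.net$($item.href)" -Path $path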

How to update configuration in all environments in UrbanCode using a REST call?

I am trying to make REST API calls to update configuration in all environments in UrbanCode.
Is there any REST client for this, or do we need to write custom code?
How do I start? Your kind suggestions, please.
Are you asking about updating environment properties? If so, I do this with PowerShell.
$webUrl = "https://ibm-ucd.myCompany.com"
$ucdApiUserName = "yourCLIAccount"
$ucdApiPass = "yourCLIpassword"
$appName = "MyApplication"
$environment = "ADT"
$propertyNewValue = "myNewValue"
$credential = New-Object System.Management.Automation.PSCredential ($ucdApiUserName,(ConvertTo-SecureString $ucdApiPass -AsPlainText -Force))
####################################################################
## Bypass Cert Issues with connecting to HTTPS API
####################################################################
$certData = [string][System.Net.ServicePointManager]::CertificatePolicy
if ($certData -ne "TrustAllCertsPolicy")
{
add-type #"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"#
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
}
$hash = [ordered] @{
    "application" = "$AppName";
    "environment" = "$environment";
    "isSecure" = "false";
    "name" = "myEnvProperty";
    "value" = "$propertyNewValue"
}
$newValuesJson = $hash | ConvertTo-Json
Write-Host "Updating uDeploy environment properties for $environment"
$uri = "$WebUrl/cli/environment/propValue?"
Invoke-RestMethod -Method 'PUT' -ContentType "application/json" -Credential $credential -Uri $uri -Body $newValuesJson | Out-Null
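If you need to update the property in every environment, you can wrap the tail of the script in a loop over environment names. A quick sketch (the names in $environments are placeholders for your actual ones):
# Sketch only: update the same property in several environments.
$environments = @("ADT", "QA", "PROD")   # placeholder names
foreach ($environment in $environments) {
    # Re-point the same request body at the next environment
    $hash["environment"] = "$environment"
    $newValuesJson = $hash | ConvertTo-Json
    Write-Host "Updating uDeploy environment properties for $environment"
    Invoke-RestMethod -Method 'PUT' -ContentType "application/json" -Credential $credential -Uri $uri -Body $newValuesJson | Out-Null
}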

Batch script to queue builds in TFS 2015/2017

I was trying to execute builds using a batch script. I wrote one, but I'm getting this error:
please define the Build definition name.
The command: tfsbuild start /collection:https://tfs.prod.dcx.int.bell.ca/tfs/bellca/Consumer/builds/All Definitions/{Release}/{Project-name}/{Build definition name}
How can I fix this?
The tfsbuild command line tool is only for XAML builds. For modern builds, you'll need to use the REST API or the C# wrapper for the REST API.
The documentation has good examples, but basically you POST to https://{instance}/DefaultCollection/{project}/_apis/build/builds?api-version={version} with an appropriate body:
{
    "definition": {
        "id": 25
    },
    "sourceBranch": "refs/heads/master",
    "parameters": "{\"system.debug\":\"true\",\"BuildConfiguration\":\"debug\",\"BuildPlatform\":\"x64\"}"
}
Yes, just as Daniel said, you need to use the REST API; see Queue-a-build.
You could simply use the PowerShell script below to queue the builds (just replace the params accordingly):
Param(
    [string]$collectionurl = "http://server:8080/tfs/DefaultCollection",
    [string]$projectName = "ProjectName",
    [string]$keepForever = "true",
    [string]$BuildDefinitionId = "34",
    [string]$user = "username",
    [string]$token = "password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
function CreateJsonBody
{
$value = #"
{
"definition": {
"id": $BuildDefinitionId
},
"parameters": "{\"system.debug\":\"true\",\"BuildConfiguration\":\"debug\",\"BuildPlatform\":\"x64\"}"
}
"#
return $value
}
$json = CreateJsonBody
$uri = "$($collectionurl)/$($projectName)/_apis/build/builds?api-version=2.0"
$result = Invoke-RestMethod -Uri $uri -Method Post -Body $json -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
