Is it possible to upload images to Windows Azure blob storage from a SQL Server SSIS package? SSIS will read new images (on a daily basis) from a table in one of my on-premises SQL Server instances and upload them to blob storage.
What a fun question this was! I got to thread together a lot of pieces that I had never tried.
I first built out a simple console app based on the fine manual over on HOW TO: Blob Storage. Knowing that I had working code allowed me to adapt it for SSIS.
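A minimal console sketch along those lines looks something like the following. It is only an adaptation of the How To walkthrough using the Microsoft.WindowsAzure.Storage client library; the account name, key, container name and file path are placeholders, not values from any real environment.
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main()
    {
        // Placeholder credentials - substitute your own storage account name and key.
        string cs = "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey";

        CloudStorageAccount account = CloudStorageAccount.Parse(cs);
        CloudBlobClient client = account.CreateCloudBlobClient();

        // The container is created on demand; "images" is just an example name.
        CloudBlobContainer container = client.GetContainerReference("images");
        container.CreateIfNotExists();

        // Upload a local file as a block blob named flag.png.
        CloudBlockBlob blob = container.GetBlockBlobReference("flag.png");
        using (FileStream fs = File.OpenRead(@"C:\temp\flag.png"))
        {
            blob.UploadFromStream(fs);
        }
    }
}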
I created 3 SSIS Variables at the Package level: AccountName, AccountKey and ContainerName, all of data type String. They provide the credentials plus the container where the uploaded data will reside.
Data Flow
The general look of your data flow is rather simple: a data source feeding a Script Component that acts as a Destination. You will need two columns: one provides a unique name for the blob and the other carries the binary bits.
My source is a trivial table. It has country names and their flags (stored as varbinary(max)), which you can scrape from the CIA World Factbook if you're so inclined.
The Destination will be a bit of C#. Add a Script Component of type Destination.
On the Script tab, I have 3 ReadOnly Variables listed: User::AccountKey, User::AccountName, User::ContainerName.
On the Input Columns tab, I select CountryName and FlagImage.
The script itself follows. As noted in the How To, you will need to add a reference to the Microsoft.WindowsAzure.Storage assembly before the last three using directives will resolve.
using System;
using System.Data;
using System.IO;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
// Must add reference to Microsoft.WindowsAzure.Storage for this to work
// http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

/// <summary>
/// Watch me load data to Azure from SSIS
/// </summary>
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    /// <summary>
    /// The storage account used
    /// </summary>
    private CloudStorageAccount storageAccount;

    /// <summary>
    /// An entity to work with the Blobs
    /// </summary>
    private CloudBlobClient blobClient;

    /// <summary>
    /// Blobs live in containers
    /// </summary>
    private CloudBlobContainer container;

    /// <summary>
    /// blockBlob instead of a pageBlob
    /// </summary>
    private CloudBlockBlob blockBlob;

    /// <summary>
    /// This method is called once, before rows begin to be processed in the data flow.
    /// </summary>
    public override void PreExecute()
    {
        base.PreExecute();

        string accountName = Variables.AccountName;
        string accountKey = Variables.AccountKey;
        string containerName = Variables.ContainerName;

        string csTemplate = "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}";
        string cs = string.Format(csTemplate, accountName, accountKey);

        this.storageAccount = CloudStorageAccount.Parse(cs);
        this.blobClient = this.storageAccount.CreateCloudBlobClient();
        this.container = this.blobClient.GetContainerReference(containerName);
        this.container.CreateIfNotExists();
        this.container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
    }

    /// <summary>
    /// For each row passing through, upload to Azure
    /// </summary>
    /// <param name="Row">The row that is currently passing through the component</param>
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // The country name becomes the blob name; the image bytes are streamed up as a block blob.
        using (MemoryStream memStream = new MemoryStream(Row.FlagImage.GetBlobData(0, (int)Row.FlagImage.Length)))
        {
            this.blockBlob = this.container.GetBlockBlobReference(Row.CountryName);
            this.blockBlob.UploadFromStream(memStream);
        }
    }
}
Global Assembly Cache (GAC)
Assemblies you wish to use within SSIS must reside in the GAC, and assemblies cannot go into the GAC unless they are signed. Fortunately, the Azure assemblies are signed, so from a Visual Studio Command Prompt run gacutil -if "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.1\ref\Microsoft.WindowsAzure.Storage.dll" (or the equivalent path for wherever your version of that assembly lives).
Load successful
And as proof, here's a shot from Azure Storage Explorer
SSIS 2012 and above now have a Microsoft-supported task to upload/download data to Azure Storage.
For example, the "Microsoft SQL Server 2016 Integration Services Feature Pack for Azure": https://www.microsoft.com/en-us/download/details.aspx?id=49492
Just search for the 2012 or 2014 version if that's what you are using.
Hope that helps!
Related
I'm using C# Xamarin and SkiaSharp to render an image from the resource folder, but I cannot get the correct image location.
Where can I find this image when I run the project? I tried looking for it but found nothing:
You can get the resource directory path using DirectoryInfo:
https://samsung.github.io/TizenFX/API4/api/Tizen.Applications.DirectoryInfo.html#Tizen_Applications_DirectoryInfo_Resource
Here is an example of how to use it:
https://github.com/xamarin/Xamarin.Forms/blob/b59bb767a4367240983e93ab8e1a9a050dfea23b/Xamarin.Forms.Platform.Tizen/ResourcePath.cs#L27-L30
Thank you #Seungkenun Le. Based on his comment, I wrote a simple function to get the resource path in a Tizen watch Xamarin project:
// Requires: using System.IO; (for Directory.Exists)
/// <summary>
/// Gets the resource path of the current Tizen application.
/// </summary>
/// <returns>The resource directory path, or an empty string if it does not exist.</returns>
internal static string GetResourcePath()
{
    Tizen.Applications.Application app = Tizen.Applications.Application.Current;
    if (app != null)
    {
        string resourcePath = app.DirectoryInfo.Resource;
        if (Directory.Exists(resourcePath))
        {
            return resourcePath;
        }
    }

    return string.Empty;
}
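A hypothetical usage with SkiaSharp (requires using System.IO; and using SkiaSharp;) might then look like this, with myimage.png as a placeholder file name:
// Load a bitmap that ships in the app's resource folder (file name is a placeholder).
string imageFile = Path.Combine(GetResourcePath(), "myimage.png");

using (SKBitmap bitmap = SKBitmap.Decode(imageFile))
{
    // draw the bitmap onto an SKCanvas here
}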
I am using the API Gateway pattern in a microservices architecture: the front-end Angular app makes an HTTP request to my API Gateway project, which is simply an ASP.NET Core 3.1 Web API project. Currently I only have two microservices plus the API Gateway, and all of them are ASP.NET Core 3.1 Web API projects. The API Gateway project contains all the controllers of my microservices; its only purpose is to receive the request from the front end and make an HTTP request to the appropriate microservice.
Now, in the AccountController.cs of my API Gateway project, I have the following code:
/// <summary>
/// Gets the detail of an account by its id
/// </summary>
/// <param name="organizationId">Id of the Organization to which the account belongs</param>
/// <param name="accountId">Id of the Account whose information is being requested</param>
/// <returns>Account's Details</returns>
[HttpGet("{organizationId}/{accountId}")]
public async Task<IActionResult> GetAccountAsync(Guid organizationId, Guid accountId)
{
    _uri = new Uri(uriString: $"{_configurationService.AccountAPI}GetAccount/{organizationId}/{accountId}");
    using var result = await _client.GetAsync(_uri);
    var content = await result.Content.ReadAsStringAsync();
    return Ok(content.AsObject<MessageResponse<AccountDetailVM>>());
}
After searching for the SSRF issue on Stack Overflow, I found the following recommendation in the Veracode community.
Veracode Static Analysis will report a flaw with CWE 918 if it can
detect that data from outside of the application (like an HTTP Request
from a user, but also a file that may have been uploaded by a user,
database data, webservice data, etc) is able to change the nature of a
network request.
On Stack Overflow I found the following fix:
For CWE ID 918 it is hard to make Veracode recognize your fix unless you have static URL. You need to validate all your inputs that become parts of your request URL.
That means I had to sanitize my input parameters OrganizationId and AccountId before appending them to the request URL.
Another question on the Veracode community suggested:
The only thing that Veracode Static Analysis will automatically detect as a remediation for this flaw category is to change the input to be hardcoded
and they proposed a solution for the query string
The given example appears to take a model identifier and put it in the
URL used in an internal request. We would recommend validating the ID
per the rules you have for this datatype (typically this should only
be alphanumeric and less than 255 characters) and URLencode it before
appending it to a URL.
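Taken literally, that advice amounts to validating each value and URL-encoding it before it becomes part of the request URL, something like the sketch below (the SafeSegment helper is purely illustrative, not code from my project):
// Illustrative only: re-serialize the Guid in the invariant "D" format and URL-encode it,
// so no caller-controlled characters can alter the shape of the outgoing URL.
private static string SafeSegment(Guid value) => Uri.EscapeDataString(value.ToString("D"));

// e.g. inside the action:
// var url = $"{_configurationService.AccountAPI}GetAccount/{SafeSegment(organizationId)}/{SafeSegment(accountId)}";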
After all of that, I made the following changes to my code:
Made sure the OrganizationId and AccountId Guids are not empty
URL-encoded the string
Here is the code after changes
/// <summary>
/// Gets the detail of an account by its id
/// </summary>
/// <param name="organizationId">Id of the Organization to which the account belongs</param>
/// <param name="accountId">Id of the Account whose information is being requested</param>
/// <returns>Account's Details</returns>
[HttpGet("{organizationId}/{accountId}")]
public async Task<IActionResult> GetAccountAsync(Guid organizationId, Guid accountId)
{
    if (organizationId != Guid.Empty && accountId != Guid.Empty)
    {
        string url = HttpUtility.UrlEncode($"{_configurationService.AccountAPI}GetAccount/{organizationId}/{accountId}");
        using var result = await _client.GetAsync(url);
        var content = await result.Content.ReadAsStringAsync();
        return Ok(content.AsObject<MessageResponse<AccountDetailVM>>());
    }

    return BadRequest();
}
That's all I could do to sanitize my input parameters OrganizationId and AccountId, but after all those changes Veracode still identifies an SSRF flaw on the line
using var result = await _client.GetAsync(url);
I found a hack to fix this issue: I appended the route parameters to the BaseAddress of the HttpClient, and Veracode stopped reporting the flaw.
Here is what the solution looks like:
/// <summary>
/// Gets the detail of an account by its id
/// </summary>
/// <param name="organizationId">Id of the Organization to which the account belongs</param>
/// <param name="accountId">Id of the Account whose information is being requested</param>
/// <returns>Account's Details</returns>
[HttpGet("{organizationId}/{accountId}")]
public async Task<IActionResult> GetAccountAsync(Guid organizationId, Guid accountId)
{
    if (organizationId != Guid.Empty && accountId != Guid.Empty)
    {
        var httpClient = new HttpClient();

        // Appended the parameters to the base address to satisfy the Veracode flaw
        httpClient.BaseAddress = new Uri($"{_configurationService.AccountAPI}GetAccount/{organizationId}/{accountId}");

        // Passing an empty string to GetStringAsync so Veracode doesn't treat it as modifying the URL
        var content = await httpClient.GetStringAsync("");

        return Ok(content.AsObject<MessageResponse<AccountDetailVM>>());
    }

    return BadRequest();
}
So we have an inventory list in SharePoint with details about each item: mostly drop-down menus and check boxes, plus some comments and descriptions. Sometimes we need to update multiple items there, in addition to a large number of other steps, and I am trying to automate most of these steps, including the updates to the SharePoint records. What is the best way to go about this in PowerShell from any remote computer?
Would I connect to the database, find the record and update it there? Is there an easier way? I tried finding PowerShell CLI tools for SharePoint but I don't see them available anywhere.
For example, I might want to update this field here:
I think the best way to automate updating a list item in SharePoint remotely is to use the CSOM (C#) API.
Here is a demo of updating a list item for your reference:
// Starting with ClientContext, the constructor requires a URL to the
// server running SharePoint.
ClientContext context = new ClientContext("http://SiteUrl");
// Assume that the web has a list named "Announcements".
List announcementsList = context.Web.Lists.GetByTitle("Announcements");
// Assume there is a list item with ID=1.
ListItem listItem = announcementsList.GetItemById(1);
// Write a new value to the Body field of the Announcement item.
listItem["Body"] = "This is my new value!!";
listItem.Update();
context.ExecuteQuery();
To authenticate against SharePoint, we can do the following:
// Requires: using System.Security; using System.Net; and the Microsoft.SharePoint.Client namespace
/// <summary>
/// set authentication of SharePoint online
/// </summary>
/// <param name="clientContext"></param>
/// <param name="userName"></param>
/// <param name="password"></param>
public static void setOnlineCredential(ClientContext clientContext, string userName, string password)
{
    // set the user name and password
    SecureString secureString = new SecureString();
    foreach (char c in password.ToCharArray())
    {
        secureString.AppendChar(c);
    }

    clientContext.Credentials = new SharePointOnlineCredentials(userName, secureString);
}

/// <summary>
/// set authentication of SharePoint on-premise
/// </summary>
/// <param name="clientContext"></param>
/// <param name="userName"></param>
/// <param name="password"></param>
/// <param name="domain"></param>
public static void setClientCredential(ClientContext clientContext, string userName, string password, string domain)
{
    clientContext.Credentials = new NetworkCredential(userName, password, domain);
}
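Putting the two pieces together, a usage sketch for SharePoint Online could look like this (the site URL, list title, item ID, field name and credentials are all placeholders):
// Sketch: update one field on an existing list item, then push the change to the server.
ClientContext context = new ClientContext("https://contoso.sharepoint.com/sites/Inventory");
setOnlineCredential(context, "user@contoso.com", "password");

List inventoryList = context.Web.Lists.GetByTitle("Inventory");
ListItem item = inventoryList.GetItemById(42);
item["Status"] = "In Stock";
item.Update();
context.ExecuteQuery();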
If you want to use PowerShell, you can call the same CSOM API from PowerShell, as described in this article: CSOM SharePoint PowerShell Reference and Example Codes. You just need to translate the code above into PowerShell.
I have a simple Web API 2 project. The only information I can seem to find refers to the older Web API 1.
From my controller, if I have:
/// <summary>
/// Gets a list of not very interesting information
/// </summary>
/// <returns>The list</returns>
[ResponseType(typeof(ExampleModel))]
public IHttpActionResult Get()
{
    var data = new List<ExampleModel>()
    {
        new ExampleModel()
        {
            Date = DateTime.Now,
            Name = "Tom"
        },
        new ExampleModel()
        {
            Date = DateTime.Now.AddDays(-20),
            Name = "Bob"
        }
    };

    return Ok(data);
}
Why is no information appearing when I try to browse to the help page? I am told "No documentation available."
Is there a magic switch somewhere that will turn on automated population of this data?
If you're referring to displaying the XML comments, then you can find my answer here:
ASP.NET Web API Help Page documentation using Xml comments on controllers
Be sure to uncomment this code in Areas/HelpPage/App_Start/HelpPageConfig.cs
// Uncomment the following to use the documentation from XML documentation file.
config.SetDocumentationProvider(new XmlDocumentationProvider(HttpContext.Current.Server.MapPath("~/App_Data/XmlDocument.xml")));
Also make sure the XML documentation file goes in App_Data, not bin, which is where the project properties default it to.
Hi, I'm trying to execute a stored procedure using Entity Framework, like below.
var empid = new SqlParameter("#empid", "E001");
var dept = new SqlParameter("#dept", "D001");
var selectData = dbContext.ExecuteStoreQuery<EmpDO>("getEmployeeDetails #empid, #deptid", empid, deptid);
When I try to run my application, I get the error below. Any idea?
Is this a typo? The variable is dept, but the variable used in the function call is deptid. The corrected code is below.
var empid = new SqlParameter("#empid", "E001");
var dept = new SqlParameter("#dept", "D001");
var selectData = dbContext.ExecuteStoreQuery<EmpDO>("getEmployeeDetails #empid, #deptid", empid, dept);
I just looked on the community site and they have...
Known Problems
Directly executing store commands using methods such as
ObjectContext.ExecuteStoreCommand or ObjectContext.ExecuteStoreQuery
is not supported. You may, however, create a DbCommand from the
database connection using code such as this:
using EFProviderWrapperToolkit;
...
context.Connection.GetStoreConnection().CreateCommand()
Take a look at the link below for a previously asked question. Unless the method has since been implemented in EFProviderWrapperToolkit, it should answer your question as to why you are getting the error: the DbCommand method is overridden by the toolkit but not implemented.
/// <summary>
/// Creates and returns a <see cref="T:System.Data.Common.DbCommand"/> object associated with the current connection.
/// </summary>
/// <returns>
/// A <see cref="T:System.Data.Common.DbCommand"/> object.
/// </returns>
protected override DbCommand CreateDbCommand()
{
    throw new NotSupportedException();
}
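Expanding the quoted workaround, a raw DbCommand sketch for the same stored procedure could look roughly like this. It assumes the GetStoreConnection() extension from EFProviderWrapperToolkit quoted above, uses the standard @ parameter prefix for SQL Server, and leaves materializing EmpDO as a plain loop:
// Sketch only: execute the stored procedure through a DbCommand on the underlying store connection,
// bypassing ObjectContext.ExecuteStoreQuery.
using (DbCommand cmd = context.Connection.GetStoreConnection().CreateCommand())
{
    cmd.CommandText = "getEmployeeDetails";
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@empid", "E001"));
    cmd.Parameters.Add(new SqlParameter("@deptid", "D001"));

    if (cmd.Connection.State != ConnectionState.Open)
    {
        cmd.Connection.Open();
    }

    using (DbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // map each row onto an EmpDO instance here, e.g. reader["EmpId"]
        }
    }
}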
Similar StackOverflow Question