I have changed the ports that the Azure Storage Emulator runs on from 10000, 10001, 10002 to 10003, 10004, 10005 in the config file at "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\WAStorageEmulator.exe.config".
Now when I try to access Development Storage from Server Explorer in Visual Studio 2013, it fails to reach the updated ports. I tried to manually add external storage and specify endpoints that reflect the updated ports, using the following default storage account information:
DefaultEndpointsProtocol=http
AccountName=devstoreaccount1
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
BlobEndpoint=http://127.0.0.1:10003/devstoreaccount1
QueueEndpoint=http://127.0.0.1:10004/devstoreaccount1
TableEndpoint=http://127.0.0.1:10005/devstoreaccount1
but it still does not connect. I also tried the same endpoints without the storage account suffix. Server Explorer even reverts the ports to 10000, 10001, 10002 when I refresh the External Storage node. I assume it is reading from a config somewhere, but I cannot seem to find any answer as to where this is being read from.
So how can I configure Server Explorer to reflect the updated ports?
The ports are hard-coded into the CloudStorageAccount class, so no, you can't modify them:
private static CloudStorageAccount GetDevelopmentStorageAccount(Uri proxyUri)
{
    UriBuilder uriBuilder = proxyUri != (Uri)null ? new UriBuilder(proxyUri.Scheme, proxyUri.Host) : new UriBuilder("http", "127.0.0.1");

    // Primary endpoints: blob on 10000, queue on 10001, table on 10002.
    uriBuilder.Path = "devstoreaccount1";
    uriBuilder.Port = 10000;
    Uri uri1 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri2 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri3 = uriBuilder.Uri;

    // Secondary endpoints use the same fixed ports.
    uriBuilder.Path = "devstoreaccount1-secondary";
    uriBuilder.Port = 10000;
    Uri uri4 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri5 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri6 = uriBuilder.Uri;

    CloudStorageAccount cloudStorageAccount = new CloudStorageAccount(new StorageCredentials("devstoreaccount1", "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="), new StorageUri(uri1, uri4), new StorageUri(uri2, uri5), new StorageUri(uri3, uri6), (StorageUri)null);
    cloudStorageAccount.Settings = (IDictionary<string, string>)new Dictionary<string, string>();
    cloudStorageAccount.Settings.Add("UseDevelopmentStorage", "true");
    if (proxyUri != (Uri)null)
        cloudStorageAccount.Settings.Add("DevelopmentStorageProxyUri", proxyUri.ToString());
    cloudStorageAccount.IsDevStoreAccount = true;
    return cloudStorageAccount;
}
Unfortunately, there is no support for changing the Azure Storage Emulator ports.
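Server Explorer always goes through this hard-coded account, which is why it keeps snapping back to 10000-10002. If you only need your own code (rather than Server Explorer) to reach the emulator on the remapped ports, you can construct the account explicitly instead of relying on UseDevelopmentStorage=true. A minimal sketch, assuming the emulator really is listening on 10003-10005 as configured in the question:
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

// Development storage account pointed at non-default emulator ports (assumed 10003-10005).
var credentials = new StorageCredentials(
    "devstoreaccount1",
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");

var account = new CloudStorageAccount(
    credentials,
    new Uri("http://127.0.0.1:10003/devstoreaccount1"),   // blob endpoint
    new Uri("http://127.0.0.1:10004/devstoreaccount1"),   // queue endpoint
    new Uri("http://127.0.0.1:10005/devstoreaccount1"));  // table endpoint

var blobClient = account.CreateCloudBlobClient();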
I want to download a Blob from a private container in an Azure Storage Account from my local machine using Visual Studio (2022).
To achieve that I am using
DefaultAzureCredential credential = new DefaultAzureCredential();
Uri uri = new Uri("https://xxx.blob.core.windows.net/xxx/xxx.json");
BlobClient blobClient = new BlobClient(uri, credential);
Response<BlobDownloadResult> downloadResponse = blobClient.DownloadContent();
When I execute the code, I get the following error:
Issuer validation failed. Issuer did not match.
I authenticated in VS 2022 as described here: Managed Identity - how to debug locally
What do I need to do to successfully download the Blob?
As described in https://learn.microsoft.com/en-us/dotnet/api/azure.identity.defaultazurecredential?view=azure-dotnet DefaultAzureCredential tries different options to get a credential, and one of them is VisualStudioCredential.
For me to get it working locally I had to provide the VisualStudioTenantId:
DefaultAzureCredentialOptions defaultAzureCredentialOptions = new DefaultAzureCredentialOptions()
{
    VisualStudioTenantId = "xxx"
};
DefaultAzureCredential credential = new DefaultAzureCredential(defaultAzureCredentialOptions);
Uri uri = new Uri("https://xxx.blob.core.windows.net/xxx/xxx.json");
BlobClient blobClient = new BlobClient(uri, credential);
Response<BlobDownloadResult> downloadResponse = blobClient.DownloadContent();
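If you know the Visual Studio sign-in is the credential you want, you can also skip the chain and construct that credential directly; a hedged sketch under the same assumptions (the tenant ID and blob URI placeholders are yours to fill in):
using System;
using Azure.Identity;
using Azure.Storage.Blobs;

// Use only the Visual Studio account, pinned to the right tenant.
var credential = new VisualStudioCredential(
    new VisualStudioCredentialOptions { TenantId = "xxx" });

var blobClient = new BlobClient(
    new Uri("https://xxx.blob.core.windows.net/xxx/xxx.json"),
    credential);

var downloadResponse = blobClient.DownloadContent();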
I'm writing a .NET Core app. I need to log in with network credentials, as the service (which happens to be an on-prem TFS server) uses those to authenticate. From my (and another team member's) Windows machine, the following code works:
Console.WriteLine("Type in your DOMAIN password:");
var pass = GetPassword(); //command line secure string magic from SO
var networkCredential = new NetworkCredential("USERNAME", pass, "DOMAINNAME");
string tfsDefaultCollection = "https://TFSURL/DefaultCollection";
string testUrl = $"{tfsDefaultCollection}/_apis/tfvc/changesets/1234/changes?api-version=2.2";
var httpClientHandler = new HttpClientHandler
{
    Credentials = networkCredential
};
var client = new HttpClient(httpClientHandler)
{
    BaseAddress = new Uri(testUrl)
};
httpClientHandler.PreAuthenticate = true;
var test = client.GetAsync(testUrl).Result;
Console.WriteLine(test);
But it doesn't work from my Mac; I get a 401 Unauthorized. Both machines used the same hardwired connection. And this works on my Mac:
curl --ntlm --user "DOMAINNAME\USERNAME" "https://TFSURL/DefaultCollection/_apis/tfvc/changesets/1234/changes?api-version=2.2"
So that rules out a connectivity issue, I would think. Am I missing something I need to be doing on my Mac? Can anybody point me to documentation, or to a way to troubleshoot what both of these requests are doing at the lowest level, to see if there is a difference?
Well, some Google-fu finally got me there. There's a bug in .NET Core on Linux/macOS. This issue describes the fix:
https://github.com/dotnet/corefx/issues/25988#issuecomment-412534360
It occurs when the host machine you are connecting to offers both Kerberos and NTLM authentication methods.
Implemented below:
// Fall back to the older HttpClientHandler implementation (libcurl-based on macOS/Linux),
// which negotiates correctly when the server offers both Kerberos and NTLM.
AppContext.SetSwitch("System.Net.Http.UseSocketsHttpHandler", false);
Console.WriteLine("Type in your DOMAIN password:");
var pass = GetPassword(); //command line secure string magic from SO
var networkCredential = new NetworkCredential("USERNAME", pass, "DOMAINNAME");
string tfsDefaultCollection = "https://TFSURL/DefaultCollection";
string testUrl = $"{tfsDefaultCollection}/_apis/tfvc/changesets/1234/changes?api-version=2.2";
var myCache = new CredentialCache
{
    {
        new Uri(testUrl), "NTLM",
        networkCredential
    }
};
var httpClientHandler = new HttpClientHandler
{
    Credentials = myCache
};
var client = new HttpClient(httpClientHandler)
{
    BaseAddress = new Uri(testUrl)
};
httpClientHandler.PreAuthenticate = true;
var test = client.GetAsync(testUrl).Result;
Console.WriteLine(test);
Thanks to @dmcgill50 for getting me on the right googling track.
I am just trying to connect to my Blob Storage with the .NET SDK from an MVC application, and here is my code:
public static CloudBlobClient CreateClient(UnitOfWork uow)
{
    CloudStorageList credentials;
    CloudBlobClient client;

    credentials = uow.RepositoryFor<CloudStorageList>().GetAll(filter: xx => !xx.IsDeleted).FirstOrDefault();
    var storageCredentials = new StorageCredentials(credentials.Name, credentials.PrimaryAccessKey);
    var storage = new CloudStorageAccount(storageCredentials, true);
    client = storage.CreateCloudBlobClient();
    return client;
}
But I get the error when I reach this line:
var storage = new CloudStorageAccount(storageCredentials, true);
I have mentioned the error in the subject, i.e. "The primary and secondary location URIs in a StorageUri must point to the same resource."
Any help would be appreciated.
Regards,
var storageCredential = new StorageCredentials(accountName, keyValue);
accountName is the name of your storage account.
keyValue is the account access key for that storage account (the key value from the Azure portal).
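A minimal sketch of what the answer above describes, with hypothetical placeholder values (substitute your own storage account name and key); the first argument should be the bare account name rather than a URL or connection string:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

// "mystorageaccount" is the bare account name; "<account-key>" is the key value
// from the Azure portal (both are placeholders).
var storageCredentials = new StorageCredentials("mystorageaccount", "<account-key>");
var storage = new CloudStorageAccount(storageCredentials, useHttps: true);
CloudBlobClient client = storage.CreateCloudBlobClient();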
When I call my Web API from my console application, I encounter:
The remote server returned an error: (401) Unauthorized.
This application runs on an intranet (Windows Authentication).
Uri uri = new Uri("http://myServer/api/main/foo");
WebClient client = new WebClient();
client.Credentials = CredentialCache.DefaultCredentials;
using (Stream data = client.OpenRead(uri))
{
    using (StreamReader sr = new StreamReader(data))
    {
        string result = sr.ReadToEnd();
        Console.WriteLine(result);
    }
}
Updated
If I replace
client.Credentials = CredentialCache.DefaultCredentials;
with this line
client.Credentials = new NetworkCredential( username, password);
it works fine, but I need the current user's credentials to be picked up automatically.
Any idea?
Thanks in advance ;)
You are using the default Windows credentials here:
client.Credentials = CredentialCache.DefaultCredentials;
Specify the credentials that you want to authenticate with, using code like the following:
var credential = new NetworkCredential(username, password, domain);
client.Credentials = credential;
The following line is the cause of this behaviour:
client.Credentials = CredentialCache.DefaultCredentials;
This line assigns the credentials of the logged-in user, or of the user being impersonated (which is only possible in web applications), so I believe you have to provide credentials explicitly (http://msdn.microsoft.com/en-us/library/system.net.credentialcache(v=vs.110).aspx). Thanks.
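To illustrate the linked CredentialCache approach, here is a hedged sketch that scopes an explicit credential to the API's base URI (the scheme, user name, password and domain are placeholder assumptions for your environment):
using System;
using System.Net;

// Scope an explicit credential to the API's base URI.
// "Negotiate" covers Kerberos/NTLM for Windows Authentication.
var cache = new CredentialCache
{
    { new Uri("http://myServer/"), "Negotiate", new NetworkCredential("username", "password", "DOMAIN") }
};

var client = new WebClient();
client.Credentials = cache;
Console.WriteLine(client.DownloadString("http://myServer/api/main/foo"));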
I'm creating a simple WCF service for receiving crash reports.
The service will run self-hosted as a console program and must run without any installation of certificates.
Security-wise, I need to ensure that the data sent by the client is only sent to our server and is not intercepted. From the server's point of view, I would also like to ensure that the connecting client is using a specific certificate (embedded in the client assembly) to discourage abuse of the service.
I have created a single self-signed certificate and plan to embed the .cer file (containing the public part of the certificate) in the client assembly, and the .pfx (containing the certificate with the private key) in the service host assembly. (I was led to believe by this that I could use a single certificate.)
My problem is that no matter how I set this up, I get the following error:
"An error occurred while making the HTTP request to https://localhost:8080/errorservice. This could be due to the fact that the server certificate is not configured properly with HTTP.SYS in the HTTPS case. This could also be caused by a mismatch of the security binding between the client and the server."
There shouldn't be a mismatch between the bindings, as they are created using the same code:
public static BasicHttpBinding CreateStreamingBinding() {
    BasicHttpBinding streamBinding = new BasicHttpBinding();
    streamBinding.TransferMode = TransferMode.StreamedRequest;
    streamBinding.MaxReceivedMessageSize = long.MaxValue;
    streamBinding.Security = new BasicHttpSecurity
    {
        Transport = new HttpTransportSecurity
        {
            ClientCredentialType = HttpClientCredentialType.None,
            ProxyCredentialType = HttpProxyCredentialType.None
        },
        Mode = BasicHttpSecurityMode.Transport,
    };
    streamBinding.MaxBufferSize = int.MaxValue;
    streamBinding.MessageEncoding = WSMessageEncoding.Mtom;
    streamBinding.SendTimeout = new TimeSpan( 1, 0, 0, 0, 0 );
    streamBinding.ReceiveTimeout = new TimeSpan( 1, 0, 0, 0, 0 );
    return streamBinding;
}
On the client, the code to create the service client is set up like this (the certificate location is just for testing):
protected ErrorReportingServiceClient CreateClient() {
    X509Certificate2 cert = new X509Certificate2( @"C:\certs\reporting.cer" );
    EndpointAddress endpointAddress = new EndpointAddress( new Uri( ReportingServiceUri ));
    ErrorReportingServiceClient client = new ErrorReportingServiceClient( CreateStreamingBinding(), endpointAddress );
    client.ClientCredentials.ServiceCertificate.DefaultCertificate = cert;
    client.ClientCredentials.ServiceCertificate.Authentication.CertificateValidationMode = X509CertificateValidationMode.None;
    client.ClientCredentials.ClientCertificate.Certificate = cert;
    return client;
}
On the service side the setup is as follows:
X509Certificate2 cert = new X509Certificate2( @"C:\certs\reporting.pfx", <password> );
BasicHttpBinding basicHttpBinding = CreateStreamingBinding();
host.Credentials.ClientCertificate.Certificate = cert;
host.Credentials.ClientCertificate.Authentication.CertificateValidationMode = X509CertificateValidationMode.None;
host.Credentials.ServiceCertificate.Certificate = cert;
host.AddServiceEndpoint( contractType, basicHttpBinding, baseAddress );
Any help on how to set this up correctly would be greatly appreciated.
The question was answered on the MSDN forums:
http://social.msdn.microsoft.com/Forums/en-US/wcf/thread/14f44296-5e3d-4df5-8cc4-a185415852b7
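For what it's worth, the quoted error also hints at the HTTP.SYS side: for a self-hosted HTTPS endpoint, the server certificate has to be bound to the port outside of WCF. A hedged example of such a binding (the thumbprint and appid are placeholders; port 8080 matches the address in the question):
netsh http add sslcert ipport=0.0.0.0:8080 certhash=<certificate-thumbprint> appid={00000000-0000-0000-0000-000000000000}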