I tried to set a new Parse server URL while inside the app, but there was no change whatsoever.
Parse.server = @"https://serverNewUrl.com"; or [Parse setServer:@"https://serverNewUrl.com"];
NSLog(@"%@", Parse.server); -----> This prints the new server that was set
NSLog(@"%@", Parse.currentConfiguration.server); -----> This prints the old one
How did you achieve the change? (Parse iOS SDK 1.18)
Try to use the initialize function. I believe it should work even if you use it a second time to change the server URL.
let parseConfig = ParseClientConfiguration {
$0.applicationId = "parseAppId"
$0.clientKey = "parseClientKey"
$0.server = "parseServerUrlString"
}
Parse.initialize(with: parseConfig)
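If you are staying in Objective-C (as in the question), the equivalent should be roughly the following; this is only a sketch using the same placeholder keys and URL:
ParseClientConfiguration *parseConfig = [ParseClientConfiguration configurationWithBlock:^(id<ParseMutableClientConfiguration> configuration) {
    configuration.applicationId = @"parseAppId";
    configuration.clientKey = @"parseClientKey";
    configuration.server = @"https://serverNewUrl.com";
}];
// Re-initializing with the new configuration is what should pick up the new server URL
[Parse initializeWithConfiguration:parseConfig];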
I'm using Octopus Deploy to deploy a simple API.
The first step of our deployment process generates an HTML report with the delta between the scripts that have already been run and the scripts that still need to run. I used this tutorial to create the step.
The relevant code in my console application is:
var reportLocationSection = appConfiguration.GetSection(previewReportCmdLineFlag);
if (reportLocationSection.Value is not null)
{
// Generate a preview file so Octopus Deploy can generate an artifact for approvals
try
{
var report = reportLocationSection.Value;
var fullReportPath = Path.Combine(report, deltaReportName);
Console.WriteLine($"Generating upgrade report at {fullReportPath}");
upgrader.GenerateUpgradeHtmlReport(fullReportPath);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
return operationError;
}
}
The PowerShell I am using in the script step is:
# Get the extracted path for the package
$packagePath = $OctopusParameters["Octopus.Action.Package[DatabaseUpdater].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Database.ConnectionString"]
$reportPath = $OctopusParameters["Project.HtmlReport.Location"]
Write-Host "Report Path: $($reportPath)"
$exeToRun = "$($packagePath)\DatabaseUpdater.exe"
$generatedReport = "$($reportPath)\UpgradeReport.html"
Write-Host "Generated Report: $($generatedReport)"
if ((Test-Path $reportPath) -eq $false) {
Write-Host "Creating new directory..."
New-Item -ItemType Directory -Path $reportPath | Out-Null
} else {
Write-Host "Directory already exists."
}
# Run this .NET app, passing in the Connection String and a flag
# which tells the app to create a report, but not update the database
& $exeToRun --connectionString="$($connectionString)" --previewReportPath="$($reportPath)"
New-OctopusArtifact -Path "$($generatedReport)"
The error reported by Octopus is:
'Could not find file 'C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html'.'
I'm guessing that is being thrown when this PowerShell line is hit: New-OctopusArtifact ...
And that seems to indicate that the report was never created.
I've used a bit of logging to log out certain variables and the values look sound:
Report Path: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9
Generated Report: C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
Generating upgrade report at C:\DeltaReports\Some API\2.9.15-DbUp-Test-9\UpgradeReport.html
As you can see in the C#, the relevant code is wrapped in a try/catch block, but I'm not sure whether the error is being written out there or at a later point by Octopus (I'd need to do a pull request to add a marker in the code).
Can anyone see a way forward in resolving this? Has anyone else encountered this?
Cheers
I recently redid some of the work from that article for this video on YouTube. I ran into some issues with the .SQL files not being included in the assembly. I think it started after I upgraded to .NET 6, but that might be a coincidence.
Anyway, because the files weren't being included in the assembly, when I ran the command line app via Octopus, it wouldn't properly generate the file for me. I ended up configuring the project to copy the .SQL files to a folder in the output directory instead of embedding them in the assembly. You can view a sample package here.
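For reference, the project file change was along these lines. This is only a sketch: it assumes an SDK-style project and that the scripts live in DeploymentScripts and PostDeploymentScripts folders, matching the paths the code below looks for.
<ItemGroup>
  <!-- Copy the SQL scripts next to the exe instead of embedding them in the assembly -->
  <None Update="DeploymentScripts\**\*.sql" CopyToOutputDirectory="PreserveNewest" />
  <None Update="PostDeploymentScripts\**\*.sql" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>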
One thing that helped me is running the app in a debugger with the same parameters just to make sure it was actually generating the file. I'm sure you already thought of that, but I'd be remiss if I forgot to include it in my answer. :)
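For example, using the same flag names your PowerShell step passes (the exe path, connection string, and report folder below are placeholders for whatever your environment actually uses), run something like this from a console, or set the same arguments as debug arguments in the IDE:
.\DatabaseUpdater.exe --connectionString="Server=localhost;Database=YourDb;Trusted_Connection=True;" --previewReportPath="C:\Temp\DeltaReports"
If UpgradeReport.html shows up in that folder locally, the problem is more likely on the deployment side than in the console app itself.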
FWIW, these are my updated scripts.
First, the Octopus Script:
$packagePath = $OctopusParameters["Octopus.Action.Package[Trident.Database].ExtractedPath"]
$connectionString = $OctopusParameters["Project.Connection.String"]
$environmentName = $OctopusParameters["Octopus.Environment.Name"]
$reportPath = $OctopusParameters["Project.Database.Report.Path"]
cd $packagePath
$appToRun = ".\Octopus.Trident.Database.DbUp"
$generatedReport = "$reportPath\UpgradeReport.html"
& $appToRun --ConnectionString="$connectionString" --PreviewReportPath="$reportPath"
New-OctopusArtifact -Path "$generatedReport" -Name "$environmentName.UpgradeReport.html"
My C# code can be found here, but for ease of use, you can see it all below (I'm not proud of how I parse the parameters).
static void Main(string[] args)
{
var connectionString = args.FirstOrDefault(x => x.StartsWith("--ConnectionString", StringComparison.OrdinalIgnoreCase));
connectionString = connectionString.Substring(connectionString.IndexOf("=") + 1).Replace(@"""", string.Empty);
var executingPath = Assembly.GetExecutingAssembly().Location.Replace("Octopus.Trident.Database.DbUp", "").Replace(".dll", "").Replace(".exe", "");
Console.WriteLine($"The execution location is {executingPath}");
var deploymentScriptPath = Path.Combine(executingPath, "DeploymentScripts");
Console.WriteLine($"The deployment script path is located at {deploymentScriptPath}");
var postDeploymentScriptsPath = Path.Combine(executingPath, "PostDeploymentScripts");
Console.WriteLine($"The post deployment script path is located at {postDeploymentScriptsPath}");
var upgradeEngineBuilder = DeployChanges.To
.SqlDatabase(connectionString, null)
.WithScriptsFromFileSystem(deploymentScriptPath, new SqlScriptOptions { ScriptType = ScriptType.RunOnce, RunGroupOrder = 1 })
.WithScriptsFromFileSystem(postDeploymentScriptsPath, new SqlScriptOptions { ScriptType = ScriptType.RunAlways, RunGroupOrder = 2 })
.WithTransactionPerScript()
.LogToConsole();
var upgrader = upgradeEngineBuilder.Build();
Console.WriteLine("Is upgrade required: " + upgrader.IsUpgradeRequired());
if (args.Any(a => a.StartsWith("--PreviewReportPath", StringComparison.InvariantCultureIgnoreCase)))
{
// Generate a preview file so Octopus Deploy can generate an artifact for approvals
var report = args.FirstOrDefault(x => x.StartsWith("--PreviewReportPath", StringComparison.OrdinalIgnoreCase));
report = report.Substring(report.IndexOf("=") + 1).Replace(@"""", string.Empty);
if (Directory.Exists(report) == false)
{
Directory.CreateDirectory(report);
}
var fullReportPath = Path.Combine(report, "UpgradeReport.html");
if (File.Exists(fullReportPath) == true)
{
File.Delete(fullReportPath);
}
Console.WriteLine($"Generating the report at {fullReportPath}");
upgrader.GenerateUpgradeHtmlReport(fullReportPath);
}
else
{
var result = upgrader.PerformUpgrade();
// Display the result
if (result.Successful)
{
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Success!");
}
else
{
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine(result.Error);
Console.WriteLine("Failed!");
}
}
}
I hope that helps!
After a long and detailed investigation, we discovered the answer was quite obvious.
We had assumed the existing deploy process configuration was sound, because we had never had a problem with it (until now). As it transpires, there was a problem which caused the Development deployments to run twice.
Hence the errors like the one above, and others about file handles being held by another process.
It was actually obvious in hindsight, but we were blind to it because we thought the existing process was sound 😣
I set up an Elastic Cloud deployment to offload my local Elasticsearch setup (as one does), but for reasons unknown to me, I can't get it to show any logs in Elastic Cloud, despite it working fine locally.
The code I have (modified for privacy reasons):
//var uri = new Uri("http://localhost:9200"); // old one
var uri = new Uri("https://my-server.kb.eastus2.azure.elastic-cloud.com:9243");
var sinkOptions = new ElasticsearchSinkOptions(uri)
{
AutoRegisterTemplate = true,
ModifyConnectionSettings = x => x.BasicAuthentication("elastic", "the password I was given"),
IndexFormat = $"test-logs-{env.EnvironmentName?.ToLower().Replace('.', '-')}-{DateTime.Now:yyyy-MM}",
};
Log.Logger = new LoggerConfiguration()
.ReadFrom.Configuration(config)
.Enrich.FromLogContext()
.Enrich.WithMachineName()
.WriteTo.Console()
.WriteTo.Elasticsearch(sinkOptions)
.Enrich.WithProperty("Environment", env.EnvironmentName)
.CreateLogger();
There are two possible reasons I can think of for this not working:
The credentials are wrong
The URI is wrong
Every solution I've been given so far provides the data in this fashion, and nowhere does it say what the URI I'm supposed to use should look like.
I get no errors.
I get no warnings.
I get no logs.
What am I doing wrong here?
The issue was using the incorrect URI. I wrote
my-server.kb.eastus2.azure.elastic-cloud.com:9243 rather than
my-server.es.eastus2.azure.elastic-cloud.com:9243.
Note the very small difference: kb vs es in the URL. The kb host is the Kibana endpoint, while es is the Elasticsearch endpoint the sink needs to write to.
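In terms of the code in the question, the only line that needs to change is the sink URI (same placeholder host as above); everything else in the sink options and logger configuration can stay as it is:
// Point the sink at the Elasticsearch (.es.) endpoint rather than the Kibana (.kb.) one
var uri = new Uri("https://my-server.es.eastus2.azure.elastic-cloud.com:9243");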
I am building an iOS app with Swift 2.0 and Xcode 7.2.
I am trying to make an api call to:
htttp://xyz.com/t/restaurants-us?KEY=someKey&filters={"locality":{"$eq":"miami"}}
let endPoint:String = "htttp://xyz.com/t/restaurants-us?KEY=someKey&filters={%22locality%22:{%22$eq%22:%22miami%22}}"
When I try to create a URL from this string (endPoint):
let url = NSURL(string: endPoint), nil is returned.
So I tried encoding the string before trying to create URL:
let encodedString = endPoint.stringByAddingPercentEncodingWithAllowedCharacters(NSCharacterSet.URLQueryAllowedCharacterSet())
Now the encodedString:
"htttp://xyz.com/t/restaurants-us?KEY=someKey&filters=%7B%2522locality%2522:%7B%2522$eq%2522:%2522miami%2522%7D%7D"
But now when I create an NSURLSession and send the request, I get an unexpected response from the server:
Reply from server:
{
"error_type" = InvalidJsonArgument;
message = "Parameter 'filters' contains an error in its JSON syntax. For documentation, please see: http://developer.factual.com.";
status = error;
version = 3;
}
So if I don't encode the string, I cannot create the NSURL.
But if I encode it and send the request, the server cannot handle the request.
Can anyone please suggest a workaround?
When you declare endPoint, you have already percent-encoded some characters (the quotes). When you ask iOS to percent-encode it, it percent-encodes those existing percent signs again, which is why the %22 sequences become %2522. Decoding the encodedString results in:
htttp://xyz.com/t/restaurants-us?KEY=someKey&filters={%22locality%22:{%22$eq%22:%22miami%22}}
Instead, you should start with actual quotes in endPoint:
let endPoint:String = "htttp://xyz.com/t/restaurants-us?KEY=someKey&filters={\"locality\":{\"$eq\":\"miami\"}}"
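To double-check the flow, here is a quick sketch of encoding exactly once (Swift 2 era APIs, matching the question, with the same placeholder URL):
let endPoint: String = "htttp://xyz.com/t/restaurants-us?KEY=someKey&filters={\"locality\":{\"$eq\":\"miami\"}}"
let allowed = NSCharacterSet.URLQueryAllowedCharacterSet()
if let encoded = endPoint.stringByAddingPercentEncodingWithAllowedCharacters(allowed) {
    // The quotes are now encoded exactly once (%22), so the server can parse the filters JSON
    if let url = NSURL(string: encoded) {
        // Hand url to NSURLSession as before
    }
}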
I am trying to port tests from using FakeRequest to using WithServer.
In order to simulate a session with FakeRequest, it is possible to use WithSession("key", "value") as suggested in this post: Testing controller with fake session
However when using WithServer, the test now looks like:
"render the users page" in WithServer {
val users = await(WS.url("http://localhost:" + port + "/users").get)
users.status must equalTo(OK)
users.body must contain("Users")
}
Since there is no WithSession(..) method available, I tried WithHeaders(..) instead (does that even make sense?), to no avail.
Any ideas?
Thanks
So I found this question, which is relatively old:
Add values to Session during testing (FakeRequest, FakeApplication)
The first answer to that question seems to have been a pull request to add .WithSession(...) to FakeRequest, but it was not applicable to WS.url
The second answer seems to give me what I need:
Create cookie:
val sessionCookie = Session.encodeAsCookie(Session(Map("key" -> "value")))
Create and execute request:
val users = await(WS.url("http://localhost:" + port + "/users")
.withHeaders(play.api.http.HeaderNames.COOKIE -> Cookies.encodeCookieHeader(Seq(sessionCookie))).get())
users.status must equalTo(OK)
users.body must contain("Users")
Finally, the assertions pass properly, instead of the request being redirected to the login page.
Note: I am using Play 2.4, so I use Cookies.encodeCookieHeader, because Cookies.encode is deprecated
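Putting the pieces together with the test from the question, it ends up looking roughly like this (still Play 2.4, with the same placeholder session key/value):
"render the users page" in WithServer {
  // Encode a fake session as the Play session cookie
  val sessionCookie = Session.encodeAsCookie(Session(Map("key" -> "value")))
  // Send the cookie with the request so the action sees the session values
  val users = await(WS.url("http://localhost:" + port + "/users")
    .withHeaders(play.api.http.HeaderNames.COOKIE -> Cookies.encodeCookieHeader(Seq(sessionCookie)))
    .get())
  users.status must equalTo(OK)
  users.body must contain("Users")
}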
I am writing a video crawler in Ruby. It has to log in to a page (with cookies enabled) and then download the pages behind that login. For this I am using the Curb (libcurl) library in Ruby. I can successfully log in, but I can't download the pages behind the login with curl. How can I fix this, or download the pages some other way?
My code is
curl = Curl::Easy.new(1st url)
curl.follow_location = true
curl.enable_cookies = true
curl.cookiefile = "cookie.txt"
curl.cookiejar = "cookie.txt"
curl.http_post(1st url,field)
curl.perform
curl = Curl::Easy.perform(2nd url)
curl.follow_location = true
curl.enable_cookies = true
curl.cookiefile = "cookie.txt"
curl.cookiejar = "cookie.txt"
curl.http_get
code = curl.body_str
What I've seen in writing my own similar "post-then-get" script is that Ruby/Curb (I'm using version 0.7.15 with Ruby 1.8) seems to ignore the cookiejar/cookiefile fields of a Curl::Easy object. If I set either of those fields and the http_post completes successfully, no cookiejar or cookiefile file is created. Also, curl.cookies will still be nil after your curl.http_post; however, the cookies ARE set within the curl object. I promise :)
I think where you're going wrong is here:
curl = Curl::Easy.perform(2nd url)
The curb documentation states that this creates a new object. That new object doesn't have any of your existing cookies set. If you change your code to look like the following, I believe it should work. I've also removed the curl.perform for the first url since curl.http_post already implicitly does the "perform". You were basically http_post'ing twice before trying your http_get.
curl = Curl::Easy.new(1st url)
curl.follow_location = true
curl.enable_cookies = true
curl.http_post(1st url,field)
curl.url = 2nd url
curl.http_get
code = curl.body_str
If this still doesn't seem to be working for you, you can verify whether the cookie is getting set by adding
curl.verbose = true
before the curl.http_post call.
Your Curl::Easy object will dump all the headers that it gets in the response from the server to $stdout, and somewhere in there you should see a line stating that it added/set a cookie. I don't have any example output right now but I'll try to post a follow-up soon.
HTTPClient automatically enables cookies, as does Mechanize.
From the HTTPClient docs:
clnt = HTTPClient.new
clnt.get_content(url1) # receives Cookies.
clnt.get_content(url2) # sends Cookies if needed.
Posting a form is easy too:
body = { 'keyword' => 'ruby', 'lang' => 'en' }
res = clnt.post(uri, body)
Mechanize makes this sort of thing really simple (it will handle storing the cookies, among other things).
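A rough sketch of the same post-then-get flow with Mechanize (the URLs and form fields here are placeholders for whatever the crawler actually posts):
require 'mechanize'

agent = Mechanize.new
# Cookies returned by the login POST are stored in agent.cookie_jar automatically
agent.post('http://example.com/login', 'username' => 'user', 'password' => 'secret')
# ...and sent back on subsequent requests, so the second page comes back logged in
page = agent.get('http://example.com/protected/page')
code = page.body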