Go-Swagger: Failed to load API definition 405 - go

I'm trying to generate Swagger docs in Go with the go-swagger module. Basically, I'm generating the spec from source:
swagger generate spec -o ./gen/swagger-local.yaml --scan-models
On localhost it's fine: the docs are generated and I can view them in the browser.
However, in production I get a 405 error (invalid):
{"schemaValidationMessages":[{"level":"error","message":"Can't read from file {{myURL/swagger.yaml}}"}]}
Searching further, I found this thread on GitHub.
Apparently I need to disable or configure the validator (validate()), but I couldn't find any more information. Any help?
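For what it's worth, that schemaValidationMessages error is typically produced by Swagger UI's online validator badge, which has to fetch the spec URL itself; if the production spec URL isn't reachable from the public internet, the badge reports exactly this failure. Below is a minimal sketch (not from the thread; the /swagger.yaml route, port, and content type are assumptions) of serving the generated spec from the Go app itself so the docs page and the validator load it from one stable URL:

package main

import (
	"log"
	"net/http"
)

func main() {
	// Expose the spec produced by `swagger generate spec -o ./gen/swagger-local.yaml`.
	// The online validator must be able to GET this URL, or it reports
	// "Can't read from file ...".
	http.HandleFunc("/swagger.yaml", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/yaml")
		http.ServeFile(w, r, "./gen/swagger-local.yaml")
	})

	// ... register the actual API handlers and the docs UI here ...

	log.Fatal(http.ListenAndServe(":8080", nil))
}

If the spec cannot be made publicly reachable (for example, the server sits behind a VPN), the usual alternative is to switch the badge off by setting Swagger UI's validatorUrl parameter to null, which is presumably the "disable the validate()" option the GitHub thread refers to.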

Related

XML Sitemap not working with Google Search Console?

I have an XML sitemap located at https://store.usbswiper.com/sitemap_index.xml, which as you can see loads just fine.
However, Google Search Console is telling me it can't fetch the sitemap.
When I use this validator it's giving me a successful validation.
I have checked the robots.txt, and it's not blocking anything. It specifies the sitemap URL correctly as well.
Any info on why Google Search Console is giving me this "couldn't fetch" message would be greatly appreciated.
EDIT: When I first ran that validator it gave me this error:
Incorrect http header content-type: "" (expected: "application/xml")
I added a robots.txt, and when I ran the validator again and posted this thread it was validating successfully. I just tried again now and it's failing with the same message. I don't understand why it works sometimes and not others. Search Console hasn't successfully loaded the sitemap at all, no matter what the validator is doing.
Add the full link in Search Console;
for example, add https://example.net/sitemap.xml
rather than just sitemap.xml.
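On the intermittent validator failure: the quoted error is about the Content-Type header, i.e. the sitemap has to be served as XML rather than with an empty content type. A minimal, purely illustrative sketch (written in Go to match the rest of this page, not the asker's hosting stack; the path and port are assumptions) of a handler that sets the header explicitly:

package main

import (
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/sitemap.xml", func(w http.ResponseWriter, r *http.Request) {
		// Validators and Search Console expect an XML content type here,
		// not an empty or text/html header.
		w.Header().Set("Content-Type", "application/xml")
		http.ServeFile(w, r, "sitemap.xml")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}

If the sitemap is served by a CDN or store platform rather than your own code, the equivalent fix is whatever setting makes that layer send Content-Type: application/xml consistently.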

Swashbuckle + XmlComments work locally, but fail swagger generation on server

I have a Web API project, and I am using the Swashbuckle framework to flesh out API documentation.
I have followed the directions to build the documentation XML file from my controllers and DTOs, and it all works great locally.
However, when the Swagger document is generated on the server, a 500 error is thrown. I have confirmed that if I remove my XML registration line, the Swagger doc is generated and returned successfully.
Here is my registration line:
GlobalConfiguration.Configuration.EnableSwagger(c =>
{
...
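// Note: this path is resolved on the server at runtime, so Company.MyApp.xml
// must actually be deployed to bin\. See the answer below: XML documentation
// output has to be enabled for the Release build configuration too, or the
// file never makes it into the deployment package.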
c.IncludeXmlComments($"{System.AppDomain.CurrentDomain.BaseDirectory}bin\\Company.MyApp.xml");
...
});
Update: I did some additional logging, and while this line for IncludeXmlComments runs successfully on startup, when I request the swagger.json file from the server I get a System.IO.FileNotFoundException: Could not find file 'D:\home\site\wwwroot\bin\Monetary.Scheduling.xml'. When I use the Kudu tools to look into that directory, I cannot find the file.
TL;DR: Why does this file show up fine locally, but when I deploy to Azure using Kudu or an Octopus NuGet package, the file is not there?
The problem is that somehow the XML file is not making it to your server.
I had this exact issue with an Azure deployment: it worked fine on my local machine but not in Azure,
and it was because XML documentation output was missing from the Release build configuration.

Cloud Code Functions will not work after Parse Server migration to Heroku

I followed the migration path from the Parse website to Heroku.
Parse initializes, but I cannot reach any of my Cloud Code functions from my JS Angular web app. Example:
Parse.Cloud.run('checkStats', { 'id': id });
The network tab shows a POST request with a 404:
http://..../parse/1/functions/checkStats
As a test I used hurl.it (an HTTP tool) and got the same result. I then changed the URL and removed the /1/, and the function works:
http://..../parse/functions/checkStats
Any ideas?
Figured it out: I had to update my Parse JS SDK to the latest version (1.6.14) on the client side. In addition, on the Cloud Code side, code changes were required because Parse.User.current() and Parse.Cloud.useMasterKey() are no longer supported; you need to use different functions for the same result. Refer to
https://github.com/ParsePlatform/parse-server

MediaWiki InstantCommons file download error

My goal: I'd like to use an image from commons.mediawiki.org within a MediaWiki installation.
First I was trying to debug my InstantCommons configuration: referring to files on commons.mediawiki.org failed for some reason. After activating various debugging options, I learned that although the general image download succeeded, some kind of thumbnail follow-up request issued by the MediaWiki installation failed, which resulted in an overall error from the ForeignAPIRepo module.
As I cannot deal with this error right now, I thought I'd try something else as a fallback: download the image by specifying its URL on the upload image web page. The idea is to let MediaWiki download the image and include it as regular wiki content. This way I would have to add license details and a few comments manually, but that would be better than having no image.
But trying this I strangely get an error. It says "Fehler beim Senden der Anfrage", which means "Error while sending the request". Yet the internal request seems to succeed in the logs. Here is what MediaWiki was logging:
[fileupload] Temporary file created "/tmp/URLdafce5345aa3-1"
[fileupload] Starting download from "https://upload.wikimedia.org/wikipedia/commons/c/c7/Broccoli%2C_Champignons%2C_Karotten_%2810581663524%29.jpg" <followRedirects>
[fileupload] <Error, collected 1 error(s) on the way, integer value set>
+------+---------------------------+------------------------------------------+
| 1 | http-request-error | |
+------+---------------------------+------------------------------------------+
[fileupload] Download by URL completed with HTTP status 200
Comment: All other log messages do not indicate anything that looks like an error or is related to the task of downloading the image, so I skipped them here.
The URL is correct, the image can be downloaded from the URL, and MediaWiki receives a response code of 200, but instead of processing the response it indicates an error. Why? For both http and https URLs I get the same result in the log.
Has anybody encountered this problem before in MediaWiki installations? Does anyone have any idea what the reason for this behaviour could be?
Comment: The wiki is version 1.25.2, a standard installation including SWM, on an up-to-date standard Ubuntu Linux OS. Nothing exotic, nothing modified in any way.
Comment: Yes, I could upgrade to the latest version, but I'm not sure that would really solve the problem: I know this feature did work in some other MediaWiki installations I set up some time ago. Does anyone have a clue why it could fail here? Has anyone encountered something like this before?
Edit: I experimented with downloading from another MediaWiki instance of exactly the same version (1.25.2) on my local network. This did not succeed either, but I got a different error message (translated): "The file ... could not be stored at ...". The "funny" part: although the error message indicated otherwise, the file was downloaded successfully, stored as expected, and has the correct user rights. However, the log messages indicate that there are bugs in MediaWiki in this area ("PHP Notice: Undefined property: UploadFromUrl::$nbytes"). Maybe the upload-by-URL implementation is buggy somehow and the problems I am running into are typical?
There are multiple bugs with HTTPS support in MediaWiki, php-curl, etc. See https://www.mediawiki.org/wiki/InstantCommons#HTTPS for debugging information; there is no magic bullet.

Dredd - API Blueprint Testing Tool. "Undefined" issue

I installed Dredd, the API Blueprint testing tool, and am trying to test our Apiary API against the implementation.
In the blueprint I have just one resource, which is correctly implemented on somehost... but the test fails :(
test command:
dredd apiary.apib http://somehost.de:8443/imp-endpoint
output:
Info: Beginning Dredd testing...
undefined
I also tried more options to get more information about what is undefined, like -l verbose and some others, but I did not get any more detail about the failure :(
Does anyone have experience with it? Thank you!!! :)
Answered on GitHub, but I'll reproduce the main points here for reference:
I wasn't able to reproduce your "undefined" problem, but there are a few issues that, when corrected, make everything work.
At the moment, the base URL can't have a path in it (see #43). This is solved in #45 but hasn't been merged yet. So your command should be 'dredd apiary.apib http://somehost.de:8443/' and '/imp-endpoint' should then be a prefix on the URIs in the blueprint.
The URI template is incorrect in your apib file. Instead of /api/V0/Resources/CarSharing/Cars?{lat}&{lng}&{radius}, it should be /imp-endpoint/api/V0/Resources/CarSharing/Cars{?lat,lng,radius}. See RFC 6570 (URI Template) for reference.
The line endings should be Unix style, not Windows (\n, not \r\n). When I first ran the apib you provided, I got the error: the use of carriage return(s) '\r' in source data isn't currently supported, please contact makers (this is actually enforced by the blueprint parser, see snowcrash).
Hope that helps! I get a 401 when running the test, so you'll need to provide HTTP Basic authentication information (this can be done in the headers section of the blueprint or with the -u flag on dredd, as in -u username:password).
