We've created a local NuGet feed with NuGet.Server, a simple ASP.NET application hosted on an IIS web server inside our local company network.
The feed URL looks like this:
https://abc.company.com/packages/nuget
IIS authentication is enabled as follows:
The .NET authorization looks like this:
If I call the above mentioned feed url with Postman or Fiddler I get the following response:
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<service xml:base="https://abc.company.com/packages/nuget/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:app="http://www.w3.org/2007/app" xmlns="http://www.w3.org/2007/app">
  <workspace>
    <atom:title>Default</atom:title>
    <collection href="Packages">
      <atom:title>Packages</atom:title>
    </collection>
  </workspace>
</service>
If I now add the URL in the Visual Studio NuGet package sources:
and then choose the newly created package source, a credentials prompt appears where I enter my domain credentials (domain\user), but nothing happens:
As described before, when I access the site with any browser (IE, Chrome, Edge) I don't have to enter my credentials, and Fiddler/Postman also don't require any credentials.
In the VS Package Manager output I get the following error message:
[Local] The V2 feed at 'https://abc.company.com/packages/nuget/Search()?$filter=IsAbsoluteLatestVersion&searchTerm=''&targetFramework='net462'&includePrerelease=true&$skip=0&$top=26' returned an unexpected status code '403 Forbidden ( The server denied the specified Uniform Resource Locator (URL). Contact the server administrator. )'.
When I call this URL with a browser I don't get any errors.
What is wrong in this setup?
The problem was caused by our company proxy, which was also being applied when opening internal sites. So I added the following environment variable:
NO_PROXY
with the company-internal domain as its value:
NO_PROXY = "abc.company.com"
Now it works both locally and on our build agents.
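For reference, a minimal sketch of how the variable can be set (the second host in the last line is purely illustrative):

```shell
# Persist the exception for the current user on Windows (run once):
#   setx NO_PROXY "abc.company.com"

# Per-session on a POSIX shell (e.g. a Linux build agent):
export NO_PROXY="abc.company.com"

# Several hosts can be listed comma-separated (second host is illustrative):
export NO_PROXY="abc.company.com,nuget.company.com"
```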
In my case, I had to disable "Require SSL" in the IIS SSL Settings.
I have a protected page set up in AEM using the Authentication Requirement checkbox on the author. Then, in OSGi, I have the configuration for my external Okta SAML integration:
<?xml version="1.0" encoding="UTF-8"?>
<jcr:root xmlns:sling="http://sling.apache.org/jcr/sling/1.0"
xmlns:jcr="http://www.jcp.org/jcr/1.0"
jcr:primaryType="sling:OsgiConfig"
identitySyncType="default"
keyStorePassword="admin"
service.ranking="5002"
idpHttpRedirect="{Boolean}false"
createUser="{Boolean}true"
defaultRedirectUrl="/"
userIDAttribute="ssoGuid"
idpIdentifier=""
assertionConsumerServiceURL=""
defaultGroups="[everyone]"
storeSAMLResponse="{Boolean}false"
signatureMethod="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256"
idpCertAlias="certalias___1657659258516"
addGroupMemberships="{Boolean}true"
path="[/content/mySite]"
digestMethod="http://www.w3.org/2001/04/xmlenc#sha256"
synchronizeAttributes="[...]"
clockTolerance="60"
groupMembershipAttribute="groupMembership"
idpUrl="oktaURL"
serviceProviderEntityId="https://stage.mySite.com"
logoutUrl=""
handleLogout="{Boolean}false"
userIntermediatePath="sso"
spPrivateKeyAlias=""
useEncryption="{Boolean}false"
nameIdFormat="urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress"/>
And in my Okta config, I have https://stage.mySite.com/saml_login as the SSO URL and https://stage.mySite.com as the audience restriction.
When I navigate to the protected page in AEM I get redirected to Okta, I sign in, and I am redirected to https://stage.mySite.com/saml_login; all of this is expected. Here is where it gets weird: I then get a 301 redirect to https://stage.mySite.com/saml_login.html, which then gives a 404. It seems AEM has no listener set up for that URL, so the redirect kicks in.
Any thoughts on what I might have misconfigured?
In my case, it was a dispatcher config issue (or nginx; I'm not sure where the rewrite was done).
It was set up to append '.html' when the extension was missing from the requested URL, and I needed to add an exception to that rule.
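If the rewrite lives in the dispatcher's Apache vhost, the exception might look like this (a sketch; the catch-all rule shown here is hypothetical, so adapt it to whatever rule your setup actually uses):

```apache
RewriteEngine On
# Pass the SAML assertion consumer endpoint through untouched
RewriteRule ^/saml_login$ - [PT,L]
# Hypothetical existing catch-all: append .html to extensionless paths
RewriteCond %{REQUEST_URI} !\.
RewriteRule ^(.+[^/])$ $1.html [PT,L]
```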
Can someone explain how the workspace proxy works?
What's the right configuration so I can make requests from a shell (please see below)?
I have GeoServer running in a Docker container, listening on the host on port 12018.
Everything is fine when accessing it through the web browser.
The following URL request works in a browser:
http://localhost:12018/geoserver/geonode/ows?service=WFS&version=1.0.0&request=GetFeature&typeName=my_data_name35&maxFeatures=50&outputFormat=application%2Fjson
Using typeName as geonode:my_data_name35 also works:
http://localhost:12018/geoserver/geonode/ows?service=WFS&version=1.0.0&request=GetFeature&typeName=geonode%3Amy_data_name35&maxFeatures=50&outputFormat=application%2Fjson
But from cURL, the first request returns:
<?xml version="1.0" ?>
<ServiceExceptionReport
    version="1.2.0"
    xmlns="http://www.opengis.net/ogc"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wfs/1.0.0/OGC-exception.xsd">
  <ServiceException code="InvalidParameterValue" locator="typeName">
    Feature type :my_data_name35 unknown
  </ServiceException>
</ServiceExceptionReport>
And also from cURL, the second request returns:
<?xml version="1.0" ?>
<ServiceExceptionReport
    version="1.2.0"
    xmlns="http://www.opengis.net/ogc"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wfs/1.0.0/OGC-exception.xsd">
  <ServiceException code="InvalidParameterValue" locator="typeName">
    Feature type geonode:my_data_name35 unknown
  </ServiceException>
</ServiceExceptionReport>
Any help is appreciated. Thanks!
I found the problem; it was very basic, actually.
The requested resource needs authentication, and the browser passes the session cookie.
cURL also needs to pass the authentication.
It does not return 403 Forbidden, perhaps because some resources don't need authentication.
Sorry for the noise.
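A sketch of a cURL call that passes credentials (admin:geoserver is only the stock GeoServer default, so substitute the account your browser session uses, or pass the session cookie with -b instead). Note the quotes around the URL: an unquoted & ends the command in most shells and silently drops the remaining parameters, including typeName:

```shell
URL='http://localhost:12018/geoserver/geonode/ows?service=WFS&version=1.0.0&request=GetFeature&typeName=geonode%3Amy_data_name35&maxFeatures=50&outputFormat=application%2Fjson'

# -u sends HTTP Basic credentials; quoting "$URL" keeps the query string intact.
curl -s -u admin:geoserver "$URL" \
  || echo "request failed (is GeoServer listening on port 12018?)"
```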
I have this static site. During development, I access it with the following URL:
file:///C:/development/pokedex/index.html
But this static site uses Angular, and when Angular tries to fetch any view via AJAX, it fails because cross-origin requests are only supported for certain protocol schemes.
Now, I tried modifying the hosts file so I could access the site as localhost or some URL other than file:///etc, adding the following line to that file:
hosts:
file:///C:/development/pokedex pokedex
But when I attempt to access http://pokedex, nothing happens.
How can I debug my site without having to upload it to a server just to debug it?
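A hosts-file entry can only map a hostname to an IP address, not to a folder, so the usual fix is to serve the folder with a tiny local web server. A minimal sketch using Python 3's built-in server (a temp directory stands in for C:/development/pokedex here):

```shell
# Create a stand-in project folder with an index page.
SITE=$(mktemp -d)
echo '<!doctype html><title>pokedex</title>' > "$SITE/index.html"

# Serve it over HTTP so pages get an http:// origin instead of file://.
cd "$SITE"
python3 -m http.server 8123 &
SERVER_PID=$!
sleep 1

PAGE=$(curl -s http://localhost:8123/index.html)
kill "$SERVER_PID"
echo "$PAGE"
```

With the real project served this way, the site is reachable at http://localhost:8123/index.html, and Angular's AJAX view loading no longer trips the cross-origin restriction that applies to file:// pages.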
HELP! I am trying to take a PayPal Payments Pro (Magento 1.8.1) API integration live and I am getting the following error:
exception 'Exception' with message 'PayPal NVP CURL connection error #35:
error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure'
in <my_root_folders>/app/code/core/Mage/Paypal/Model/Api/Nvp.php:986
In the payment_paypal_direct.log file I have the following for every error:
2014-11-08T02:12:36+00:00 DEBUG (7): Array
(
[url] => https://api-3t.sandbox.paypal.com/nvp
[DoDirectPayment] => Array
(
No matter how I set the various flags for sandbox mode, my errors all show the sandbox URL for the API. I have even double-checked paypal/wpp/sandbox_flag in the core_config_data table in the DB, and it flips between 0 and 1 when I change the configuration in Magento's admin.
Has anyone experienced this persistent sandbox URL?
Sandbox Mode = OFF
SSL Verification = Disabled (have tried it enabled too, no difference)
all caching is disabled (I clear cache often just in case)
I reindex the entire site frequently
There were two issues affecting my website:
Our server had not been reconfigured in response to the POODLE vulnerability, so PayPal was rejecting the connection.
The sandbox setting was enabled for a child "Configuration Scope", so edits made to the "Default Config" (the parent/master config) were being overridden.
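The scope override in the second point is easy to spot in the database (the Magento 1 table is core_config_data); a read-only query like this shows every scope that sets the flag:

```sql
-- A row with scope 'websites' or 'stores' shadows the 'default' row.
SELECT scope, scope_id, path, value
FROM core_config_data
WHERE path = 'paypal/wpp/sandbox_flag';
```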
Hopefully this may help someone.
In addition to the above answer, you can configure your server to disable SSLv3 by editing your Apache httpd.conf and adding the following lines:
SSLHonorCipherOrder On
SSLProtocol -All +TLSv1
You can also do this via WHM if you have a VPS or dedicated server:
Go to Service Configuration -> Apache Configuration -> Include Editor -> Pre Main Include
and add the above two lines.
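To confirm the change took effect after restarting Apache, the protocols can be probed with openssl s_client (a sketch; replace example.com with your own host, and note that newer OpenSSL builds drop the -ssl3 option entirely, which also makes an SSLv3 handshake impossible):

```shell
# Succeeds only if the server still accepts that protocol version.
probe() {
  openssl s_client -connect "$1:443" "$2" < /dev/null > /dev/null 2>&1 \
    && echo "$2 handshake accepted" \
    || echo "$2 handshake refused"
}
probe example.com -ssl3     # should report: refused
probe example.com -tls1_2   # should report: accepted (when the host is reachable)
```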
I am creating a screen-sharing add-on for Firefox and need to use the media.getusermedia.screensharing feature.
But since I am working locally with the Add-on SDK, the URL of the opened HTML file is: resource://jid1-q3wuqdulcvnnrq-at-jetpack/toolbar_button/data/index.html
Now, on this page I get an error:
In about:config, please enable media.getusermedia.screensharing.enabled
and add this site's domain name to media.getusermedia.screensharing.allowed_domains in about:config
It also requires an HTTPS connection, while the same WebRTC code works fine in a Chrome extension.
Can someone please guide me on how to add a local file URL to the allowed domains, or how to use getUserMedia for local development?
A search on MXR for screensharing:
http://mxr.mozilla.org/mozilla-release/search?string=screensharing
led to this whitelist:
http://mxr.mozilla.org/mozilla-release/source/dom/media/MediaManager.cpp#151
Then I checked the domains already in this pref; they were: webex.com,*.webex.com,collaborate.com,*.collaborate.com
So when I did this: Services.io.newURI('http://www.webex.com', null, null) I got this:
So it looks like whatever is in host is what we want in this pref. So I tried newURI with a file URI:
Services.io.newURI('file:///C:/Users/Vayeate/Documents/GitHub/Profilist/bullet_aurora.png',null,null)
It dumps this:
So I'm thinking we add this to the pref:
,,
that is, an empty entry between the commas, which is what the host looks like for file URIs.
So mine would look like: webex.com,*.webex.com,collaborate.com,,*.collaborate.com
Or we could even try just an asterisk, like:
webex.com,*.webex.com,collaborate.com,*.collaborate.com,*
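The empty-host reasoning can be sanity-checked with any URL parser; for example, Python's urlparse reports the same host field that Services.io.newURI exposes:

```shell
python3 -c '
from urllib.parse import urlparse
print(urlparse("http://www.webex.com/").netloc)
print(urlparse("file:///C:/Users/Vayeate/Documents/GitHub/Profilist/bullet_aurora.png").netloc or "(empty)")
'
# prints:
#   www.webex.com
#   (empty)
```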